byLLM Python Library
byLLM (`byllm`) is a Python library that provides a unified API for interacting with Large Language Model (LLM) providers such as OpenAI, Anthropic, Ollama, and Google Gemini. It simplifies LLM integration by abstracting away provider-specific client libraries and response formats. Currently at version 0.6.4, it is part of the Jaseci ecosystem; it is updated frequently, often alongside `jaclang` and `jaseci` releases, and targets Python 3.11+.
Common errors
- `ModuleNotFoundError: No module named 'byllm.providers.openai'`
  Cause: The import path for a provider class changed in a new `byllm` version, or the library/provider extra was not installed correctly.
  Fix: Verify your installed `byllm` version and consult the latest documentation or the `byllm/providers` directory in the source code for the correct import path. Ensure you've installed the necessary extra (e.g., `pip install "byllm[openai]"`).
- `openai.AuthenticationError: Incorrect API key provided: ****`
  Cause: The API key for the LLM provider (e.g., `OPENAI_API_KEY`) is missing, invalid, or misconfigured in environment variables or during `byllm` initialization.
  Fix: Set the correct API key as an environment variable (e.g., `export OPENAI_API_KEY='your_secret_key'`) or pass it directly to the provider's constructor (e.g., `OpenAI(api_key="your_secret_key")`). Double-check for typos or key expiration.
- `AttributeError: 'NoneType' object has no attribute 'text'`
  Cause: The LLM API call failed, but `generate` or `chat` returned `None` or an empty/malformed response object instead of raising an explicit error. Typical triggers are network issues, rate limits, an invalid model name, or a provider-side outage.
  Fix: Wrap `generate` and `chat` calls in robust error handling and validate the response before using it. Check logs for underlying API errors, verify network connectivity, confirm model names and parameters, and check the provider's status page for outages.
Warnings
- breaking As a pre-1.0.0 library (currently 0.6.4), `byllm`'s API and internal structure are subject to significant changes between minor versions. Upgrading without checking release notes might lead to `ImportError` or `AttributeError` for provider classes or method signatures.
- gotcha `byllm` relies on various upstream LLM client libraries (e.g., `openai`, `anthropic`). If your project uses other packages that require different versions of these underlying clients, dependency conflicts can arise, leading to unexpected behavior or `ImportError`s from the core client libraries.
- gotcha Most LLM providers require an API key for authentication. While `byllm` allows passing `api_key` directly, it primarily expects keys to be set as environment variables (e.g., `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`). Forgetting or misconfiguring these will result in authentication failures.
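The misconfigured-key gotcha can be surfaced early with a preflight check. This helper is a sketch and does not ship with byllm; the `DEMO_API_KEY` variable below is a throwaway stand-in for a real key such as `OPENAI_API_KEY`:

```python
import os


def require_env_key(name: str) -> str:
    """Fail fast with a helpful message when a provider API key is unset.

    Sketch only -- byllm itself does not provide this helper.
    """
    value = os.environ.get(name, "").strip()
    if not value:
        raise EnvironmentError(
            f"{name} is not set. Export it first, e.g.: export {name}='your_secret_key'"
        )
    return value


# Demo with a throwaway variable (stands in for OPENAI_API_KEY, ANTHROPIC_API_KEY, ...):
os.environ["DEMO_API_KEY"] = "sk-demo"
print(require_env_key("DEMO_API_KEY"))  # sk-demo
```

Calling this before constructing a provider replaces a late `AuthenticationError` from the remote API with an immediate, local explanation.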
Install
- `pip install byllm`
- `pip install "byllm[openai,anthropic,ollama]"` (quote the extras so shells like zsh don't interpret the square brackets)
Imports
- OpenAI: `from byllm.providers.openai import OpenAI`
- Anthropic: `from byllm.providers.anthropic import Anthropic`
- Ollama: `from byllm.providers.ollama import Ollama`
- LLMProvider (base class; both paths are listed here, as the import path varies between versions):
  `from byllm import LLMProvider`
  `from byllm.providers.llm_provider import LLMProvider`
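Because import paths have moved between pre-1.0 releases (see Warnings), a try/except fallback can keep code working across versions. The snippet uses only the two paths listed above and degrades gracefully when byllm is not installed:

```python
try:
    from byllm import LLMProvider  # shorter path, as listed above
except ImportError:
    try:
        from byllm.providers.llm_provider import LLMProvider  # longer path
    except ImportError:
        LLMProvider = None  # byllm is not installed in this environment

if LLMProvider is None:
    print("byllm not installed; run `pip install byllm` first")
```

Pinning the `byllm` version in your requirements file is the more robust fix; the fallback is a stopgap when you must support multiple versions.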
Quickstart
```python
import os
from byllm.providers.openai import OpenAI

# Set your OpenAI API key as an environment variable:
#   export OPENAI_API_KEY="your_key_here"
openai_key = os.environ.get("OPENAI_API_KEY")
if not openai_key:
    print("Warning: OPENAI_API_KEY environment variable not set. Requests will likely fail.")
    openai_key = "sk-dummy"  # Dummy key so the client can still be instantiated

try:
    llm = OpenAI(api_key=openai_key, model_name="gpt-3.5-turbo")

    # Example: generate text
    prompt = "What is the capital of France?"
    response = llm.generate(prompt=prompt, max_tokens=50)
    print(f"Generated text: {response.text}\n")

    # Example: chat completion
    messages = [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Tell me a short story about a brave knight."},
    ]
    chat_response = llm.chat(messages=messages, max_tokens=100)
    print(f"Chat response: {chat_response.text}\n")
except Exception as e:
    print(f"An error occurred: {e}")
    print("Please ensure your API key is correct and the model name is valid.")
```