Opik - LLM Observability and Evaluation
Opik, built by Comet, is an open-source platform designed to streamline the entire lifecycle of LLM applications. It provides comprehensive tracing, evaluation, monitoring, and optimization capabilities for large language models and agentic systems, from prototype to production. The current version is 1.11.1 and it is under active development with frequent updates and a community-driven roadmap.
Common errors
- OPIK_API_KEY is not set
  cause: The Opik Python SDK attempted to connect to Opik Cloud without a configured API key, which is required for authentication.
  fix: Set the `OPIK_API_KEY` environment variable, configure the SDK with `opik.configure(api_key="YOUR_API_KEY")` in your code, or run `opik configure` in the terminal.
- TypeError: Limiter.__init__() got an unexpected keyword argument 'raise_when_fail'
  cause: Opik's optimizer is incompatible with `pyrate-limiter` 4.x, which removed a legacy flag the optimizer relies on.
  fix: Pin `pyrate-limiter` to a 3.x release: `pip install "pyrate-limiter>=3.0.0,<4.0.0"`.
- ImportError: cannot import name 'ConfigFileSourceMixin' from 'pydantic_settings.sources'
  cause: A version mismatch between the installed `pydantic-settings` library and other dependencies or the Python environment, often seen in Docker images with specific Python versions.
  fix: Ensure your `pydantic` and `pydantic-settings` versions are compatible with `opik` and your Python environment; this may require pinning specific versions of either package.
- ValueError: Prompt must be a ChatPrompt object
  cause: An incorrect type was passed to the optimizer's `optimize_prompt()` method, which expects a `ChatPrompt` object.
  fix: Define your prompt with the `ChatPrompt` class from `opik_optimizer` before passing it to `optimize_prompt()`. Example: `from opik_optimizer import ChatPrompt; prompt = ChatPrompt(messages=[...], model="gpt-4")`.
- OPIK: Failed to process CreateSpansBatchMessage.
  cause: The Opik SDK client cannot send trace data (spans) to the Opik backend, typically because `OPIK_URL_OVERRIDE` is misconfigured or the Opik server is not running or unreachable.
  fix: Verify that the Opik backend is running and reachable from your environment, and that `OPIK_URL_OVERRIDE` points to the backend's API endpoint. Running `opik configure` can set up the correct environment variables.
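For self-hosted deployments, the `CreateSpansBatchMessage` failure is often resolved by pointing the SDK at the backend explicitly. A minimal sketch; the URL shown is an assumed default for a local Opik deployment, so adjust the host and port to your setup:

```python
import os

# Point the Opik SDK at a self-hosted backend before creating any client.
# "http://localhost:5173/api" is an assumed local default; adjust as needed.
os.environ["OPIK_URL_OVERRIDE"] = "http://localhost:5173/api"
```

Setting the variable in your shell profile or container environment instead of in code works equally well, as long as it is set before the SDK initializes.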
Warnings
- breaking Version 1.7.0 of Opik included important updates and breaking changes, particularly for self-hosted deployments. Users are advised to check the changelog for details.
- gotcha For self-hosted Opik instances, ClickHouse must be configured with cluster macros, even for single-node deployments. Without this, migrations will fail with 'DB::Exception: No macro 'cluster' in config'.
- gotcha When using `opik-optimizer`, the prompt passed to any optimizer must be a `ChatPrompt` object, not a raw messages list.
- gotcha Authentication failures often occur due to incorrect API keys or workspace details. For cloud usage, ensure your API key has Agent Optimizer access.
- deprecated Opik's Helm chart has migrated from Bitnami charts and images to official Docker images. Bitnami's old public images are deprecated and hardened images are now subscription-based. This impacts self-hosted Kubernetes deployments.
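The ClickHouse cluster-macro gotcha above can be addressed with a configuration fragment along these lines; this is a sketch with placeholder values, and the macro values must match your deployment's actual cluster, shard, and replica names:

```xml
<clickhouse>
    <macros>
        <!-- Required even for single-node deployments; values are illustrative. -->
        <cluster>cluster</cluster>
        <shard>01</shard>
        <replica>replica_01</replica>
    </macros>
</clickhouse>
```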
Install
- pip install opik
- pip install opik-optimizer
Imports
- track
from opik import track
- Opik
import opik
client = opik.Opik()
- ChatPrompt
from opik_optimizer import ChatPrompt
Quickstart
import os
import opik
from opik import opik_context

# Configure Opik for Comet.com Cloud.
# Replace with your actual API key and workspace, or run `opik configure` in your terminal.
opik.configure(
    api_key=os.environ.get("OPIK_API_KEY", "YOUR_OPIK_API_KEY"),
    workspace=os.environ.get("OPIK_WORKSPACE", "YOUR_OPIK_WORKSPACE"),
)

@opik.track(project_name="my-llm-project")
def my_llm_function(user_question: str) -> str:
    # Simulate an LLM call or business logic
    response = f"Echoing your question: {user_question}"
    # Attach tags and metadata to the trace created by @opik.track
    opik_context.update_current_trace(
        tags=["example", "basic-tracing"],
        metadata={"question_length": len(user_question)},
    )
    return response

# Run the traced function
result = my_llm_function("What is the capital of France?")
print(f"LLM Function Result: {result}")
# View the trace in the Opik UI (Comet.com Cloud or your self-hosted dashboard).