Opik - LLM Observability and Evaluation
Opik, built by Comet, is an open-source platform designed to streamline the entire lifecycle of LLM applications. It provides comprehensive tracing, evaluation, monitoring, and optimization capabilities for large language models and agentic systems, from prototype to production. The current version is 1.11.1 and it is under active development with frequent updates and a community-driven roadmap.
Warnings
- breaking Version 1.7.0 of Opik introduced breaking changes, particularly for self-hosted deployments; check the changelog for migration details before upgrading.
- gotcha For self-hosted Opik instances, ClickHouse must be configured with cluster macros, even for single-node deployments. Without this, migrations fail with `DB::Exception: No macro 'cluster' in config`.
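A minimal single-node macros block can address this; the file path and macro values below are assumptions to adapt to your ClickHouse config layout:

```xml
<!-- e.g. /etc/clickhouse-server/config.d/macros.xml (path is an assumption) -->
<clickhouse>
    <macros>
        <cluster>cluster</cluster>
        <shard>01</shard>
        <replica>replica01</replica>
    </macros>
</clickhouse>
```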
- gotcha When using `opik-optimizer`, the prompt passed to any optimizer must be a `ChatPrompt` object, not a raw messages list.
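A short sketch of the distinction: build a plain messages list, then wrap it in `ChatPrompt` before handing it to an optimizer (the system/user text here is illustrative):

```python
# Raw messages list -- passing this directly to an optimizer raises an error.
messages = [
    {"role": "system", "content": "You are a concise assistant."},
    {"role": "user", "content": "{question}"},  # placeholder filled from dataset items
]

# Wrap it first (commented out since it needs opik-optimizer installed):
# from opik_optimizer import ChatPrompt
# prompt = ChatPrompt(messages=messages)  # pass `prompt`, not `messages`, to the optimizer
```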
- gotcha Authentication failures often occur due to incorrect API keys or workspace details. For cloud usage, ensure your API key has Agent Optimizer access.
- deprecated Opik's Helm chart has migrated from Bitnami charts and images to official Docker images. Bitnami's old public images are deprecated and hardened images are now subscription-based. This impacts self-hosted Kubernetes deployments.
Install
- pip install opik
- pip install opik-optimizer
Imports
- track
from opik import track
- Opik
import opik
client = opik.Opik()
- ChatPrompt
from opik_optimizer import ChatPrompt
Quickstart
import os

import opik
from opik import opik_context

# Configure Opik for Comet.com Cloud.
# Replace with your actual API key and workspace, or run `opik configure` in your terminal.
opik.configure(
    api_key=os.environ.get("OPIK_API_KEY", "YOUR_OPIK_API_KEY"),
    workspace=os.environ.get("OPIK_WORKSPACE", "YOUR_OPIK_WORKSPACE"),
)

@opik.track(project_name="my-llm-project")
def my_llm_function(user_question: str) -> str:
    # Simulate an LLM call or business logic
    response = f"Echoing your question: {user_question}"
    # Attach tags and metadata to the current trace
    opik_context.update_current_trace(
        tags=["example", "basic-tracing"],
        metadata={"question_length": len(user_question)},
    )
    return response

# Run the traced function
result = my_llm_function("What is the capital of France?")
print(f"LLM Function Result: {result}")
# To view traces, run `opik dashboard` or visit your Comet.com Opik dashboard.
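Beyond tracing, Opik can score outputs against a dataset. A minimal sketch of that evaluation loop follows; the SDK calls are commented out because they need a configured workspace, and the dataset name, metric choice, and experiment name are assumptions to verify against the current docs:

```python
# Stand-in task function: in a real setup this would call your LLM.
# `item` is one dataset row; the return dict is what metrics score.
def capital_task(item: dict) -> dict:
    answers = {"France": "Paris", "Japan": "Tokyo"}
    return {"output": answers.get(item["country"], "unknown")}

# Evaluation wiring (requires a configured Opik client):
# from opik import Opik
# from opik.evaluation import evaluate
# from opik.evaluation.metrics import Equals
#
# client = Opik()
# dataset = client.get_or_create_dataset(name="capitals")
# dataset.insert([{"country": "France", "expected_output": "Paris"}])
# evaluate(dataset=dataset, task=capital_task,
#          scoring_metrics=[Equals()], experiment_name="capitals-v1")
```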