{"id":2171,"library":"opik","title":"Opik - LLM Observability and Evaluation","description":"Opik, built by Comet, is an open-source platform designed to streamline the entire lifecycle of LLM applications. It provides comprehensive tracing, evaluation, monitoring, and optimization capabilities for large language models and agentic systems, from prototype to production. The current version is 1.11.1 and it is under active development with frequent updates and a community-driven roadmap.","status":"active","version":"1.11.1","language":"en","source_language":"en","source_url":"https://github.com/comet-ml/opik","tags":["LLM","observability","evaluation","monitoring","tracing","AI","Comet"],"install":[{"cmd":"pip install opik","lang":"bash","label":"Install core SDK"},{"cmd":"pip install opik-optimizer","lang":"bash","label":"Install optimizer (optional)"}],"dependencies":[{"reason":"Required for advanced prompt and agent optimization features.","package":"opik-optimizer","optional":true},{"reason":"Used by the Opik Optimizer for various LLM providers; requires provider API keys to be set as environment variables.","package":"LiteLLM","optional":true}],"imports":[{"symbol":"track","correct":"from opik import track"},{"symbol":"Opik","correct":"import opik\nclient = opik.Opik()"},{"note":"ChatPrompt is part of the `opik-optimizer` package, not the core `opik` SDK.","wrong":"from opik import ChatPrompt","symbol":"ChatPrompt","correct":"from opik_optimizer import ChatPrompt"}],"quickstart":{"code":"import opik\nimport os\n\n# Configure Opik - for Comet.com Cloud\n# Replace with your actual API key and workspace, or run `opik configure` in your terminal\nopik.configure(\n    api_key=os.environ.get('OPIK_API_KEY', 'YOUR_OPIK_API_KEY'),\n    workspace=os.environ.get('OPIK_WORKSPACE', 'YOUR_OPIK_WORKSPACE'),\n    project_name=\"my-llm-project\"\n)\n\n@opik.track\ndef my_llm_function(user_question: str) -> str:\n    # Simulate an LLM call or business logic\n    response = 
f\"Echoing your question: {user_question}\"\n    # Log metadata or tags if needed\n    opik.set_tags([\"example\", \"basic-tracing\"])\n    opik.log_metadata({\"question_length\": len(user_question)})\n    return response\n\n# Run the traced function\nresult = my_llm_function(\"What is the capital of France?\")\nprint(f\"LLM Function Result: {result}\")\n\n# To view traces, run `opik dashboard` or visit your Comet.com Opik dashboard.","lang":"python","description":"This quickstart demonstrates how to instrument a Python function with the `@opik.track` decorator to automatically log LLM calls and associated metadata to the Opik platform. It includes configuration for both Comet.com Cloud and an example of setting environment variables for authentication."},"warnings":[{"fix":"Review the official changelog for Opik, especially regarding updates to self-hosting and potential configuration adjustments for client SDKs.","message":"Version 1.7.0 of Opik included important updates and breaking changes, particularly for self-hosted deployments. Users are advised to check the changelog for details.","severity":"breaking","affected_versions":">=1.7.0"},{"fix":"If not using Opik's Helm Chart or Docker Compose, manually add the `{cluster}` macro to your ClickHouse configuration file (e.g., `/etc/clickhouse-server/config.d/macros.xml`) and restart ClickHouse.","message":"For self-hosted Opik instances, ClickHouse must be configured with cluster macros, even for single-node deployments. Without this, migrations will fail with 'DB::Exception: No macro 'cluster' in config'.","severity":"gotcha","affected_versions":"All self-hosted versions"},{"fix":"Import `ChatPrompt` from `opik_optimizer` and wrap your messages list before passing it to an optimizer. 
Example: `from opik_optimizer import ChatPrompt; prompt = ChatPrompt(messages=[...])`.","message":"When using `opik-optimizer`, the prompt passed to any optimizer must be a `ChatPrompt` object, not a raw messages list.","severity":"gotcha","affected_versions":"All `opik-optimizer` versions"},{"fix":"Re-run `opik configure` from the terminal and confirm your Opik API key and workspace details. Ensure environment variables for LLM providers (e.g., `OPENAI_API_KEY`) are correctly set in the shell where your script runs.","message":"Authentication failures often occur due to incorrect API keys or workspace details. For cloud usage, ensure your API key has Agent Optimizer access.","severity":"gotcha","affected_versions":"All versions"},{"fix":"For upgrades from Opik Helm chart version 1.9.39 or older, set `chartMigration.enabled: true` in your Helm upgrade command for the first migration, then set it back to `false` for subsequent upgrades.","message":"Opik's Helm chart has migrated from Bitnami charts and images to official Docker images. Bitnami's old public images are deprecated, and its hardened images are now subscription-based. This impacts self-hosted Kubernetes deployments.","severity":"deprecated","affected_versions":"<=1.9.39 (for original Bitnami-based deployments)"}],"env_vars":null,"last_verified":"2026-04-09T00:00:00.000Z","next_check":"2026-07-08T00:00:00.000Z"}