{"id":28,"library":"helicone","title":"Helicone","description":"Open-source LLM observability platform using a proxy-based architecture. Unlike LangSmith or Langfuse, Helicone requires NO Python SDK install for core tracing — it works by routing requests through its AI gateway (https://ai-gateway.helicone.ai) via a base_url override on the OpenAI/Anthropic client. All logging happens at the proxy layer. The 'helicone' PyPI package (helicone-helpers) is a thin optional helper with minimal functionality. Primary integration is via HTTP headers and base_url, not a Python library.","status":"active","version":"helicone-helpers 1.0.3 (optional stub)","language":"python","source_language":"en","source_url":"https://github.com/Helicone/helicone","tags":["helicone","observability","llm-proxy","ai-gateway","llm-monitoring","proxy-based","openai-compatible","caching"],"install":[{"cmd":"# No pip install required for core functionality","lang":"bash","label":"Core tracing — base_url change only"},{"cmd":"pip install helicone-helpers","lang":"bash","label":"Optional thin helper package (minimal utility)"}],"dependencies":[],"imports":[{"note":"There is no helicone.instrument() or @helicone.trace decorator. 
The integration is entirely at the HTTP transport level.","wrong":"import helicone; helicone.instrument()","symbol":"No SDK import required","correct":"# Just override base_url on your existing OpenAI/Anthropic client"}],"quickstart":{"code":"import os\nimport openai\n\n# Core integration: change base_url, add auth header\n# No pip install needed beyond openai\nclient = openai.OpenAI(\n    api_key=os.environ['OPENAI_API_KEY'],\n    base_url='https://ai-gateway.helicone.ai/openai/v1',\n    default_headers={\n        'Helicone-Auth': f'Bearer {os.environ[\"HELICONE_API_KEY\"]}',\n    }\n)\n\nresponse = client.chat.completions.create(\n    model='gpt-4o',\n    messages=[{'role': 'user', 'content': 'Hello!'}]\n)\n# Request is now logged in your Helicone dashboard\n\n# Add metadata via headers\nclient_with_metadata = openai.OpenAI(\n    api_key=os.environ['OPENAI_API_KEY'],\n    base_url='https://ai-gateway.helicone.ai/openai/v1',\n    default_headers={\n        'Helicone-Auth': f'Bearer {os.environ[\"HELICONE_API_KEY\"]}',\n        'Helicone-User-Id': 'user-123',           # per-user tracking\n        'Helicone-Session-Id': 'session-abc',      # session grouping\n        'Helicone-Cache-Enabled': 'true',          # response caching\n    }\n)\n\n# Anthropic integration\nimport anthropic\nclient_anthropic = anthropic.Anthropic(\n    api_key=os.environ['ANTHROPIC_API_KEY'],\n    base_url='https://ai-gateway.helicone.ai/anthropic',\n    default_headers={\n        'Helicone-Auth': f'Bearer {os.environ[\"HELICONE_API_KEY\"]}',\n    }\n)","lang":"python","description":"Helicone's entire Python integration is a base_url change. No import, no decorator, no SDK call. Features like caching, rate limiting, user tracking, and prompt versioning are all controlled via HTTP headers added to default_headers."},"warnings":[{"fix":"Do not expect a Python instrumentation library. 
Helicone's Python integration is: (1) change base_url to https://ai-gateway.helicone.ai, (2) add Helicone-Auth header.","message":"There is no meaningful Python package to install. The 'helicone' and 'helicone-helpers' PyPI packages are thin stubs with minimal utility. Most Helicone documentation and features are implemented via base_url + HTTP headers, not Python code. Searching PyPI for 'helicone' and expecting a rich SDK like LangSmith or Langfuse will lead to confusion.","severity":"gotcha","affected_versions":"all"},{"fix":"For data residency requirements, use Helicone self-hosted (Docker/Helm) and point base_url at your own instance. EU users: check if the EU region endpoint satisfies GDPR requirements.","message":"All LLM requests route through Helicone's cloud proxy. This adds ~10ms latency (Cloudflare Workers) and means all prompt/response content passes through Helicone's infrastructure. Not suitable for environments with strict data residency requirements without the self-hosted option.","severity":"gotcha","affected_versions":"all"},{"fix":"Verify the gateway is active by checking your Helicone dashboard after the first request. Confirm base_url is set AND Helicone-Auth header is present.","message":"Helicone headers are silently ignored if misspelled or if the base_url is wrong. For example, passing the Helicone-Auth header but keeping the original OpenAI base_url sends the header directly to OpenAI — no error, no tracing, and OpenAI ignores the unknown header.","severity":"gotcha","affected_versions":"all"},{"fix":"Disable caching in dev/test by omitting the Helicone-Cache-Enabled header or setting it to 'false'. Enable only in production for cost savings.","message":"Prompt caching via Helicone-Cache-Enabled returns cached responses that bypass your LLM provider entirely. In development/testing, this can cause confusing stale results. 
The cache key is derived from the full request body, so any change to the prompt, however small, results in a cache miss.","severity":"gotcha","affected_versions":"all"},{"fix":"Install the provider SDK separately with `pip install openai` (or `pip install anthropic`) before running the quickstart.","message":"The quickstart fails with 'ModuleNotFoundError: No module named openai'. Helicone itself needs no SDK install, but the provider client library (`openai` or `anthropic`) is still required and is not pulled in automatically.","severity":"breaking","affected_versions":"all"}],"env_vars":{"required":[{"name":"HELICONE_API_KEY","note":"Get from helicone.ai/settings/keys. Passed as Helicone-Auth header or embedded in base_url path."}]},"last_verified":"2026-05-12T04:49:18.491Z","next_check":"2026-05-28T00:00:00.000Z","problems":[{"fix":"Verify your request payload, ensure all required HTTP headers (like `Content-Type`, `Helicone-Auth`, and `anthropic-version` for Anthropic) are correctly set, and confirm the `base_url` points to the correct Helicone endpoint (e.g., `https://ai-gateway.helicone.ai/openai/v1` for OpenAI or `https://ai-gateway.helicone.ai/anthropic` for Anthropic).","cause":"This error typically indicates that your API request to Helicone or the underlying LLM provider is malformed, has incorrect headers, or contains an invalid payload format.","error":"400 Bad Request"},{"fix":"Ensure that your `HELICONE_API_KEY` is correctly set and, if using direct header integration, confirm the `Helicone-Auth` header includes the 'Bearer' prefix (e.g., `Helicone-Auth: Bearer YOUR_HELICONE_API_KEY`). 
Also, verify that your underlying provider's API key (e.g., `OPENAI_API_KEY`) is valid.","cause":"This error signifies that the API keys provided are missing or invalid, preventing authentication with Helicone or the upstream LLM provider.","error":"401 Unauthorized"},{"fix":"Check the status of your LLM provider and review Helicone's dashboard or logs for more specific error details. Adding `Helicone-Retry-Enabled: true` to your request headers can mitigate transient provider issues by automatically retrying failed requests.","cause":"This error indicates a server-side problem, either within the Helicone gateway or originating from the upstream LLM provider.","error":"500 Internal Server Error"},{"fix":"For self-hosted Docker deployments, ensure that `SUPABASE_SERVICE_ROLE_KEY` is configured with a real JWT signed with the same secret PostgREST uses, then restart the worker services. Also verify network connectivity between your Helicone containers.","cause":"This error typically occurs in self-hosted Helicone deployments when the worker proxy cannot log data to the Helicone Jawn service because of authentication or network connectivity problems between internal components.","error":"Auth failed! 
Network connection lost"}],"ecosystem":"pypi","meta_description":null,"install_score":80,"install_tag":"verified","quickstart_score":0,"quickstart_tag":"stale","pypi_latest":null,"install_checks":{"last_tested":"2026-05-12","tag":"verified","tag_description":"installs cleanly on critical runtimes, fast import, recently tested","results":[{"runtime":"python:3.10-alpine","python_version":"3.10","os_libc":"alpine (musl)","variant":"default","exit_code":0,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":null,"mem_mb":null,"disk_size":"73.9M"},{"runtime":"python:3.10-slim","python_version":"3.10","os_libc":"slim (glibc)","variant":"default","exit_code":0,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":null,"mem_mb":null,"disk_size":"145M"},{"runtime":"python:3.11-alpine","python_version":"3.11","os_libc":"alpine (musl)","variant":"default","exit_code":0,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":null,"mem_mb":null,"disk_size":"81.1M"},{"runtime":"python:3.11-slim","python_version":"3.11","os_libc":"slim (glibc)","variant":"default","exit_code":0,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":null,"mem_mb":null,"disk_size":"152M"},{"runtime":"python:3.12-alpine","python_version":"3.12","os_libc":"alpine (musl)","variant":"default","exit_code":0,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":null,"mem_mb":null,"disk_size":"71.4M"},{"runtime":"python:3.12-slim","python_version":"3.12","os_libc":"slim (glibc)","variant":"default","exit_code":0,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":null,"mem_mb":null,"disk_size":"143M"},{"runtime":"python:3.13-alpine","python_version":"3.13","os_libc":"alpine 
(musl)","variant":"default","exit_code":0,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":null,"mem_mb":null,"disk_size":"67.5M"},{"runtime":"python:3.13-slim","python_version":"3.13","os_libc":"slim (glibc)","variant":"default","exit_code":0,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":null,"mem_mb":null,"disk_size":"141M"},{"runtime":"python:3.9-alpine","python_version":"3.9","os_libc":"alpine (musl)","variant":"default","exit_code":0,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":null,"mem_mb":null,"disk_size":"72.8M"},{"runtime":"python:3.9-slim","python_version":"3.9","os_libc":"slim (glibc)","variant":"default","exit_code":0,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":null,"mem_mb":null,"disk_size":"144M"}]},"quickstart_checks":{"last_tested":"2026-05-11","tag":"stale","tag_description":"widespread failures or data too old to trust","results":[{"runtime":"python:3.10-alpine","exit_code":1},{"runtime":"python:3.10-slim","exit_code":1},{"runtime":"python:3.11-alpine","exit_code":1},{"runtime":"python:3.11-slim","exit_code":1},{"runtime":"python:3.12-alpine","exit_code":1},{"runtime":"python:3.12-slim","exit_code":1},{"runtime":"python:3.13-alpine","exit_code":1},{"runtime":"python:3.13-slim","exit_code":1},{"runtime":"python:3.9-alpine","exit_code":1},{"runtime":"python:3.9-slim","exit_code":1}]}}
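The record above stresses that a wrong base_url or misspelled header disables tracing silently. One way to guard against that is to build the client kwargs in a single place. Below is a minimal stdlib-only sketch; the helper name `helicone_client_kwargs` and its keyword arguments are illustrative (not part of any Helicone package), and the gateway paths are the ones used in the quickstart.

```python
# Sketch: build the kwargs that route an OpenAI/Anthropic client through
# Helicone's AI gateway. Pure stdlib; pass the result to openai.OpenAI(**kw)
# or anthropic.Anthropic(**kw) alongside your provider api_key.

HELICONE_GATEWAY = "https://ai-gateway.helicone.ai"

# Provider-specific path suffixes, as used in the quickstart.
_PROVIDER_PATHS = {"openai": "/openai/v1", "anthropic": "/anthropic"}

def helicone_client_kwargs(provider, helicone_api_key, *,
                           user_id=None, session_id=None, cache=False):
    """Return base_url/default_headers kwargs for a Helicone-proxied client."""
    if provider not in _PROVIDER_PATHS:
        raise ValueError(f"unknown provider: {provider!r}")
    headers = {"Helicone-Auth": f"Bearer {helicone_api_key}"}
    if user_id:
        headers["Helicone-User-Id"] = user_id        # per-user tracking
    if session_id:
        headers["Helicone-Session-Id"] = session_id  # session grouping
    if cache:
        headers["Helicone-Cache-Enabled"] = "true"   # opt-in; avoid in dev/test
    return {"base_url": HELICONE_GATEWAY + _PROVIDER_PATHS[provider],
            "default_headers": headers}

kw = helicone_client_kwargs("openai", "example-helicone-key", user_id="user-123")
print(kw["base_url"])  # → https://ai-gateway.helicone.ai/openai/v1
```

Centralizing the gateway URL and auth header like this keeps every client consistent, so a typo cannot quietly send the `Helicone-Auth` header straight to the provider with tracing disabled.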