Traceloop SDK (OpenLLMetry)

v0.52.5 · verified Tue May 12 · auth: yes · python install: stale · quickstart: stale

Python SDK for OpenLLMetry, an open-source, OpenTelemetry-based observability library for LLM applications built and maintained by Traceloop. It auto-instruments 20+ LLM providers (OpenAI, Anthropic, Gemini, Bedrock, Ollama) and frameworks (LangChain, LlamaIndex, CrewAI), and adds manual @workflow/@task/@agent decorators.

KEY RELATIONSHIP: traceloop-sdk is the convenience wrapper around the OpenLLMetry instrumentation packages. 'openllmetry' is NOT a separate installable package; it is the umbrella project name for the GitHub monorepo. The installable SDK is 'traceloop-sdk'.

By default, traces are exported to Traceloop cloud, but the SDK is fully vendor-agnostic: via TRACELOOP_BASE_URL it can export to any OTLP backend (Datadog, Grafana, Jaeger, Honeycomb, etc.) without using Traceloop's platform at all.

pip install traceloop-sdk
error ModuleNotFoundError: No module named 'traceloop'
cause The `traceloop-sdk` Python package has not been installed in the current environment.
fix
pip install traceloop-sdk
error ConnectionRefusedError: [Errno 111] Connection refused
cause The configured OpenTelemetry Collector or OTLP endpoint (e.g., `http://localhost:4318` by default) is not running or is unreachable.
fix
Ensure an OTLP Collector is running and accessible at the specified TRACELOOP_BASE_URL, or verify your TRACELOOP_BASE_URL and TRACELOOP_API_KEY configuration.
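For local development, one way to get an OTLP endpoint listening on the default port is Jaeger's all-in-one image (a sketch: the image tag and port numbers are standard Jaeger defaults, adjust for your own backend or Collector):

```shell
# Start Jaeger all-in-one; 4318 is OTLP/HTTP ingest, 16686 is the Jaeger UI
docker run -d --name jaeger \
  -p 4318:4318 -p 16686:16686 \
  jaegertracing/all-in-one:latest

# Point the SDK at it before starting your app
export TRACELOOP_BASE_URL=http://localhost:4318

# Quick reachability check: an OTLP/HTTP endpoint responds on /v1/traces
# (any HTTP status beats "Connection refused")
curl -s -o /dev/null -w '%{http_code}\n' http://localhost:4318/v1/traces
```

If the curl still reports a connection failure, the problem is the endpoint itself (container not running, wrong port, firewall), not the SDK.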
error Failed to export spans: HTTP status code: 401
cause The `TRACELOOP_API_KEY` environment variable or `api_key` parameter in `Traceloop.init()` is missing or invalid when exporting traces to an OTLP endpoint that requires authentication (like Traceloop Cloud).
fix
Set the TRACELOOP_API_KEY environment variable with a valid API key, or pass it explicitly to Traceloop.init(api_key='your_api_key').
error No traces appear for OpenAI calls (silent failure: no error, no warning)
cause `Traceloop.init()` was called after the LLM client (e.g., `openai.OpenAI()`, `ChatOpenAI()`) was already initialized, preventing the SDK from correctly patching the library.
fix
Ensure Traceloop.init() is called as early as possible in your application's lifecycle, ideally before any LLM client instances are created or used.
breaking Traceloop.init() MUST be called before importing provider libraries (openai, anthropic, langchain, etc.). The SDK instruments providers via OpenTelemetry monkey-patching applied at init time. If you import openai before calling Traceloop.init(), the OpenAI client is not patched and zero spans are generated — no error, no warning, just missing traces.
fix Always structure imports as: (1) from traceloop.sdk import Traceloop, (2) Traceloop.init(...), (3) import openai / import anthropic / etc. Never import provider libraries at the top of your file before Traceloop.init() runs.
breaking @aworkflow decorator is deprecated and will be removed in a future version. @workflow now handles both sync and async functions — use it for both.
fix Replace all @aworkflow with @workflow. No other change needed — the decorator handles async automatically.
gotcha disable_batch defaults to False (batch mode). In production this is correct — batching reduces overhead. In development/local testing, batches are held and flushed on a timer, so traces may not appear in your backend for several seconds after function completion. Confusingly, the process may exit before the batch flushes, causing traces to vanish entirely.
fix Add disable_batch=True to Traceloop.init() in all dev/test environments. Remove it in production.
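One hedged pattern for keeping a single code path (the APP_ENV variable name is our own illustrative convention, not an SDK feature) is to derive the flag from the environment:

```python
import os

def traceloop_init_kwargs(app_name: str) -> dict:
    """Build kwargs for Traceloop.init(): batch spans in production,
    flush each span immediately everywhere else so traces show up right away."""
    env = os.environ.get("APP_ENV", "development")  # hypothetical env var
    return {
        "app_name": app_name,
        "disable_batch": env != "production",
    }

# Then, early in your entrypoint:
#   from traceloop.sdk import Traceloop
#   Traceloop.init(**traceloop_init_kwargs("my-llm-app"))
```

This avoids the classic failure mode of hardcoding disable_batch=True and shipping it to production, or forgetting it locally and wondering where the traces went.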
gotcha There is no installable package named 'openllmetry' on PyPI. The project is called OpenLLMetry and the GitHub repo is traceloop/openllmetry, but the Python package is 'traceloop-sdk'. pip install openllmetry will fail or install a wrong package. Individual instrumentors (for specific providers without the full SDK) are installable as separate packages: e.g. opentelemetry-instrumentation-openai.
fix pip install traceloop-sdk — not openllmetry. If you only need instrumentors for an existing OTel stack (no Traceloop.init()), install the individual opentelemetry-instrumentation-* packages from the openllmetry repo.
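Concretely, the two install routes look like this (package names as published on PyPI):

```shell
# Full SDK: init, decorators, and bundled instrumentors
pip install traceloop-sdk

# Instrumentor-only route for an existing OpenTelemetry stack:
# no Traceloop.init(), you wire up your own TracerProvider
pip install opentelemetry-instrumentation-openai
pip install opentelemetry-instrumentation-anthropic

# This does NOT get you the SDK:
# pip install openllmetry   # no such package
```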
gotcha TRACELOOP_HEADERS uses a non-standard key=value format (not HTTP header format). Example: 'Authorization=Bearer token' NOT 'Authorization: Bearer token'. Using colons instead of equals signs causes the header to not be sent, resulting in 401/403 errors at the OTLP endpoint with no clear error message.
fix Use equals sign: TRACELOOP_HEADERS='Authorization=Bearer <token>'. For multiple headers, comma-separate: 'Authorization=Bearer <token>,x-org-id=123'.
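To see why the colon form fails silently, here is an illustrative parser for the comma-separated key=value syntax (a sketch of how such env vars are typically parsed, mirroring the OTEL_EXPORTER_OTLP_HEADERS convention; this is not the SDK's actual code):

```python
def parse_headers(raw: str) -> dict:
    """Parse 'k1=v1,k2=v2' into a header dict; entries without '=' are dropped."""
    headers = {}
    for entry in raw.split(","):
        key, sep, value = entry.partition("=")
        if sep:  # only well-formed key=value pairs survive
            headers[key.strip()] = value.strip()
    return headers

print(parse_headers("Authorization=Bearer abc123,x-org-id=123"))
# {'Authorization': 'Bearer abc123', 'x-org-id': '123'}

# The HTTP-style colon form contains no '=' and is silently dropped,
# so no Authorization header reaches the endpoint -> 401/403
print(parse_headers("Authorization: Bearer abc123"))
# {}
```

The empty result in the second case is exactly the "header not sent, no error message" behavior described above.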
error ModuleNotFoundError: No module named 'httpx'
cause The `httpx` library, a core dependency of the Traceloop SDK, is missing from the environment (usually a partial or failed install).
fix
Reinstall the SDK (`pip install --force-reinstall traceloop-sdk`) or install the dependency directly with `pip install httpx`.
error ModuleNotFoundError: No module named 'openai'
cause The `openai` package is not installed in the environment where the script runs; the SDK instruments provider libraries but does not install them.
fix
Install the provider package: `pip install openai`.
python  os / libc      wheel  install  import  disk
3.9     alpine (musl)  -      -        1.88s   93.7M
3.9     slim (glibc)   -      -        1.55s   93M
3.10    alpine (musl)  -      -        -       -
3.10    slim (glibc)   -      -        -       -
3.11    alpine (musl)  -      -        -       -
3.11    slim (glibc)   -      -        -       -
3.12    alpine (musl)  -      -        -       -
3.12    slim (glibc)   -      -        -       -
3.13    alpine (musl)  -      -        -       -
3.13    slim (glibc)   -      -        -       -

Import order is load-bearing. Traceloop.init() must be called before importing provider libraries — the SDK uses OTel monkey-patching which only works if the provider hasn't been imported yet. disable_batch=True is strongly recommended in development.

import os

# CRITICAL: Import and init Traceloop BEFORE importing openai, langchain, etc.
# The SDK patches provider modules at init time — late import means no instrumentation.
from traceloop.sdk import Traceloop
from traceloop.sdk.decorators import workflow, task

# Option A: Export to Traceloop cloud (call init exactly once, choose ONE option)
Traceloop.init(
    app_name='my-llm-app',
    api_key=os.environ['TRACELOOP_API_KEY'],
    disable_batch=True  # dev/test only: flush each span immediately instead of batching
)

# Option B: Export to any OTLP backend (no Traceloop account needed).
# Set env vars before running, then init without an api_key:
#   TRACELOOP_BASE_URL=http://localhost:4318   (Jaeger, Grafana Agent, etc.)
#   TRACELOOP_HEADERS='Authorization=Bearer <token>'
# Traceloop.init(app_name='my-llm-app')  # reads env vars automatically

# Provider imports AFTER init
import openai
client = openai.OpenAI()

# Manual tracing with decorators
@workflow(name='answer_question')
def answer_question(question: str) -> str:
    response = client.chat.completions.create(
        model='gpt-4o',
        messages=[{'role': 'user', 'content': question}]
    )
    return response.choices[0].message.content

# @task for sub-steps within a workflow
@task(name='format_prompt')
def format_prompt(question: str) -> str:
    return f'Answer concisely: {question}'

result = answer_question('What is OpenTelemetry?')