OpenLLMetry (Standalone Instrumentors)

0.52.5 (versioned in sync with traceloop-sdk) | verified Tue May 12 | auth: no | python | install: stale | quickstart: stale

OpenLLMetry is a project name, not a single installable PyPI package. It refers to two related but distinct things that must not be confused:

(1) The traceloop-sdk — the high-level Traceloop SDK (already documented separately) that wraps OpenLLMetry auto-instrumentation behind Traceloop.init().

(2) Standalone opentelemetry-instrumentation-* packages — individual OTel instrumentors for specific LLM providers/frameworks, published from the traceloop/openllmetry GitHub monorepo. These can be used WITHOUT traceloop-sdk in any existing OpenTelemetry setup.

ECOSYSTEM CONFUSION: There is a second, competing instrumentation ecosystem called OpenInference (from Arize-ai/openinference), which publishes openinference-instrumentation-* packages. Both OpenLLMetry and OpenInference instrumentors cover the same providers (OpenAI, LangChain, etc.), but they use different span attribute schemas and route to different preferred backends. They are NOT interchangeable.

pip install opentelemetry-instrumentation-openai
error ModuleNotFoundError: No module named 'traceloop.sdk'
cause The main OpenLLMetry SDK package, `traceloop-sdk`, has not been installed in your Python environment.
fix
pip install traceloop-sdk
error Authentication Failed
cause The `TRACELOOP_API_KEY` environment variable is either missing, incorrect, revoked, or not configured properly for your Traceloop project and environment.
fix
Set the TRACELOOP_API_KEY environment variable with a valid API key obtained from your Traceloop dashboard (e.g., export TRACELOOP_API_KEY='your_api_key_here' or in a .env file).
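A fail-fast startup check (a generic sketch, not part of any Traceloop API) surfaces a missing key immediately instead of as an opaque "Authentication Failed" later:

```python
import os

def require_env(name: str) -> str:
    """Return the value of an environment variable, or raise a clear error."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(
            f"{name} is not set - export it or add it to your .env file"
        )
    return value

# Call before initializing anything that needs the key, e.g.:
# require_env("TRACELOOP_API_KEY")
```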
error ModuleNotFoundError: No module named 'opentelemetry.instrumentation.<llm_provider>'
cause The specific OpenTelemetry instrumentation package for the LLM provider or framework you are using (e.g., OpenAI, LangChain, Redis) has not been installed.
fix
Install the relevant instrumentation package, for example: pip install opentelemetry-instrumentation-openai or pip install opentelemetry-instrumentation-langchain.
error ImportError: cannot import name 'OpenAI' from 'openai'
cause This typically occurs due to a naming conflict (e.g., a local `openai.py` file), or an outdated `openai` library version where the `OpenAI` class is not directly exposed for import.
fix
Rename any local file named openai.py to avoid conflicts. Ensure your OpenAI library is updated: pip install --upgrade openai. If using a pre-1.0 OpenAI version, the import syntax differs.
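To confirm whether a local file is shadowing the installed library, you can check where Python would import `openai` from. This uses only the stdlib and makes no assumption about which openai version is installed:

```python
import importlib.util
from typing import Optional

def module_origin(name: str) -> Optional[str]:
    """Return the file path a module would be imported from, or None if unavailable."""
    try:
        spec = importlib.util.find_spec(name)
    except ModuleNotFoundError:
        return None
    return getattr(spec, "origin", None) if spec else None

# A path inside your project directory (rather than site-packages) means a
# local openai.py is shadowing the real package:
print(module_origin("openai"))
```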
error opentelemetry-sdk <version> requires opentelemetry-api==<version>, but you have opentelemetry-api <other_version> which is incompatible.
cause This indicates a dependency conflict between different `opentelemetry` core packages (like `sdk`, `api`, `semantic-conventions`) and the `opentelemetry-instrumentation-*` packages, which demand specific compatible versions.
fix
Upgrade all opentelemetry related packages together to compatible versions. A common approach is to list them explicitly: pip install --upgrade opentelemetry-sdk opentelemetry-api opentelemetry-semantic-conventions opentelemetry-instrumentation-langchain (adjust package names as needed).
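A quick audit of the installed OTel core packages (stdlib `importlib.metadata`, no OTel import required) makes version mismatches visible before you start pinning anything:

```python
from importlib.metadata import version, PackageNotFoundError

def otel_versions(packages=("opentelemetry-api",
                            "opentelemetry-sdk",
                            "opentelemetry-semantic-conventions")):
    """Map each package name to its installed version, or None if absent."""
    result = {}
    for pkg in packages:
        try:
            result[pkg] = version(pkg)
        except PackageNotFoundError:
            result[pkg] = None
    return result

for pkg, ver in otel_versions().items():
    print(pkg, ver or "not installed")
```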
breaking There is no 'openllmetry' package on PyPI. pip install openllmetry fails. The correct packages are individual opentelemetry-instrumentation-* packages (for standalone use) or traceloop-sdk (for managed use). This is the most common mistake from reading OpenLLMetry documentation without checking package names.
fix pip install opentelemetry-instrumentation-openai (or whichever provider you need). Full package list: https://github.com/traceloop/openllmetry/tree/main/packages
breaking Two competing ecosystems publish instrumentors for the same providers under similar-looking package names: opentelemetry-instrumentation-openai (OpenLLMetry/Traceloop, imports from opentelemetry.instrumentation.*) vs openinference-instrumentation-openai (OpenInference/Arize, imports from openinference.instrumentation.*). Mixing instrumentors from both ecosystems in the same app generates spans with conflicting attribute schemas. Most backends are only tuned for one schema.
fix Pick one ecosystem and stick to it. Using Phoenix or Arize AX? Use openinference-instrumentation-* packages. Using Datadog, Traceloop cloud, or a generic OTLP backend? Use opentelemetry-instrumentation-* packages (OpenLLMetry). Never mix both in the same app unless you have a span processor that normalizes the schemas.
breaking Instrumentors must be called BEFORE the target library is imported or used. Calling OpenAIInstrumentor().instrument() after openai has already been imported and used will not retroactively patch existing client instances. New instances created after .instrument() may work, but this is unreliable.
fix Call every .instrument() at application startup, before the target libraries are imported anywhere else. Structure: (1) set up the TracerProvider, (2) call all .instrument() methods, (3) import and use the provider libraries.
gotcha TRACELOOP_TRACE_CONTENT=false must be set BEFORE .instrument() is called. Setting it after instrumentation has been applied has no effect — the content capture is baked in at patch time.
fix Set os.environ['TRACELOOP_TRACE_CONTENT'] = 'false' at the very top of your entry point, before any instrumentation calls.
gotcha Datadog's LLM Observability supports OpenLLMetry (opentelemetry-instrumentation-*) starting at version 0.47+. It explicitly does NOT support OpenInference (openinference-instrumentation-*). Other backends have different compatibility: Phoenix natively supports OpenInference but can accept OpenLLMetry spans via a span processor bridge.
fix Verify your target backend's supported schema before choosing an ecosystem. Check backend docs for 'OpenLLMetry' vs 'OpenInference' support.
breaking ImportError: the OTLP exporter (e.g., OTLPSpanExporter) cannot be found. The `opentelemetry-exporter-otlp` package (or a protocol-specific variant) has not been installed; the instrumentation packages do not pull one in for you. Without an exporter, traces cannot be sent to an OTLP backend.
fix Install an exporter package: `pip install opentelemetry-exporter-otlp` (a convenience package that pulls in both the gRPC and HTTP/protobuf exporters), or pick one variant directly: `pip install opentelemetry-exporter-otlp-proto-grpc` (gRPC) or `pip install opentelemetry-exporter-otlp-proto-http` (HTTP/protobuf).
pip install opentelemetry-instrumentation-anthropic
pip install opentelemetry-instrumentation-langchain
pip install opentelemetry-instrumentation-llamaindex
pip install opentelemetry-instrumentation-chromadb opentelemetry-instrumentation-pinecone opentelemetry-instrumentation-weaviate
python | os / libc variant | package | wheel | install | import | disk   ('-' = not verified)
3.10 alpine (musl) opentelemetry-instrumentation-anthropic - - - -
3.10 alpine (musl) opentelemetry-instrumentation-chromadb - - - -
3.10 alpine (musl) opentelemetry-instrumentation-langchain - - - -
3.10 alpine (musl) opentelemetry-instrumentation-llamaindex - - - -
3.10 alpine (musl) opentelemetry-instrumentation-openai - - - -
3.10 slim (glibc) opentelemetry-instrumentation-anthropic - - - -
3.10 slim (glibc) opentelemetry-instrumentation-chromadb - - - -
3.10 slim (glibc) opentelemetry-instrumentation-langchain - - - -
3.10 slim (glibc) opentelemetry-instrumentation-llamaindex - - - -
3.10 slim (glibc) opentelemetry-instrumentation-openai - - - -
3.11 alpine (musl) opentelemetry-instrumentation-anthropic - - - -
3.11 alpine (musl) opentelemetry-instrumentation-chromadb - - - -
3.11 alpine (musl) opentelemetry-instrumentation-langchain - - - -
3.11 alpine (musl) opentelemetry-instrumentation-llamaindex - - - -
3.11 alpine (musl) opentelemetry-instrumentation-openai - - - -
3.11 slim (glibc) opentelemetry-instrumentation-anthropic - - - -
3.11 slim (glibc) opentelemetry-instrumentation-chromadb - - - -
3.11 slim (glibc) opentelemetry-instrumentation-langchain - - - -
3.11 slim (glibc) opentelemetry-instrumentation-llamaindex - - - -
3.11 slim (glibc) opentelemetry-instrumentation-openai - - - -
3.12 alpine (musl) opentelemetry-instrumentation-anthropic - - - -
3.12 alpine (musl) opentelemetry-instrumentation-chromadb - - - -
3.12 alpine (musl) opentelemetry-instrumentation-langchain - - - -
3.12 alpine (musl) opentelemetry-instrumentation-llamaindex - - - -
3.12 alpine (musl) opentelemetry-instrumentation-openai - - - -
3.12 slim (glibc) opentelemetry-instrumentation-anthropic - - - -
3.12 slim (glibc) opentelemetry-instrumentation-chromadb - - - -
3.12 slim (glibc) opentelemetry-instrumentation-langchain - - - -
3.12 slim (glibc) opentelemetry-instrumentation-llamaindex - - - -
3.12 slim (glibc) opentelemetry-instrumentation-openai - - - -
3.13 alpine (musl) opentelemetry-instrumentation-anthropic - - - -
3.13 alpine (musl) opentelemetry-instrumentation-chromadb - - - -
3.13 alpine (musl) opentelemetry-instrumentation-langchain - - - -
3.13 alpine (musl) opentelemetry-instrumentation-llamaindex - - - -
3.13 alpine (musl) opentelemetry-instrumentation-openai - - - -
3.13 slim (glibc) opentelemetry-instrumentation-anthropic - - - -
3.13 slim (glibc) opentelemetry-instrumentation-chromadb - - - -
3.13 slim (glibc) opentelemetry-instrumentation-langchain - - - -
3.13 slim (glibc) opentelemetry-instrumentation-llamaindex - - - -
3.13 slim (glibc) opentelemetry-instrumentation-openai - - - -
3.9 alpine (musl) opentelemetry-instrumentation-anthropic - - - -
3.9 alpine (musl) opentelemetry-instrumentation-chromadb - - - -
3.9 alpine (musl) opentelemetry-instrumentation-langchain - - - -
3.9 alpine (musl) opentelemetry-instrumentation-llamaindex - - - -
3.9 alpine (musl) opentelemetry-instrumentation-openai - - - -
3.9 slim (glibc) opentelemetry-instrumentation-anthropic - - - -
3.9 slim (glibc) opentelemetry-instrumentation-chromadb - - - -
3.9 slim (glibc) opentelemetry-instrumentation-langchain - - - -
3.9 slim (glibc) opentelemetry-instrumentation-llamaindex - - - -
3.9 slim (glibc) opentelemetry-instrumentation-openai - - - -

TracerProvider must be configured before calling .instrument(). Instrumentors do not create or manage the TracerProvider — that is your responsibility. If you want zero-config setup, use traceloop-sdk instead (it manages the TracerProvider for you).

# Standalone usage — no traceloop-sdk required
# Requires only: opentelemetry-sdk + opentelemetry-exporter-otlp + individual instrumentors

import os
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# 1. Set up TracerProvider and exporter (any OTLP backend)
tracer_provider = TracerProvider()
exporter = OTLPSpanExporter(
    endpoint='http://localhost:4318/v1/traces',  # Jaeger, Grafana, Phoenix, Datadog, etc.
)
tracer_provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(tracer_provider)

# 2. Instrument providers — must call before importing/using the library
from opentelemetry.instrumentation.openai import OpenAIInstrumentor
from opentelemetry.instrumentation.langchain import LangChainInstrumentor

OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)
LangChainInstrumentor().instrument(tracer_provider=tracer_provider)

# 3. Use your libraries normally — spans are captured automatically
import openai
client = openai.OpenAI(api_key=os.environ['OPENAI_API_KEY'])
response = client.chat.completions.create(
    model='gpt-4o',
    messages=[{'role': 'user', 'content': 'Hello!'}]
)

# To suppress prompt/completion content from traces (PII/privacy), set
#   os.environ['TRACELOOP_TRACE_CONTENT'] = 'false'
# at the TOP of this file, before step 2: setting it here, after
# .instrument() has already run, has no effect.

# Using with traceloop-sdk (alternative — SDK manages the provider):
# from traceloop.sdk import Traceloop
# Traceloop.init(app_name='my-app')  # handles all of the above automatically