OpenLLMetry (Standalone Instrumentors)

0.52.5 (versioned in sync with traceloop-sdk) · active · verified Sat Feb 28

OpenLLMetry is a project name, not a single installable PyPI package. It refers to two related but distinct things that must not be confused:

(1) traceloop-sdk — the high-level Traceloop SDK (documented separately) that wraps OpenLLMetry auto-instrumentation behind Traceloop.init().

(2) Standalone opentelemetry-instrumentation-* packages — individual OTel instrumentors for specific LLM providers/frameworks, published from the traceloop/openllmetry GitHub monorepo. These can be used WITHOUT traceloop-sdk in any existing OpenTelemetry setup.

ECOSYSTEM CONFUSION: A second, competing instrumentation ecosystem, OpenInference (from Arize-ai/openinference), publishes openinference-instrumentation-* packages. OpenLLMetry and OpenInference instrumentors cover the same providers (OpenAI, LangChain, etc.) but use different span attribute schemas and route to different preferred backends. They are NOT interchangeable.
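Because both ecosystems ship similarly named instrumentor distributions, a stdlib-only helper can list which ones an environment actually has installed. This is a sketch; `installed_instrumentors` is a hypothetical helper name, and the prefixes reflect each project's published naming pattern:

```python
# List installed instrumentation distributions from either ecosystem.
# Pure stdlib (importlib.metadata); no third-party imports required.
from importlib import metadata

def installed_instrumentors(prefixes=("opentelemetry-instrumentation-",
                                      "openinference-instrumentation-")):
    """Return installed distribution names matching the given prefixes."""
    found = []
    for dist in metadata.distributions():
        name = (dist.metadata["Name"] or "").lower()
        if name.startswith(prefixes):  # str.startswith accepts a tuple
            found.append(name)
    return sorted(found)

print(installed_instrumentors())
```

An empty list for one prefix family and a populated list for the other tells you which span attribute schema your traces will carry.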

Warnings

TracerProvider must be configured before calling .instrument(). Instrumentors do not create or manage the TracerProvider — that is your responsibility. If you want zero-config setup, use traceloop-sdk instead (it manages the TracerProvider for you).

Install

# Core OTel packages plus one instrumentor per provider/framework
# (distribution names mirror the module paths used in Imports below)
pip install opentelemetry-sdk opentelemetry-exporter-otlp
pip install opentelemetry-instrumentation-openai opentelemetry-instrumentation-langchain

Imports

from opentelemetry.instrumentation.openai import OpenAIInstrumentor
from opentelemetry.instrumentation.langchain import LangChainInstrumentor

Quickstart

# Standalone usage — no traceloop-sdk required
# Requires only: opentelemetry-sdk + opentelemetry-exporter-otlp + individual instrumentors

import os
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# 1. Set up TracerProvider and exporter (any OTLP backend)
tracer_provider = TracerProvider()
exporter = OTLPSpanExporter(
    endpoint='http://localhost:4318/v1/traces',  # Jaeger, Grafana, Phoenix, Datadog, etc.
)
tracer_provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(tracer_provider)

# 2. Instrument providers — must call before importing/using the library
from opentelemetry.instrumentation.openai import OpenAIInstrumentor
from opentelemetry.instrumentation.langchain import LangChainInstrumentor

OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)
LangChainInstrumentor().instrument(tracer_provider=tracer_provider)

# 3. Use your libraries normally — spans are captured automatically
import openai
client = openai.OpenAI(api_key=os.environ['OPENAI_API_KEY'])
response = client.chat.completions.create(
    model='gpt-4o',
    messages=[{'role': 'user', 'content': 'Hello!'}]
)

# To suppress prompt/completion content from traces (PII/privacy), set
# TRACELOOP_TRACE_CONTENT='false' in the environment BEFORE .instrument()
# is called in step 2, e.g.:
#     os.environ['TRACELOOP_TRACE_CONTENT'] = 'false'

# Using with traceloop-sdk (alternative — SDK manages the provider):
# from traceloop.sdk import Traceloop
# Traceloop.init(app_name='my-app')  # handles all of the above automatically
