OpenTelemetry OpenAI Instrumentation (v2)

2.3b0 · active · verified Sun Apr 12

This library provides official OpenTelemetry instrumentation for the OpenAI Python API library (version 1.0.0 and above). It enables automatic tracing of LLM requests, capturing model name, token usage, finish reason, duration, and errors without modifying existing OpenAI client code. It also supports logging of messages and metrics, and is maintained as part of the OpenTelemetry Python Contrib project.
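The "without modifying existing OpenAI client code" part works because `OpenAIInstrumentor().instrument()` patches the client's request methods at runtime. A stdlib-only sketch of that wrapping pattern (`FakeCompletions` and `traced` are illustrative stand-ins, not the real client or instrumentor):

```python
import functools
import time

def traced(fn):
    # Sketch: time the call (a real instrumentor would start a span),
    # invoke the original method, and record the duration even on error.
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        try:
            return fn(*args, **kwargs)
        finally:
            wrapper.last_duration = time.perf_counter() - start
    return wrapper

class FakeCompletions:
    # Hypothetical stand-in for the OpenAI client's completions resource.
    def create(self, model, messages):
        return {"model": model, "content": "hello"}

# "Instrumenting": swap the method for its wrapped version at runtime,
# so existing calling code keeps working unchanged.
FakeCompletions.create = traced(FakeCompletions.create)
result = FakeCompletions().create("test-model", [{"role": "user", "content": "hi"}])
```

Calling `uninstrument()` on the real instrumentor reverses the patching in the same way.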

Warnings
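One caution documented for this instrumentation: chat message content (prompts and completions) can contain sensitive data, so it is not captured by default and must be enabled explicitly via an environment variable:

```shell
# Message content is not recorded by default; opt in explicitly.
# Captured content may include sensitive information.
export OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT=true
```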

Install

Imports

Quickstart

This quickstart demonstrates how to set up OpenTelemetry with the OpenAI instrumentation. It initializes a `TracerProvider` with a `ConsoleSpanExporter` to print traces to the console, then calls `OpenAIInstrumentor().instrument()` to automatically trace interactions with the OpenAI Python client (v1.0.0+). An example OpenAI chat completion call is included.

import os
from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor
from opentelemetry.instrumentation.openai_v2 import OpenAIInstrumentor
from openai import OpenAI

# Configure OpenTelemetry Tracer Provider
def setup_and_instrument_otel():
    resource = Resource.create({"service.name": "my-openai-app"})
    provider = TracerProvider(resource=resource)
    processor = SimpleSpanProcessor(ConsoleSpanExporter())
    provider.add_span_processor(processor)
    trace.set_tracer_provider(provider)

    # Instrument OpenAI
    OpenAIInstrumentor().instrument()
    print("OpenTelemetry and OpenAI instrumentation initialized.")


if __name__ == "__main__":
    # The OpenAI client reads OPENAI_API_KEY from the environment
    if not os.environ.get("OPENAI_API_KEY"):
        print("Please set the OPENAI_API_KEY environment variable.")
        raise SystemExit(1)

    setup_and_instrument_otel()

    client = OpenAI()

    try:
        print("\nMaking an OpenAI chat completion call...")
        chat_completion = client.chat.completions.create(
            model="gpt-3.5-turbo",
            messages=[
                {"role": "user", "content": "Tell me a short story about OpenTelemetry."}
            ]
        )
        print("OpenAI call successful. Check console for traces.")
        # print(chat_completion.choices[0].message.content)
    except Exception as e:
        print(f"An error occurred during OpenAI call: {e}")

    # In production you'd typically export spans to a collector (e.g. for Jaeger)
    # rather than the console; ConsoleSpanExporter is used here for simplicity.
