{"id":4670,"library":"opentelemetry-instrumentation-openai-v2","title":"OpenTelemetry OpenAI Instrumentation (v2)","description":"This library provides official OpenTelemetry instrumentation for the OpenAI Python API library (version 1.0.0 and above). It enables automatic tracing of LLM requests, capturing model name, token usage, finish reason, duration, and errors without modifying existing OpenAI client code. It also supports logging of messages and metrics, and is maintained as part of the OpenTelemetry Python Contrib project.","status":"active","version":"2.3b0","language":"en","source_language":"en","source_url":"https://github.com/open-telemetry/opentelemetry-python-contrib","tags":["opentelemetry","observability","tracing","openai","llm","ai","genai"],"install":[{"cmd":"pip install opentelemetry-instrumentation-openai-v2 opentelemetry-sdk openai","lang":"bash","label":"Install core and OpenAI client"}],"dependencies":[{"reason":"This instrumentation specifically targets OpenAI Python SDK version 1.0.0 and above, which introduced significant API changes.","package":"openai","optional":false},{"reason":"Core OpenTelemetry SDK for tracing and metrics.","package":"opentelemetry-sdk","optional":false},{"reason":"Core OpenTelemetry API for common interfaces.","package":"opentelemetry-api","optional":false},{"reason":"Provides standard semantic attributes for telemetry data.","package":"opentelemetry-semantic-conventions","optional":false},{"reason":"Utility library for Generative AI related OpenTelemetry instrumentations.","package":"opentelemetry-util-genai","optional":false}],"imports":[{"note":"The `_v2` suffix is crucial. `opentelemetry.instrumentation.openai` refers to an older, community-maintained instrumentation that does not support the OpenAI Python SDK v1.x API.","wrong":"from opentelemetry.instrumentation.openai import OpenAIInstrumentor","symbol":"OpenAIInstrumentor","correct":"from opentelemetry.instrumentation.openai_v2 import OpenAIInstrumentor"}],"quickstart":{"code":"import os\nfrom opentelemetry import trace\nfrom opentelemetry.sdk.resources import Resource\nfrom opentelemetry.sdk.trace import TracerProvider\nfrom opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor\nfrom opentelemetry.instrumentation.openai_v2 import OpenAIInstrumentor\nfrom openai import OpenAI\n\n# Configure OpenTelemetry Tracer Provider\ndef setup_and_instrument_otel():\n    resource = Resource.create({\"service.name\": \"my-openai-app\"})\n    provider = TracerProvider(resource=resource)\n    processor = SimpleSpanProcessor(ConsoleSpanExporter())\n    provider.add_span_processor(processor)\n    trace.set_tracer_provider(provider)\n\n    # Instrument OpenAI\n    OpenAIInstrumentor().instrument()\n    print(\"OpenTelemetry and OpenAI instrumentation initialized.\")\n\n\nif __name__ == \"__main__\":\n    # Require the OpenAI API key to be provided via the environment\n    if not os.environ.get(\"OPENAI_API_KEY\"):\n        print(\"Please set the OPENAI_API_KEY environment variable.\")\n        raise SystemExit(1)\n\n    setup_and_instrument_otel()\n\n    client = OpenAI()\n\n    try:\n        print(\"\\nMaking an OpenAI chat completion call...\")\n        chat_completion = client.chat.completions.create(\n            model=\"gpt-3.5-turbo\",\n            messages=[\n                {\"role\": \"user\", \"content\": \"Tell me a short story about OpenTelemetry.\"}\n            ]\n        )\n        print(\"OpenAI call successful. Check console for traces.\")\n        # print(chat_completion.choices[0].message.content)\n    except Exception as e:\n        print(f\"An error occurred during OpenAI call: {e}\")\n\n    # To see traces in production, export to a collector (e.g., Jaeger) instead of ConsoleSpanExporter.\n    # This example prints to the console to show basic functionality.\n","lang":"python","description":"This quickstart demonstrates how to set up OpenTelemetry with the OpenAI instrumentation. It initializes a `TracerProvider` with a `ConsoleSpanExporter` to print traces to the console, then calls `OpenAIInstrumentor().instrument()` to automatically trace interactions with the OpenAI Python client (v1.0.0+). An example OpenAI chat completion call is included."},"warnings":[{"fix":"Ensure your `openai` package is `openai>=1.0.0`. Use `pip install opentelemetry-instrumentation-openai-v2` and import `OpenAIInstrumentor` from `opentelemetry.instrumentation.openai_v2`. Refer to OpenAI's official v1 migration guide for SDK changes.","message":"This instrumentation (`opentelemetry-instrumentation-openai-v2`) is designed for the OpenAI Python SDK v1.0.0 and above. The OpenAI SDK v1.0.0 introduced extensive breaking changes, including a complete rewrite of the client API. Users migrating from older OpenAI SDK versions (pre-1.0.0) or older OpenTelemetry OpenAI instrumentations must update their code and use this `-v2` package.","severity":"breaking","affected_versions":"< 2.0.0"},{"fix":"Monitor release notes for breaking changes, especially when upgrading between minor beta versions. Configure your OpenTelemetry collector or backend to handle potential changes in attribute names by using schema URL transformations if available, or by adjusting dashboards/alerts.","message":"The package is in beta (`b0`) status, which indicates that API stability is not guaranteed and breaking changes may occur in minor versions without notice. Furthermore, semantic convention updates (e.g., to 1.30.0 as noted in the `2.3b0` release) can change attribute names used in telemetry.","severity":"gotcha","affected_versions":">= 2.0.0b0"},{"fix":"To enable content capture for logs, set the environment variable `OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT` to `true`. Be aware of the implications for sensitive data.","message":"Message content, such as prompts, completions, and function arguments/return values, is not captured by default due to privacy and data sensitivity concerns.","severity":"gotcha","affected_versions":"All versions"},{"fix":"Ensure you are using the latest available `opentelemetry-instrumentation-openai-v2` version. Test thoroughly if using `with_raw_response` or streaming functionalities, especially if encountering `AttributeError`.","message":"Early beta versions (including `2.3b0`) included fixes for `AttributeError` when handling `LegacyAPIResponse` (from `with_raw_response`) and crashes with streaming `with_raw_response`. This indicates that using less common or raw response patterns with the OpenAI client might have previously led to instrumentation errors.","severity":"gotcha","affected_versions":"<= 2.3b0"}],"env_vars":null,"last_verified":"2026-04-12T00:00:00.000Z","next_check":"2026-07-11T00:00:00.000Z"}