{"id":4150,"library":"openinference-instrumentation-openai","title":"OpenInference OpenAI Instrumentation","description":"OpenInference OpenAI Instrumentation is a Python auto-instrumentation library for OpenAI's Python SDK. It automatically generates OpenTelemetry-compatible traces from OpenAI API calls, which can be sent to an OpenTelemetry collector, such as Arize Phoenix, for observability and analysis. The library is actively maintained, with frequent updates across the OpenInference ecosystem.","status":"active","version":"0.1.44","language":"en","source_language":"en","source_url":"https://github.com/Arize-ai/openinference/tree/main/python/instrumentation/openinference-instrumentation-openai","tags":["OpenTelemetry","OpenAI","LLM","observability","tracing","instrumentation","AI"],"install":[{"cmd":"pip install openinference-instrumentation-openai \"openai>=1.26\" opentelemetry-sdk opentelemetry-exporter-otlp arize-phoenix","lang":"bash","label":"Full Quickstart Dependencies"},{"cmd":"pip install openinference-instrumentation-openai","lang":"bash","label":"Minimal Installation"}],"dependencies":[{"reason":"Required to instrument OpenAI API calls. Version `openai>=1.26` is recommended for full functionality, including streaming token counts.","package":"openai","optional":false},{"reason":"Core OpenTelemetry SDK components for creating and processing traces.","package":"opentelemetry-sdk","optional":false},{"reason":"OpenTelemetry API for interacting with the tracing system (installed implicitly with opentelemetry-sdk).","package":"opentelemetry-api","optional":false},{"reason":"OpenTelemetry OTLP exporter for sending traces to a collector (e.g., Phoenix).","package":"opentelemetry-exporter-otlp","optional":false},{"reason":"A recommended OpenTelemetry collector and visualization tool for viewing traces, often used in quickstarts.","package":"arize-phoenix","optional":true}],"imports":[{"symbol":"OpenAIInstrumentor","correct":"from openinference.instrumentation.openai import OpenAIInstrumentor"},{"symbol":"OpenAI Client","correct":"import openai\nclient = openai.OpenAI()"},{"symbol":"OTLPSpanExporter","correct":"from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter"},{"symbol":"TracerProvider","correct":"from opentelemetry.sdk import trace as trace_sdk"},{"symbol":"SimpleSpanProcessor, ConsoleSpanExporter","correct":"from opentelemetry.sdk.trace.export import SimpleSpanProcessor, ConsoleSpanExporter"}],"quickstart":{"code":"import os\nimport openai\nfrom openinference.instrumentation.openai import OpenAIInstrumentor\nfrom opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter\nfrom opentelemetry.sdk import trace as trace_sdk\nfrom opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor\n\n# Fail fast if the OpenAI API key is not set\nif not os.environ.get('OPENAI_API_KEY'):\n    raise RuntimeError('Set the OPENAI_API_KEY environment variable before running this example.')\n\n# Configure an OpenTelemetry TracerProvider that sends traces to a collector (e.g., Phoenix)\nendpoint = \"http://127.0.0.1:6006/v1/traces\"  # Default Phoenix endpoint\ntracer_provider = trace_sdk.TracerProvider()\ntracer_provider.add_span_processor(SimpleSpanProcessor(OTLPSpanExporter(endpoint)))\n# Optionally, also print spans to the console for debugging\ntracer_provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))\n\n# Instrument the OpenAI SDK\nOpenAIInstrumentor().instrument(tracer_provider=tracer_provider)\n\nif __name__ == \"__main__\":\n    client = openai.OpenAI()\n    try:\n        response = client.chat.completions.create(\n            model=\"gpt-3.5-turbo\",\n            messages=[{\"role\": \"user\", \"content\": \"Write a haiku about observability.\"}],\n            max_tokens=20,\n            stream=False,  # Set to True and add stream_options={'include_usage': True} to capture token counts while streaming\n        )\n        print(\"OpenAI API call successful.\")\n        print(f\"Response: {response.choices[0].message.content}\")\n    except openai.AuthenticationError:\n        print(\"Error: OpenAI API key is missing or invalid. Please set OPENAI_API_KEY.\")\n    except Exception as e:\n        print(f\"An unexpected error occurred: {e}\")","lang":"python","description":"This quickstart instruments OpenAI API calls with `openinference-instrumentation-openai` and sends the resulting traces to an OpenTelemetry collector. It configures a `TracerProvider` that exports traces over OTLP/HTTP, then instruments the OpenAI client. Make sure an OpenTelemetry collector (such as Arize Phoenix, started with `python -m phoenix.server.main serve`) is running to receive the traces, and that the `OPENAI_API_KEY` environment variable is set."},"warnings":[{"fix":"Use a compatible `openai` SDK version as specified in the `openinference-instrumentation-openai` documentation or release notes. Upgrade both packages if you see unexpected trace data.","message":"Compatibility with OpenAI SDK versions: the instrumentation may require specific OpenAI SDK versions to correctly handle new output formats or features. For example, `openai>=1.26` is required to capture token counts when using streaming completions with `stream_options={'include_usage': True}`.","severity":"gotcha","affected_versions":"<0.1.44 for some features; generally sensitive to OpenAI SDK updates"},{"fix":"The instrumentor reads `_SUPPRESS_INSTRUMENTATION_KEY` directly when deciding whether to create spans, rather than using OpenTelemetry's `is_instrumentation_suppressed()` utility, so spans can still be created when suppression is intended. If strict suppression is required, filter the unwanted spans manually (for example, in a custom span processor) or drop them at the collector.","message":"The OpenTelemetry `suppress_instrumentation` context flag is not fully respected by `openinference-instrumentation-openai`.","severity":"gotcha","affected_versions":"All versions up to 0.1.44 (as of Jan 2026)"},{"fix":"After `pip install`, restart the Colab runtime. Alternatively, try `OpenAIInstrumentor().instrument(skip_dep_check=True)` as a workaround, though restarting the runtime is generally recommended.","message":"In environments such as Google Colab, `openinference-instrumentation-openai` may fail to instrument correctly immediately after `pip install`; a session restart or an explicit dependency-check bypass may be needed.","severity":"gotcha","affected_versions":"Reported in earlier versions; may affect recent versions in similar environments"}],"env_vars":null,"last_verified":"2026-04-11T00:00:00.000Z","next_check":"2026-07-10T00:00:00.000Z"}