OpenInference OpenAI Instrumentation
OpenInference OpenAI Instrumentation is a Python auto-instrumentation library for OpenAI's Python SDK. It automatically generates OpenTelemetry-compatible traces from OpenAI API calls and can export them to any OpenTelemetry collector, such as Arize Phoenix, for observability and analysis. The library is actively maintained as part of the OpenInference ecosystem.
Warnings
- Gotcha: the instrumentation may require specific OpenAI SDK versions to handle new output formats or features. For example, `openai>=1.26` is required to capture token counts when using streaming completions with `stream_options={'include_usage': True}`.
- Gotcha: the OpenTelemetry `suppress_instrumentation` context flag is not fully respected by `openinference-instrumentation-openai`.
- Gotcha: in certain environments, such as Google Colab, `openinference-instrumentation-openai` may fail to instrument correctly immediately after `pip install`; a session restart or an explicit dependency-check bypass may be needed.
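The streaming gotcha above can be sketched as follows. This is a minimal illustration assuming `openai>=1.26`; the helper name `streaming_kwargs` is ours, not part of either library:

```python
def streaming_kwargs(model: str, messages: list) -> dict:
    """Chat-completion arguments whose final streamed chunk carries token usage,
    which the instrumentation can then record on the span."""
    return {
        "model": model,
        "messages": messages,
        "stream": True,
        # Requires openai>=1.26; older SDKs do not support stream_options.
        "stream_options": {"include_usage": True},
    }

# Typical use (requires a configured client and network access):
# stream = client.chat.completions.create(**streaming_kwargs("gpt-3.5-turbo", msgs))
# for chunk in stream: ...  # the final chunk carries a non-None .usage
```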
Install
- Full setup (instrumentation, SDK, OTLP exporter, Phoenix):
  pip install openinference-instrumentation-openai "openai>=1.26" opentelemetry-sdk opentelemetry-exporter-otlp arize-phoenix
- Instrumentation only:
  pip install openinference-instrumentation-openai
Imports
- OpenAIInstrumentor
from openinference.instrumentation.openai import OpenAIInstrumentor
- OpenAI Client
import openai
client = openai.OpenAI()
- OTLPSpanExporter
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
- TracerProvider
from opentelemetry.sdk import trace as trace_sdk
- SimpleSpanProcessor, ConsoleSpanExporter
from opentelemetry.sdk.trace.export import SimpleSpanProcessor, ConsoleSpanExporter
Quickstart
import os
import openai
from openinference.instrumentation.openai import OpenAIInstrumentor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor
# The OpenAI client reads the API key from the OPENAI_API_KEY environment variable.
# Set it before running; the placeholder below will trigger an AuthenticationError.
os.environ.setdefault('OPENAI_API_KEY', 'sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx')
# Configure OpenTelemetry Tracer Provider to send traces to a collector (e.g., Phoenix)
endpoint = "http://127.0.0.1:6006/v1/traces" # Default Phoenix endpoint
tracer_provider = trace_sdk.TracerProvider()
tracer_provider.add_span_processor(SimpleSpanProcessor(OTLPSpanExporter(endpoint)))
# Optionally, also print spans to the console for debugging
tracer_provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
# Instrument the OpenAI SDK
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)
if __name__ == "__main__":
    client = openai.OpenAI()
    try:
        response = client.chat.completions.create(
            model="gpt-3.5-turbo",
            messages=[{"role": "user", "content": "Write a haiku about observability."}],
            max_tokens=20,
            stream=False,  # set stream=True with stream_options={'include_usage': True} to capture token counts while streaming
        )
        print("OpenAI API call successful.")
        print(f"Response: {response.choices[0].message.content}")
    except openai.AuthenticationError:
        print("Error: OpenAI API key is missing or invalid. Please set OPENAI_API_KEY.")
    except Exception as e:
        print(f"An unexpected error occurred: {e}")
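To detach the instrumentation afterwards (for example between test cases), the instrumentor exposes `uninstrument()` via the OpenTelemetry `BaseInstrumentor` interface it inherits. A guarded sketch, using a hypothetical helper name, that is a no-op when the package is not installed:

```python
def detach_openai_instrumentation() -> bool:
    """Undo OpenAIInstrumentor patches; return False if the package is absent."""
    try:
        from openinference.instrumentation.openai import OpenAIInstrumentor
    except ImportError:
        return False
    OpenAIInstrumentor().uninstrument()
    return True
```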