{"id":4155,"library":"opentelemetry-instrumentation-openai-agents","title":"OpenTelemetry OpenAI Agents Instrumentation","description":"This library, part of Traceloop's OpenLLMetry project, provides OpenTelemetry instrumentation for the `openai-agents` SDK: it converts the rich trace data emitted by the Agents runtime into the GenAI semantic conventions, enriches spans with request/response payload metadata, and records duration and token-usage metrics. It is currently at version `0.58.0` and is updated frequently, often to track the evolving OpenTelemetry Generative AI semantic conventions.","status":"active","version":"0.58.0","language":"en","source_language":"en","source_url":"https://github.com/traceloop/openllmetry/tree/main/packages/opentelemetry-instrumentation-openai-agents","tags":["opentelemetry","openai","agents","llm","instrumentation","tracing","observability","genai"],"install":[{"cmd":"pip install opentelemetry-instrumentation-openai-agents openai-agents opentelemetry-sdk","lang":"bash","label":"Install library and core dependencies"}],"dependencies":[{"reason":"The core library being instrumented.","package":"openai-agents"},{"reason":"Core OpenTelemetry API for defining telemetry.","package":"opentelemetry-api"},{"reason":"OpenTelemetry SDK for processing and exporting telemetry.","package":"opentelemetry-sdk"},{"reason":"Provides the GenAI semantic conventions used by this instrumentation.","package":"opentelemetry-semantic-conventions"}],"imports":[{"note":"This is the main class to initialize the OpenAI Agents instrumentation.","symbol":"OpenAIAgentsInstrumentor","correct":"from opentelemetry.instrumentation.openai_agents import OpenAIAgentsInstrumentor"}],"quickstart":{"code":"import os\nfrom agents import Agent, Runner, function_tool\nfrom opentelemetry import trace\nfrom opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter\nfrom opentelemetry.instrumentation.openai_agents import OpenAIAgentsInstrumentor\nfrom opentelemetry.sdk.trace import TracerProvider\nfrom opentelemetry.sdk.trace.export import BatchSpanProcessor\n\ndef configure_otel() -> None:\n    # Configure an OpenTelemetry TracerProvider with an OTLP exporter\n    provider = TracerProvider()\n    provider.add_span_processor(BatchSpanProcessor(OTLPSpanExporter()))\n    trace.set_tracer_provider(provider)\n\n    # Instrument OpenAI Agents\n    OpenAIAgentsInstrumentor().instrument(tracer_provider=provider)\n\n@function_tool\ndef get_weather(city: str) -> str:\n    \"\"\"Provides the weather for a given city.\"\"\"\n    return f\"The forecast for {city} is sunny with pleasant temperatures.\"\n\nif __name__ == \"__main__\":\n    # Ensure an OTLP collector is running, e.g., with Docker:\n    # docker run -d -p 4317:4317 -p 4318:4318 otel/opentelemetry-collector-contrib\n    configure_otel()\n\n    # A real run needs a valid OpenAI API key set via the OPENAI_API_KEY\n    # environment variable; the placeholder below only lets the example start.\n    os.environ.setdefault('OPENAI_API_KEY', 'sk-dummy-key-for-example')\n\n    # Example OpenAI Agent usage\n    assistant = Agent(\n        name=\"Travel Concierge\",\n        instructions=\"You are a concise travel concierge.\",\n        tools=[get_weather],\n    )\n\n    print(\"Running agent...\")\n    result = Runner.run_sync(assistant, \"I'm visiting Barcelona this weekend. How should I pack?\")\n    print(f\"Agent final output: {result.final_output}\")\n    print(\"Traces should now be visible in your configured OpenTelemetry backend.\")\n","lang":"python","description":"This quickstart configures the OpenTelemetry Python SDK with an OTLP exporter and then enables instrumentation for OpenAI Agents. After running it, traces generated by the agent's operations are sent to the configured OpenTelemetry collector; make sure a collector is running and accessible."},"warnings":[{"fix":"Monitor release notes for changes related to GenAI semantic conventions. Consider setting the `OTEL_SEMCONV_STABILITY_OPT_IN=gen_ai_latest_experimental` environment variable to opt into the latest experimental conventions, or ensure your telemetry backend tolerates changing attribute names.","message":"The OpenTelemetry Generative AI semantic conventions are under active development and may change. Minor-version updates (e.g., 0.53.x to 0.58.x) often include migrations and adjustments to these conventions, which can alter span attributes or names.","severity":"breaking","affected_versions":"All versions prior to a stable GenAI semantic convention release (currently experimental)"},{"fix":"To disable content capture, set the environment variable `TRACELOOP_TRACE_CONTENT=false` or `OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT=no_content` (or the equivalent `OTEL_INSTRUMENTATION_OPENAI_AGENTS_CAPTURE_CONTENT` with `ContentCaptureMode.NO_CONTENT`). Other modes such as `span_only`, `event_only`, and `span_and_event` are available for granular control.","message":"By default, this instrumentation captures message content (prompts, completions, tool arguments) in span attributes, which can include sensitive user data. This may conflict with privacy requirements or significantly increase trace size.","severity":"gotcha","affected_versions":"All versions"},{"fix":"Ensure you are using the correct instrumentation package for the OpenAI library or SDK you are calling. For `openai-agents` workflows, use this package. For direct `openai` client calls, use `opentelemetry-instrumentation-openai-v2` (the official OpenTelemetry project package) or `opentelemetry-instrumentation-openai` (the Traceloop/OpenLLMetry community package), as appropriate.","message":"This package (`opentelemetry-instrumentation-openai-agents`) specifically instruments the `openai-agents` SDK. Separate instrumentations exist for the general `openai` client (e.g., `opentelemetry-instrumentation-openai` or `opentelemetry-instrumentation-openai-v2`). Using the wrong instrumentation for your OpenAI integration will result in no telemetry.","severity":"gotcha","affected_versions":"All versions"}],"env_vars":null,"last_verified":"2026-04-11T00:00:00.000Z","next_check":"2026-07-10T00:00:00.000Z"}