{"id":7470,"library":"openinference-instrumentation-agno","title":"OpenInference Agno Instrumentation","description":"OpenInference Agno Instrumentation is a Python auto-instrumentation library designed to trace Agno Agents. It is fully OpenTelemetry-compatible, enabling users to send detailed traces of their AI applications to OpenTelemetry collectors such as Arize Phoenix, Langfuse, or LangSmith. The current version is 0.1.30, with a frequent release cadence tied to the broader OpenInference project.","status":"active","version":"0.1.30","language":"en","source_language":"en","source_url":"https://github.com/Arize-ai/openinference/tree/main/python/instrumentation/openinference-instrumentation-agno","tags":["AI observability","OpenTelemetry","Agno","LLM tracing","agent framework","instrumentation"],"install":[{"cmd":"pip install openinference-instrumentation-agno agno opentelemetry-sdk opentelemetry-exporter-otlp","lang":"bash","label":"Install core library and OpenTelemetry SDK"}],"dependencies":[{"reason":"The instrumentation targets the Agno agent framework.","package":"agno","optional":false},{"reason":"Provides the core OpenTelemetry SDK components for tracing.","package":"opentelemetry-sdk","optional":false},{"reason":"Enables exporting traces to an OTLP-compatible backend (e.g., Phoenix, Langfuse, LangSmith).","package":"opentelemetry-exporter-otlp","optional":false},{"reason":"Core utilities and helpers for OpenInference instrumentations.","package":"openinference-instrumentation","optional":false},{"reason":"Defines semantic conventions for AI/LLM tracing within OpenInference.","package":"openinference-semantic-conventions","optional":false},{"reason":"Used for Python function wrapping and instrumentation.","package":"wrapt","optional":false},{"reason":"Provides backports of features from future Python typing versions.","package":"typing-extensions","optional":false}],"imports":[{"symbol":"AgnoInstrumentor","correct":"from openinference.instrumentation.agno import AgnoInstrumentor"}],"quickstart":{"code":"import os\nimport asyncio\nfrom agno.agent import Agent\nfrom agno.models.openai import OpenAIChat\nfrom agno.tools.duckduckgo import DuckDuckGoTools\nfrom openinference.instrumentation.agno import AgnoInstrumentor\nfrom opentelemetry import trace as trace_api\nfrom opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter\nfrom opentelemetry.sdk.trace import TracerProvider\nfrom opentelemetry.sdk.trace.export import SimpleSpanProcessor\n\n# Configure OpenTelemetry to export traces (e.g., to Arize Phoenix or Langfuse)\n# For local Phoenix, run 'phoenix serve' in another terminal.\n# For Langfuse/LangSmith, set the corresponding environment variables, e.g. LANGFUSE_PUBLIC_KEY, LANGFUSE_SECRET_KEY, LANGSMITH_API_KEY.\n# Example for a local OTLP endpoint (like Phoenix):\notlp_endpoint = os.environ.get('OTEL_EXPORTER_OTLP_ENDPOINT', 'http://127.0.0.1:6006/v1/traces')\n\n# Configure the tracer provider\ntracer_provider = TracerProvider()\ntracer_provider.add_span_processor(SimpleSpanProcessor(OTLPSpanExporter(endpoint=otlp_endpoint)))\ntrace_api.set_tracer_provider(tracer_provider=tracer_provider)\n\n# Instrument Agno\nAgnoInstrumentor().instrument()\n\nasync def main():\n    # Create and configure an Agno agent\n    agent = Agent(\n        model=OpenAIChat(id=os.environ.get('OPENAI_MODEL_ID', 'gpt-4o-mini')),\n        tools=[DuckDuckGoTools()],\n        markdown=True,\n        debug_mode=True,\n    )\n\n    # Use the agent (arun is the async counterpart of run)\n    print(\"Agent is running...\")\n    response = await agent.arun(\"What is the capital of France?\")\n    print(f\"Agent response: {response.content}\")\n\nif __name__ == \"__main__\":\n    # Set a dummy OpenAI API key if not already set, for agent initialization\n    if not os.environ.get('OPENAI_API_KEY'):\n        os.environ['OPENAI_API_KEY'] = 'sk-DUMMY_KEY_FOR_TESTING'\n\n    asyncio.run(main())\n    print(f\"Traces should be sent to {otlp_endpoint}\")","lang":"python","description":"This quickstart demonstrates how to instrument an Agno agent using `openinference-instrumentation-agno` and export traces via OpenTelemetry. It sets up a `TracerProvider` with an `OTLPSpanExporter` to send traces to a specified endpoint (e.g., a local Arize Phoenix instance). The `AgnoInstrumentor` is then initialized and used to automatically trace the agent's operations. Ensure `agno`, an LLM provider (like `openai`), and OpenTelemetry exporters are installed, and that the relevant API keys/endpoints are configured via environment variables."},"warnings":[{"fix":"Set `store_history_messages=True` in your Agno agent initialization: `agent = Agent(..., store_history_messages=True)`.","message":"Agno (the underlying framework) version 2.0 introduced a breaking change: `store_history_messages` in `Agent` now defaults to `False`. If your tracing relies on agent history being stored, you must explicitly set `store_history_messages=True` when initializing your Agno agents to ensure full context is available in traces.","severity":"breaking","affected_versions":"Agno >= 2.0"},{"fix":"Upgrade to `openinference-instrumentation-agno` version `0.1.30` or newer to ensure correct `token_count` emission. Always verify that the expected metrics are present in your observability backend.","message":"Older versions of `openinference-instrumentation-agno` sometimes failed to emit `token_count` metrics for LLM calls, which can hinder cost analysis and performance monitoring in observability platforms.","severity":"gotcha","affected_versions":"<0.1.30"},{"fix":"Upgrade to `openinference-instrumentation-agno` version `0.1.30` or newer, which includes a fix for this bug related to context and streaming output handling. Verify that your streaming traces contain complete output data.","message":"When using `arun_stream()` with an async generator, the `output.value` attribute might be dropped from the finished span if the consumer closes the generator immediately after receiving the final `RunOutput`. This could lead to incomplete trace data for streaming interactions.","severity":"gotcha","affected_versions":"<0.1.30"},{"fix":"Refer to the OpenTelemetry Python documentation and your specific backend's (e.g., Arize Phoenix, Langfuse, LangSmith) setup guides for the correct `OTEL_EXPORTER_OTLP_ENDPOINT` and `OTEL_EXPORTER_OTLP_HEADERS` environment variables or programmatic configuration.","message":"Proper OpenTelemetry setup is crucial. Incorrectly configuring the `TracerProvider`, `SpanProcessor`, or `OTLPSpanExporter` (especially the endpoint and authentication headers) will result in traces not being sent to your observability backend.","severity":"gotcha","affected_versions":"All"}],"env_vars":null,"last_verified":"2026-04-16T00:00:00.000Z","next_check":"2026-07-15T00:00:00.000Z","problems":[{"fix":"Ensure the package is correctly installed: `pip install openinference-instrumentation-agno`. Double-check the import statement for typos.","cause":"The `openinference-instrumentation-agno` package is not installed, or there's a typo in the import statement.","error":"from openinference.instrumentation.agno import AgnoInstrumentor\nImportError: cannot import name 'AgnoInstrumentor' from 'openinference.instrumentation.agno'"},{"fix":"Verify that the `OTEL_EXPORTER_OTLP_ENDPOINT` and `OTEL_EXPORTER_OTLP_HEADERS` environment variables are set correctly for your observability backend. Ensure `tracer_provider.add_span_processor(SimpleSpanProcessor(OTLPSpanExporter(...)))` is called on the same provider that is passed to `trace_api.set_tracer_provider()`.","cause":"OpenTelemetry traces are not being exported. This is usually due to an incorrect OTLP endpoint, missing API keys/headers, or the span processor not being correctly added to the `TracerProvider`.","error":"opentelemetry.sdk.trace.export.NoExportedSpanLimitsError: No spans were exported. Check your exporter configuration and ensure spans are being generated."},{"fix":"Upgrade `openinference-instrumentation-agno` to version `0.1.30` or newer. This version includes a fix for context detachment issues in streaming scenarios.","cause":"This specific bug, addressed in version `0.1.30`, occurred in earlier versions when `AgnoInstrumentor` was used with streaming and async streaming in Agno Teams, leading to OpenTelemetry context management issues.","error":"Error: Context detach error when using OpenTelemetry instrumentation with streaming in Team"}]}