OpenInference Agno Instrumentation
OpenInference Agno Instrumentation is a Python auto-instrumentation library for tracing Agno agents. It is fully OpenTelemetry-compatible, enabling users to send detailed traces of their AI applications over OTLP to observability backends such as Arize Phoenix, Langfuse, or LangSmith. The current version is 0.1.30, with a frequent release cadence tied to the broader OpenInference project.
Common errors
- `from openinference.instrumentation.agno import AgnoInstrumentor` raises `ImportError: cannot import name 'AgnoInstrumentor' from 'openinference.instrumentation.agno'`
  cause: The `openinference-instrumentation-agno` package is not installed, or there is a typo in the import statement.
  fix: Ensure the package is correctly installed: `pip install openinference-instrumentation-agno`. Double-check the import statement for typos.
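If the ImportError persists, a quick stdlib-only check (a diagnostic sketch, not part of the library) confirms whether the module is actually visible on the current Python path:

```python
import importlib.util

def has_module(module_name: str) -> bool:
    """Return True if the module can be located on the current path."""
    try:
        return importlib.util.find_spec(module_name) is not None
    except ModuleNotFoundError:
        # A missing parent package also counts as "not installed".
        return False

# Check before importing AgnoInstrumentor:
if has_module("openinference.instrumentation.agno"):
    print("openinference-instrumentation-agno is importable")
else:
    print("Run: pip install openinference-instrumentation-agno")
```

Running this inside the same virtual environment as your application also catches the common case where the package was installed into a different interpreter.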
- `opentelemetry.sdk.trace.export.NoExportedSpanLimitsError: No spans were exported. Check your exporter configuration and ensure spans are being generated.`
  cause: OpenTelemetry traces are not being exported. This is usually due to an incorrect OTLP endpoint, missing API keys/headers, or the span processor not being added to the `TracerProvider`.
  fix: Verify that the `OTEL_EXPORTER_OTLP_ENDPOINT` and `OTEL_EXPORTER_OTLP_HEADERS` environment variables are set correctly for your observability backend. Ensure `tracer_provider.add_span_processor(SimpleSpanProcessor(OTLPSpanExporter(...)))` is called before any spans are created, and that the provider is registered via `trace_api.set_tracer_provider()`.
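A frequent cause of silent export failures is a malformed `OTEL_EXPORTER_OTLP_HEADERS` value, which the OTLP exporter specification defines as comma-separated `key=value` pairs. A small validation helper (a hypothetical sketch, not part of the SDK) can surface typos before the exporter quietly drops your credentials:

```python
def parse_otlp_headers(raw: str) -> dict:
    """Parse an OTEL_EXPORTER_OTLP_HEADERS-style string ("k1=v1,k2=v2") into a dict.

    Raises ValueError on entries missing the '=' separator.
    """
    headers = {}
    for entry in raw.split(","):
        entry = entry.strip()
        if not entry:
            continue  # tolerate trailing commas
        if "=" not in entry:
            raise ValueError(f"Malformed header entry (expected key=value): {entry!r}")
        key, _, value = entry.partition("=")
        headers[key.strip()] = value.strip()
    return headers

print(parse_otlp_headers("api_key=abc123,x-project=demo"))
# → {'api_key': 'abc123', 'x-project': 'demo'}
```

Checking `os.environ.get("OTEL_EXPORTER_OTLP_HEADERS", "")` with this helper at startup fails fast instead of producing an empty trace view in the backend.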
- `Error: Context detach error when using OpenTelemetry instrumentation with streaming in Team`
  cause: This bug, addressed in version `0.1.30`, occurred in earlier versions when `AgnoInstrumentor` was used with streaming and async streaming in Agno Teams, leading to OpenTelemetry context-management issues.
  fix: Upgrade `openinference-instrumentation-agno` to version `0.1.30` or newer, which includes a fix for context detachment in streaming scenarios.
Warnings
- breaking: Agno (the underlying framework) version 2.0 introduced a breaking change where `store_history_messages` in `Agent` now defaults to `False`. If your tracing relies on agent history being stored, you must explicitly set `store_history_messages=True` when initializing your Agno agents so the full context is available in traces.
- gotcha: Older versions of `openinference-instrumentation-agno` sometimes failed to emit `token_count` metrics for LLM calls, which hinders cost analysis and performance monitoring in observability platforms.
- gotcha: When using `arun_stream()` with an async generator, the `output.value` attribute may be dropped from the finished span if the consumer closes the generator immediately after receiving the final `RunOutput`, leaving trace data for streaming interactions incomplete.
- gotcha: Correct OpenTelemetry setup is crucial. A misconfigured `TracerProvider`, `SpanProcessor`, or `OTLPSpanExporter` (especially the endpoint and authentication headers) results in traces never reaching your observability backend.
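The `arun_stream()` gotcha above is an instance of a generic async-generator hazard: code after the final `yield` runs only if the consumer drains the generator to exhaustion; an early close raises `GeneratorExit` at the suspended `yield` instead, skipping it. A stand-in sketch with plain asyncio (no Agno APIs involved) shows the difference:

```python
import asyncio

recorded = []  # stands in for the span's output.value attribute

async def stream_chunks(label):
    chunks = ["Paris", " is", " the capital."]
    for chunk in chunks:
        yield chunk
    # Reached only when the consumer drains the generator past the last
    # yield; closing early raises GeneratorExit at the yield instead,
    # and this line never runs -- analogous to the dropped output.value.
    recorded.append((label, "".join(chunks)))

async def drain():
    # Consume to exhaustion: the post-loop bookkeeping runs normally.
    async for _ in stream_chunks("drained"):
        pass

async def close_early():
    count = 0
    # Stop immediately after the final chunk, mirroring a consumer that
    # closes the stream as soon as it sees the last item.
    async for _ in stream_chunks("closed-early"):
        count += 1
        if count == 3:
            break  # generator is closed without resuming past the yield

asyncio.run(drain())
asyncio.run(close_early())
print(recorded)  # only the fully drained run recorded its output
```

Iterating the stream to completion (rather than breaking out as soon as the final item arrives) avoids the incomplete-span symptom described above.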
Install
- pip install openinference-instrumentation-agno agno opentelemetry-sdk opentelemetry-exporter-otlp
Imports
- AgnoInstrumentor
from openinference.instrumentation.agno import AgnoInstrumentor
Quickstart
import os
import asyncio
from agno.agent import Agent
from agno.models.openai import OpenAIChat
from agno.tools.duckduckgo import DuckDuckGoTools
from openinference.instrumentation.agno import AgnoInstrumentor
from opentelemetry import trace as trace_api
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import SimpleSpanProcessor
# Configure OpenTelemetry to export traces (e.g., to Arize Phoenix or Langfuse)
# For local Phoenix, run 'phoenix serve' in another terminal.
# For Langfuse/LangSmith, set corresponding environment variables like LANGFUSE_PUBLIC_KEY, LANGFUSE_SECRET_KEY, LANGSMITH_API_KEY, etc.
# Example for a local OTLP endpoint (like Phoenix):
otlp_endpoint = os.environ.get('OTEL_EXPORTER_OTLP_ENDPOINT', 'http://127.0.0.1:6006/v1/traces')
# Configure the tracer provider
tracer_provider = TracerProvider()
tracer_provider.add_span_processor(SimpleSpanProcessor(OTLPSpanExporter(endpoint=otlp_endpoint)))
trace_api.set_tracer_provider(tracer_provider=tracer_provider)
# Instrument Agno
AgnoInstrumentor().instrument()
async def main():
    # Create and configure an Agno agent
    agent = Agent(
        model=OpenAIChat(id=os.environ.get('OPENAI_MODEL_ID', 'gpt-4o-mini')),
        tools=[DuckDuckGoTools()],
        markdown=True,
        debug_mode=True,
    )
    # Use the agent; arun() is the async entry point (run() is synchronous)
    print("Agent is running...")
    response = await agent.arun("What is the capital of France?")
    print(f"Agent response: {response.content}")

if __name__ == "__main__":
    # Set a dummy OpenAI API key if not already set, for agent initialization
    if not os.environ.get('OPENAI_API_KEY'):
        os.environ['OPENAI_API_KEY'] = 'sk-DUMMY_KEY_FOR_TESTING'
    asyncio.run(main())
    print(f"Traces should be sent to {otlp_endpoint}")