OpenInference Agno Instrumentation

0.1.30 · active · verified Thu Apr 16

OpenInference Agno Instrumentation is a Python auto-instrumentation library for tracing Agno agents. It is fully OpenTelemetry-compatible, so traces of your AI applications can be sent over OTLP to OpenTelemetry-compatible backends such as Arize Phoenix, Langfuse, or LangSmith. The current version is 0.1.30, with a frequent release cadence tied to the broader OpenInference project.

Quickstart

This quickstart demonstrates how to instrument an Agno agent using `openinference-instrumentation-agno` and export traces via OpenTelemetry. It sets up a `TracerProvider` with an `OTLPSpanExporter` to send traces to a specified endpoint (e.g., a local Arize Phoenix instance). The `AgnoInstrumentor` is then initialized and used to automatically trace the agent's operations. Ensure `agno`, an LLM provider (like `openai`), and OpenTelemetry exporters are installed, and relevant API keys/endpoints are configured via environment variables.

```python
import asyncio
import os

from agno.agent import Agent
from agno.models.openai import OpenAIChat
from agno.tools.duckduckgo import DuckDuckGoTools
from openinference.instrumentation.agno import AgnoInstrumentor
from opentelemetry import trace as trace_api
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import SimpleSpanProcessor

# Configure OpenTelemetry to export traces (e.g., to Arize Phoenix or Langfuse).
# For local Phoenix, run 'phoenix serve' in another terminal.
# For Langfuse/LangSmith, set the corresponding environment variables
# (LANGFUSE_PUBLIC_KEY, LANGFUSE_SECRET_KEY, LANGSMITH_API_KEY, etc.).
# Example for a local OTLP endpoint (like Phoenix):
otlp_endpoint = os.environ.get('OTEL_EXPORTER_OTLP_ENDPOINT', 'http://127.0.0.1:6006/v1/traces')

# Configure the tracer provider
tracer_provider = TracerProvider()
tracer_provider.add_span_processor(SimpleSpanProcessor(OTLPSpanExporter(endpoint=otlp_endpoint)))
trace_api.set_tracer_provider(tracer_provider=tracer_provider)

# Instrument Agno
AgnoInstrumentor().instrument()

async def main():
    # Create and configure an Agno agent
    agent = Agent(
        model=OpenAIChat(id=os.environ.get('OPENAI_MODEL_ID', 'gpt-4o-mini')),
        tools=[DuckDuckGoTools()],
        markdown=True,
        debug_mode=True,
    )

    # Use the agent; `arun` is the async counterpart of `Agent.run`.
    print("Agent is running...")
    response = await agent.arun("What is the capital of France?")
    print(f"Agent response: {response.content}")

if __name__ == "__main__":
    # Set a dummy OpenAI API key if not already set, so the agent can be
    # constructed (real model calls will still fail without a valid key).
    if not os.environ.get('OPENAI_API_KEY'):
        os.environ['OPENAI_API_KEY'] = 'sk-DUMMY_KEY_FOR_TESTING'

    asyncio.run(main())
    print(f"Traces should have been sent to {otlp_endpoint}")
```
