OpenTelemetry Anthropic Instrumentation
This library provides OpenTelemetry instrumentation for the Anthropic Python client library, enabling automatic tracing of Anthropic API calls. It captures prompts, completions, and other relevant metadata as spans, conforming to OpenTelemetry's Generative AI semantic conventions. The library is actively maintained with frequent releases.
Warnings
- breaking Breaking changes were introduced in version 0.54.0 to conform to the OpenTelemetry Generative AI Semantic Conventions (GenAI SemConv). This may alter span attribute names and structure.
- gotcha By default, this instrumentation logs the full content of prompts, completions, and embeddings to span attributes. This data may contain sensitive information.
- gotcha When using `client.messages.stream()` with base64 encoded images, particularly with Langfuse, input token counts might be inflated in traces.
- gotcha Instrumentation itself only creates spans. An OpenTelemetry SDK (TracerProvider, SpanProcessor, Exporter) must be explicitly configured to process and send telemetry data to a backend.
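If the content-logging behavior above is a concern, instrumentations in the OpenLLMetry family are commonly configured to suppress prompt/completion capture via an environment variable. The variable name `TRACELOOP_TRACE_CONTENT` below is an assumption based on that convention; verify it against your installed version before relying on it:

```python
import os

# Assumption: OpenLLMetry-based instrumentations check this variable at
# span-creation time and, when "false", omit prompt/completion content
# from span attributes. Set it before calling
# AnthropicInstrumentor().instrument() (and before any API calls).
os.environ["TRACELOOP_TRACE_CONTENT"] = "false"
```

Setting this in the deployment environment (rather than in code) keeps the redaction decision out of application logic.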
Install
- pip install opentelemetry-instrumentation-anthropic anthropic opentelemetry-sdk opentelemetry-exporter-otlp
Imports
- AnthropicInstrumentor
from opentelemetry.instrumentation.anthropic import AnthropicInstrumentor
Quickstart
import os
from anthropic import Anthropic
from opentelemetry.instrumentation.anthropic import AnthropicInstrumentor
from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
# --- OpenTelemetry Setup ---
# 1. Configure the OpenTelemetry TracerProvider
resource = Resource.create({"service.name": "anthropic-llm-app"})
provider = TracerProvider(resource=resource)
trace.set_tracer_provider(provider)
# 2. Configure an OTLP exporter to send traces (e.g., to an OTLP collector or a service like SigNoz/Arize)
# Default OTLP HTTP endpoint is http://localhost:4318/v1/traces
otlp_exporter = OTLPSpanExporter()
span_processor = BatchSpanProcessor(otlp_exporter)
provider.add_span_processor(span_processor)
# 3. Instrument the Anthropic library
AnthropicInstrumentor().instrument()
# --- Anthropic API Call ---
ANTHROPIC_API_KEY = os.environ.get("ANTHROPIC_API_KEY", "YOUR_ANTHROPIC_API_KEY")
if ANTHROPIC_API_KEY == "YOUR_ANTHROPIC_API_KEY":
    print("WARNING: ANTHROPIC_API_KEY not set or placeholder. API calls will fail without a valid key.")
else:
    try:
        client = Anthropic(api_key=ANTHROPIC_API_KEY)
        print("Making an Anthropic API call...")
        response = client.messages.create(
            model="claude-3-opus-20240229",  # Or another suitable model
            max_tokens=100,
            messages=[
                {"role": "user", "content": "Explain the concept of quantum entanglement in a sentence."}
            ],
        )
        print("Anthropic API call successful.")
        print(f"Response: {response.content[0].text[:50]}...")
    except Exception as e:
        print(f"Error during Anthropic API call: {e}")

# Ensure all spans are exported before the application exits
provider.force_flush()