{"id":2151,"library":"opentelemetry-instrumentation-anthropic","title":"OpenTelemetry Anthropic Instrumentation","description":"This library provides OpenTelemetry instrumentation for the Anthropic Python client library, enabling automatic tracing of Anthropic API calls. It captures prompts, completions, and other relevant metadata as spans, conforming to OpenTelemetry's Generative AI semantic conventions. The library is actively maintained with frequent releases.","status":"active","version":"0.58.0","language":"en","source_language":"en","source_url":"https://github.com/traceloop/openllmetry/tree/main/packages/opentelemetry-instrumentation-anthropic","tags":["opentelemetry","observability","tracing","anthropic","llm","ai","generative-ai"],"install":[{"cmd":"pip install opentelemetry-instrumentation-anthropic anthropic opentelemetry-sdk opentelemetry-exporter-otlp","lang":"bash","label":"Install with core OTel SDK and OTLP exporter"}],"dependencies":[{"reason":"Required for interacting with the Anthropic API.","package":"anthropic","optional":false},{"reason":"Core OpenTelemetry API for defining telemetry.","package":"opentelemetry-api","optional":false},{"reason":"OpenTelemetry SDK for trace provider and span processors.","package":"opentelemetry-sdk","optional":false},{"reason":"Provides Generative AI semantic conventions for LLM tracing.","package":"opentelemetry-semantic-conventions-ai","optional":false}],"imports":[{"symbol":"AnthropicInstrumentor","correct":"from opentelemetry.instrumentation.anthropic import AnthropicInstrumentor"}],"quickstart":{"code":"import os\nfrom anthropic import Anthropic\nfrom opentelemetry.instrumentation.anthropic import AnthropicInstrumentor\nfrom opentelemetry import trace\nfrom opentelemetry.sdk.resources import Resource\nfrom opentelemetry.sdk.trace import TracerProvider\nfrom opentelemetry.sdk.trace.export import BatchSpanProcessor\nfrom opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter\n\n# --- OpenTelemetry Setup ---\n# 1. Configure the OpenTelemetry TracerProvider\nresource = Resource.create({\"service.name\": \"anthropic-llm-app\"})\nprovider = TracerProvider(resource=resource)\ntrace.set_tracer_provider(provider)\n\n# 2. Configure an OTLP exporter to send traces (e.g., to an OpenTelemetry Collector or a service like SigNoz/Arize)\n# The default OTLP HTTP endpoint is http://localhost:4318/v1/traces\notlp_exporter = OTLPSpanExporter()\nspan_processor = BatchSpanProcessor(otlp_exporter)\nprovider.add_span_processor(span_processor)\n\n# 3. Instrument the Anthropic library\nAnthropicInstrumentor().instrument()\n\n# --- Anthropic API Call ---\nANTHROPIC_API_KEY = os.environ.get(\"ANTHROPIC_API_KEY\")\n\nif not ANTHROPIC_API_KEY:\n    print(\"WARNING: ANTHROPIC_API_KEY is not set; skipping the API call.\")\nelse:\n    try:\n        client = Anthropic(api_key=ANTHROPIC_API_KEY)\n\n        print(\"Making an Anthropic API call...\")\n        response = client.messages.create(\n            model=\"claude-3-opus-20240229\",  # Or another suitable model\n            max_tokens=100,\n            messages=[\n                {\"role\": \"user\", \"content\": \"Explain the concept of quantum entanglement in a sentence.\"}\n            ],\n        )\n\n        print(\"Anthropic API call successful.\")\n        print(f\"Response: {response.content[0].text[:50]}...\")\n\n    except Exception as e:\n        print(f\"Error during Anthropic API call: {e}\")\n\n# Ensure all spans are exported before the application exits\nprovider.force_flush()","lang":"python","description":"This quickstart demonstrates how to set up OpenTelemetry with the Anthropic instrumentation, configure an OTLP HTTP exporter, and make a traced Anthropic API call. Ensure `ANTHROPIC_API_KEY` is set in your environment. An OpenTelemetry Collector or compatible backend should be running to receive traces."},"warnings":[{"fix":"Review the OpenTelemetry GenAI Semantic Conventions documentation for the updated attribute naming and structure. Your observability backend might require updates to dashboards or alerts.","message":"Breaking changes were introduced in version 0.54.0 to conform to the OpenTelemetry Generative AI Semantic Conventions (GenAI SemConv). This may alter span attribute names and structure.","severity":"breaking","affected_versions":">=0.54.0"},{"fix":"To disable logging of sensitive content, set the environment variable `TRACELOOP_TRACE_CONTENT` to `false`.","message":"By default, this instrumentation logs the full content of prompts, completions, and embeddings to span attributes. This data may contain sensitive information.","severity":"gotcha","affected_versions":"All versions"},{"fix":"This is a known upstream issue (traceloop/openllmetry#3949). Monitor the project's GitHub for a fix. Consider alternative methods for calculating token usage, or use non-streaming calls if accurate token counts are critical for images.","message":"When using `client.messages.stream()` with base64-encoded images, particularly with Langfuse, input token counts might be inflated in traces.","severity":"gotcha","affected_versions":"Reported in 0.57.0 (likely affects similar versions)"},{"fix":"Always initialize a `TracerProvider`, add at least one `SpanProcessor` (e.g., `BatchSpanProcessor`), and configure an appropriate `SpanExporter` (e.g., `OTLPSpanExporter`) before calling `instrument()` on the Anthropic instrumentor.","message":"Instrumentation itself only creates spans. An OpenTelemetry SDK (TracerProvider, SpanProcessor, Exporter) must be explicitly configured to process and send telemetry data to a backend.","severity":"gotcha","affected_versions":"All versions"}],"env_vars":null,"last_verified":"2026-04-09T00:00:00.000Z","next_check":"2026-07-08T00:00:00.000Z"}