{"id":9153,"library":"openinference-instrumentation-bedrock","title":"OpenInference Bedrock Instrumentation","description":"The `openinference-instrumentation-bedrock` library provides automatic instrumentation for the AWS Bedrock client (`boto3`), enabling OpenTelemetry-compliant observability for applications that use Bedrock foundation models. It captures detailed traces of LLM invocations, which can be exported to any OpenTelemetry-compatible backend, such as Arize AI's Phoenix platform. The library is actively maintained by Arize AI and receives frequent updates.","status":"active","version":"0.1.34","language":"en","source_language":"en","source_url":"https://github.com/Arize-ai/openinference/tree/main/python/instrumentation/openinference-instrumentation-bedrock","tags":["observability","opentelemetry","bedrock","aws","llm","instrumentation","tracing","arize"],"install":[{"cmd":"pip install openinference-instrumentation-bedrock boto3","lang":"bash","label":"Install library and AWS Boto3"}],"dependencies":[{"reason":"Required for interacting with AWS Bedrock services.","package":"boto3","optional":false},{"reason":"Version >=1.34.116 is required for certain Bedrock APIs such as `converse`.","package":"botocore","optional":false},{"reason":"Core OpenTelemetry API for tracing (installed as a dependency).","package":"opentelemetry-api","optional":false},{"reason":"OpenTelemetry SDK for trace management (installed as a dependency).","package":"opentelemetry-sdk","optional":false}],"imports":[{"symbol":"BedrockInstrumentor","correct":"from openinference.instrumentation.bedrock import BedrockInstrumentor"}],"quickstart":{"code":"import os\nimport json\nimport boto3\nfrom opentelemetry import trace\nfrom opentelemetry.sdk.resources import Resource\nfrom opentelemetry.sdk.trace import TracerProvider\nfrom opentelemetry.sdk.trace.export import SimpleSpanProcessor, ConsoleSpanExporter\nfrom openinference.instrumentation.bedrock import BedrockInstrumentor\n\n# 1. Configure the OpenTelemetry tracer provider\nresource = Resource.create({\"service.name\": \"my-bedrock-app\"})\ntracer_provider = TracerProvider(resource=resource)\ntracer_provider.add_span_processor(\n    SimpleSpanProcessor(ConsoleSpanExporter())\n)\ntrace.set_tracer_provider(tracer_provider)\n\n# 2. Instrument the Bedrock client\nBedrockInstrumentor().instrument()\n\n# 3. Create a boto3 client (must happen AFTER instrumentation)\n# Ensure AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and AWS_REGION_NAME are set\n# in the environment, or configure boto3 credentials separately.\nbedrock_runtime = boto3.client(\n    service_name='bedrock-runtime',\n    region_name=os.environ.get('AWS_REGION_NAME', 'us-east-1'),\n    aws_access_key_id=os.environ.get('AWS_ACCESS_KEY_ID', 'YOUR_AWS_ACCESS_KEY_ID'),\n    aws_secret_access_key=os.environ.get('AWS_SECRET_ACCESS_KEY', 'YOUR_AWS_SECRET_ACCESS_KEY')\n)\n\n# 4. Invoke a Bedrock model\nmodel_id = \"anthropic.claude-instant-v1\"\ncontent_type = \"application/json\"\naccept_type = \"application/json\"\n\n# Anthropic's legacy text-completions format requires the prompt to start\n# with \"\\n\\nHuman:\" and end with \"\\n\\nAssistant:\".\nbody = {\n    \"prompt\": \"\\n\\nHuman: What is the capital of France?\\n\\nAssistant:\",\n    \"max_tokens_to_sample\": 100,\n    \"temperature\": 0.5,\n}\n\ntry:\n    # The request body must be a JSON string, not a Python dict repr\n    response = bedrock_runtime.invoke_model(\n        body=json.dumps(body),\n        modelId=model_id,\n        contentType=content_type,\n        accept=accept_type\n    )\n    response_body = response['body'].read().decode('utf-8')\n    print(f\"Model Response: {response_body}\")\nexcept Exception as e:\n    print(f\"Error invoking model: {e}\")\n    print(\"Please ensure your AWS credentials are configured and Bedrock access is granted.\")\n\n# Spans are printed to the console by ConsoleSpanExporter","lang":"python","description":"This quickstart shows how to set up `openinference-instrumentation-bedrock` to automatically trace calls to AWS Bedrock. It configures a basic OpenTelemetry `TracerProvider` with a `ConsoleSpanExporter` to print traces to the console, instruments the `boto3` Bedrock client, and then makes a sample `invoke_model` call. Ensure your AWS credentials (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_REGION_NAME) are set as environment variables or otherwise configured so that `boto3` can authenticate."},"warnings":[{"fix":"Use the synchronous `boto3` client for Bedrock interactions if complete LLM metadata tracing is critical, or manually instrument `aioboto3` calls by setting OpenInference semantic attributes directly on custom OpenTelemetry spans.","message":"Tracing of LLM-specific metadata (prompts, responses, token usage) is not fully supported for asynchronous Bedrock calls made with `aioboto3`. Spans may still be generated, but they often lack rich LLM attributes.","severity":"gotcha","affected_versions":"All versions up to 0.1.34"},{"fix":"Prefer the `boto3` Bedrock `converse` API over `invoke_model` for Meta models when comprehensive tracing is required. Ensure the `botocore` version is >= 1.34.116 for `converse` API support.","message":"When using Meta models (e.g., Llama 3) on Amazon Bedrock, outputs may not be traced via the `invoke_model` API. Use the `converse` API for these models to ensure full tracing.","severity":"gotcha","affected_versions":"All versions up to 0.1.34"},{"fix":"Call `BedrockInstrumentor().instrument()` early in your application's lifecycle, preferably immediately after configuring your OpenTelemetry `TracerProvider` and before initializing any `boto3` Bedrock clients.","message":"`BedrockInstrumentor().instrument()` must be called *before* any `boto3.client('bedrock-runtime')` instances are created. Clients initialized prior to instrumentation will not be traced.","severity":"gotcha","affected_versions":"All versions up to 0.1.34"}],"env_vars":null,"last_verified":"2026-04-16T00:00:00.000Z","next_check":"2026-07-15T00:00:00.000Z","problems":[{"fix":"Switch to synchronous `boto3` client calls, or manually create OpenTelemetry spans around `aioboto3` calls and explicitly set OpenInference semantic attributes such as `input.value`, `output.value`, and `llm.model_name`.","cause":"Using `aioboto3` (the async client) for Bedrock interactions. The instrumentation monkey-patches `boto3` internals and does not affect `aioboto3` clients.","error":"Spans created but LLM attributes missing (e.g., prompts, responses, model info, token usage) for Bedrock calls."},{"fix":"Call `BedrockInstrumentor().instrument()` before creating any `boto3.client('bedrock-runtime')` instances. Also ensure `opentelemetry.trace.set_tracer_provider(your_provider)` is called with a correctly configured `TracerProvider` and `SpanProcessor`.","cause":"`BedrockInstrumentor().instrument()` was called *after* the `boto3` Bedrock client was initialized, or the OpenTelemetry `TracerProvider` was not properly configured or set globally.","error":"No traces are generated for Bedrock `invoke_model` or `converse` calls."},{"fix":"Ensure your `TracerProvider` includes a `SpanProcessor` that uses an `OTLPSpanExporter` (or another exporter appropriate for your backend) and that its configuration (e.g., `endpoint`, `headers`) is correct for your observability platform.","cause":"The OpenTelemetry `TracerProvider` is configured without an appropriate `SpanExporter` (e.g., `OTLPSpanExporter` for remote collectors), or the exporter is misconfigured (e.g., incorrect endpoint).","error":"Traces are generated but do not appear in the OpenTelemetry collector or observability platform (e.g., Phoenix, Grafana)."}]}