OpenInference Bedrock Instrumentation

0.1.34 · active · verified Thu Apr 16

The `openinference-instrumentation-bedrock` library provides automatic instrumentation for the AWS Bedrock client (`boto3`), enabling OpenTelemetry-compliant observability for applications that use Bedrock foundation models. It captures detailed traces of LLM invocations, which can be sent to any OpenTelemetry-compatible backend, such as Arize AI's Phoenix platform. The library is actively maintained by Arize AI, with frequent releases across its instrumentation packages. [1, 3, 4, 11, 17]

Install

pip install openinference-instrumentation-bedrock boto3

Imports

from openinference.instrumentation.bedrock import BedrockInstrumentor

Quickstart

This quickstart demonstrates how to set up `openinference-instrumentation-bedrock` to automatically trace calls to AWS Bedrock. It configures a basic OpenTelemetry `TracerProvider` with a `ConsoleSpanExporter` to print traces to the console, instruments the `boto3` Bedrock client, and then makes a sample `invoke_model` call. Ensure your AWS credentials (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_REGION_NAME) are set as environment variables or configured in your AWS setup for `boto3` to work correctly. [1, 3, 4, 13]

import os
import boto3
from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import SimpleSpanProcessor, ConsoleSpanExporter
from openinference.instrumentation.bedrock import BedrockInstrumentor

# 1. Configure OpenTelemetry Tracer Provider
resource = Resource.create({"service.name": "my-bedrock-app"})
tracer_provider = TracerProvider(resource=resource)
tracer_provider.add_span_processor(
    SimpleSpanProcessor(ConsoleSpanExporter())
)
trace.set_tracer_provider(tracer_provider)

# 2. Instrument the Bedrock client
BedrockInstrumentor().instrument()

# 3. Create a boto3 client (must be created after instrumentation)
# boto3 resolves credentials through its default chain: environment
# variables (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY), shared config
# files, or an attached IAM role. Avoid hard-coding placeholder keys.
bedrock_runtime = boto3.client(
    service_name="bedrock-runtime",
    region_name=os.environ.get("AWS_REGION_NAME", "us-east-1"),
)

# 4. Invoke a Bedrock model
model_id = "anthropic.claude-instant-v1"
content_type = "application/json"
accept_type = "application/json"

# The request body must be a JSON string; the legacy Claude completion
# API also requires the "\n\nHuman: ... \n\nAssistant:" prompt framing.
import json

body = {
    "prompt": "\n\nHuman: What is the capital of France?\n\nAssistant:",
    "max_tokens_to_sample": 100,
    "temperature": 0.5,
}

try:
    response = bedrock_runtime.invoke_model(
        body=json.dumps(body),  # JSON-encode the body; str(dict) is not valid JSON
        modelId=model_id,
        contentType=content_type,
        accept=accept_type,
    )
    response_body = json.loads(response["body"].read())
    print(f"Model Response: {response_body}")
except Exception as e:
    print(f"Error invoking model: {e}")
    print("Ensure your AWS credentials are configured and Bedrock model access is granted.")

# Spans will be printed to console by ConsoleSpanExporter
