OpenTelemetry Bedrock Instrumentation
The `opentelemetry-instrumentation-bedrock` library provides automatic instrumentation for AWS Bedrock API calls made using Boto3. It captures telemetry data such as prompts, completions, and embeddings, enabling observability into Large Language Model (LLM) applications within the OpenTelemetry ecosystem. This library is part of the broader OpenLLMetry project and is currently at version 0.58.0, with a frequent release cadence.
Warnings
- breaking The OpenTelemetry Generative AI (GenAI) semantic conventions have undergone significant changes, particularly in attribute naming and structure. Recent versions of `opentelemetry-instrumentation-bedrock` (e.g., 0.53.4, 0.54.0, 0.55.0, 0.57.0) reflect these updates to conform to the latest OTel GenAI semantic conventions (e.g., 0.5.0).
- gotcha By default, this instrumentation logs prompts, completions, and embeddings to span attributes, which can contain sensitive user data. This is done to provide rich observability, but may pose privacy concerns.
- gotcha Ensure your `boto3` and `botocore` library versions are compatible with the specific AWS Bedrock features you intend to use. Older versions might not support newer Bedrock APIs (e.g., `botocore >= 1.34.116` is required for the `converse` API).
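If capturing prompt and completion content is a privacy concern, OpenLLMetry instrumentations generally honor the `TRACELOOP_TRACE_CONTENT` environment variable. The sketch below assumes this package follows that convention (verify against your installed version); the variable must be set before the instrumented calls are made.

```python
import os

# Assumption: like other OpenLLMetry instrumentations, this package checks
# TRACELOOP_TRACE_CONTENT; "false" suppresses recording prompts, completions,
# and embeddings as span attributes while keeping the rest of the telemetry.
os.environ["TRACELOOP_TRACE_CONTENT"] = "false"
```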
Install
pip install opentelemetry-instrumentation-bedrock boto3
Imports
- BedrockInstrumentor
from opentelemetry.instrumentation.bedrock import BedrockInstrumentor
Quickstart
import json
import os

import boto3
from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor
from opentelemetry.instrumentation.bedrock import BedrockInstrumentor

# Configure the OpenTelemetry SDK
resource = Resource.create({"service.name": "bedrock-app"})
tracer_provider = TracerProvider(resource=resource)
tracer_provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(tracer_provider)

# Instrument Bedrock; all subsequent boto3 bedrock-runtime calls are traced
BedrockInstrumentor().instrument()

# Ensure AWS credentials are available (e.g., via environment variables or AWS CLI
# config): AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_REGION.
aws_region = os.environ.get("AWS_REGION", "us-east-1")

def invoke_bedrock_model():
    try:
        bedrock_runtime = boto3.client(
            service_name="bedrock-runtime",
            region_name=aws_region,
        )
        model_id = "amazon.titan-text-express-v1"
        body = {
            "inputText": "Explain the importance of OpenTelemetry.",
            "textGenerationConfig": {
                "maxTokenCount": 50,
                "stopSequences": [],
                "temperature": 0.7,
                "topP": 0.9,
            },
        }
        response = bedrock_runtime.invoke_model(
            body=json.dumps(body),
            modelId=model_id,
            accept="application/json",
            contentType="application/json",
        )
        response_body = json.loads(response["body"].read())
        print(f"Bedrock Model Response: {response_body['results'][0]['outputText'].strip()}")
    except Exception as e:
        print(f"Error invoking Bedrock model: {e}")

invoke_bedrock_model()
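The warning above notes that the `converse` API requires `botocore >= 1.34.116`. As a hedged sketch of how an instrumented Converse call would look, the helper below (`build_converse_request` is illustrative, not part of any library) constructs the request shape boto3's `converse` method expects; the call itself is shown commented out because it requires AWS credentials.

```python
def build_converse_request(model_id, user_text, max_tokens=50):
    """Build keyword arguments for a bedrock-runtime client's converse() call."""
    return {
        "modelId": model_id,
        # Converse messages carry a role plus a list of content blocks
        "messages": [{"role": "user", "content": [{"text": user_text}]}],
        "inferenceConfig": {"maxTokens": max_tokens, "temperature": 0.7},
    }

request = build_converse_request("amazon.titan-text-express-v1", "Explain OpenTelemetry.")
# With BedrockInstrumentor active, this call is traced like invoke_model:
# response = bedrock_runtime.converse(**request)
```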