OpenTelemetry Bedrock Instrumentation

0.58.0 · active · verified Thu Apr 09

The `opentelemetry-instrumentation-bedrock` library provides automatic instrumentation for AWS Bedrock API calls made using Boto3. It captures telemetry data such as prompts, completions, and embeddings, enabling observability into Large Language Model (LLM) applications within the OpenTelemetry ecosystem. This library is part of the broader OpenLLMetry project and is currently at version 0.58.0, with a frequent release cadence.

Warnings

Install

pip install opentelemetry-instrumentation-bedrock

Imports

from opentelemetry.instrumentation.bedrock import BedrockInstrumentor

Quickstart

This quickstart demonstrates how to set up the OpenTelemetry SDK with a console exporter, instrument the Bedrock client, and then invoke a Bedrock model. The instrumentation automatically creates spans for the Bedrock interaction, which are then printed to the console by the `ConsoleSpanExporter`.

import json
import os

import boto3
from opentelemetry import trace
from opentelemetry.instrumentation.bedrock import BedrockInstrumentor
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor

# Configure the OpenTelemetry SDK with a console exporter
resource = Resource.create({"service.name": "bedrock-app"})
tracer_provider = TracerProvider(resource=resource)
tracer_provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(tracer_provider)

# Instrument Bedrock; subsequent boto3 bedrock-runtime calls are traced automatically
BedrockInstrumentor().instrument()

# AWS credentials must be available, e.g., via the AWS_ACCESS_KEY_ID,
# AWS_SECRET_ACCESS_KEY, and AWS_REGION environment variables or AWS CLI config.
aws_region = os.environ.get('AWS_REGION', 'us-east-1')

def invoke_bedrock_model():
    try:
        bedrock_runtime = boto3.client(
            service_name='bedrock-runtime',
            region_name=aws_region
        )

        model_id = 'amazon.titan-text-express-v1'
        body = {
            'inputText': 'Explain the importance of OpenTelemetry.',
            'textGenerationConfig': {
                'maxTokenCount': 50,
                'stopSequences': [],
                'temperature': 0.7,
                'topP': 0.9
            }
        }

        response = bedrock_runtime.invoke_model(
            body=json.dumps(body),
            modelId=model_id,
            accept='application/json',
            contentType='application/json'
        )

        response_body = json.loads(response['body'].read())
        print(f"Bedrock Model Response: {response_body['results'][0]['outputText'].strip()}")
    except Exception as e:
        print(f"Error invoking Bedrock model: {e}")

invoke_bedrock_model()
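The parsing step above assumes the response shape of the Titan text model: a top-level `results` list whose items carry an `outputText` field. A minimal illustration with a mocked payload (the field values below are hypothetical, not captured from a real Bedrock call):

```python
import json

# Hypothetical Titan-style response body; values are made up, but the
# structure matches what the quickstart's parsing expects.
mock_body = json.dumps({
    "inputTextTokenCount": 8,
    "results": [
        {
            "tokenCount": 42,
            "outputText": " OpenTelemetry standardizes traces, metrics, and logs.",
            "completionReason": "FINISH",
        }
    ],
})

# Same extraction the quickstart performs on the real response stream
response_body = json.loads(mock_body)
print(response_body["results"][0]["outputText"].strip())
# → OpenTelemetry standardizes traces, metrics, and logs.
```

Other model families on Bedrock (e.g., Anthropic or Cohere models) return differently shaped bodies, so the extraction logic is model-specific.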

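The quickstart prints spans to the console, which is useful for local verification only. For production you would typically swap `SimpleSpanProcessor`/`ConsoleSpanExporter` for a batch processor and an OTLP exporter pointed at your collector. A minimal configuration sketch, assuming the `opentelemetry-exporter-otlp` package is installed and a collector listens on the placeholder endpoint below:

```python
from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
# Requires the opentelemetry-exporter-otlp package
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter

resource = Resource.create({"service.name": "bedrock-app"})
tracer_provider = TracerProvider(resource=resource)
# BatchSpanProcessor queues spans and exports them off the request path,
# unlike SimpleSpanProcessor, which exports synchronously per span.
tracer_provider.add_span_processor(
    BatchSpanProcessor(
        # Endpoint is a placeholder; point it at your OTLP collector.
        OTLPSpanExporter(endpoint="http://localhost:4317", insecure=True)
    )
)
trace.set_tracer_provider(tracer_provider)
```

The rest of the quickstart (instrumenting and invoking Bedrock) is unchanged; only the exporter pipeline differs.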