OpenTelemetry Instrumentation for Google GenAI
This OpenTelemetry instrumentation provides automatic tracing and metrics for applications using Google's Generative AI client library (`google-generativeai`). It captures details like model calls, prompts, and responses, aligning with OpenTelemetry's GenAI semantic conventions. As part of the `opentelemetry-python-contrib` project, it is currently in beta (`0.7b0`) and subject to rapid development and potential API changes.
Warnings
- breaking This instrumentation is currently in beta (`0.7b0`) as part of `opentelemetry-python-contrib`. This means its API, behavior, and emitted attributes are subject to change without strict adherence to semantic versioning until a stable `1.0.0` release.
- gotcha Prior to version `0.7b0`, there was a known bug where token counts for streaming `generateContent` methods might be inaccurate. If relying on token metrics for cost or rate limiting, ensure you are on `0.7b0` or later.
- gotcha Custom attributes cannot be added to `generate_content {model.name}` spans or `gen_ai.client.inference.operation.details` log events via the OpenTelemetry Context API prior to version `0.6b0`. Attempting to do so on older versions will have no effect.
- gotcha The instrumentation relies on the `google-generativeai` library. Ensure this library is installed alongside the OpenTelemetry instrumentation and properly configured with an API key, as the instrumentation itself does not handle API key management or client initialization.
Install
pip install opentelemetry-instrumentation-google-genai
Imports
- GoogleGenAIInstrumentor
from opentelemetry.instrumentation.google_genai import GoogleGenAIInstrumentor
Quickstart
import os
import google.generativeai as genai
from opentelemetry import trace
from opentelemetry.instrumentation.google_genai import GoogleGenAIInstrumentor
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor
# 1. Configure OpenTelemetry SDK
resource = Resource.create(attributes={
    "service.name": "my-genai-app",
    "application.name": "google-gemini-example",
})
provider = TracerProvider(resource=resource)
# For local testing, use ConsoleSpanExporter to print traces to console
provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)
# 2. Enable Google GenAI instrumentation
GoogleGenAIInstrumentor().instrument()
# 3. Configure Google Generative AI
# Ensure GEMINI_API_KEY environment variable is set or pass directly
api_key = os.environ.get('GEMINI_API_KEY', 'YOUR_API_KEY_HERE')
if not api_key or api_key == 'YOUR_API_KEY_HERE':
    print("Warning: GEMINI_API_KEY environment variable not set. API calls might fail.")
genai.configure(api_key=api_key)
# 4. Use Google GenAI - this will now be instrumented
print("\n--- Generating Content ---")
tracer = trace.get_tracer(__name__)
with tracer.start_as_current_span("generate-story"):  # Custom span to wrap GenAI call
    model = genai.GenerativeModel('gemini-pro')
    try:
        response = model.generate_content("Tell me a short, imaginative story about a cat who learns to fly.")
        print("Generated story:\n" + response.text)
    except Exception as e:
        print(f"Error generating content: {e}")
print("\n--- Traces (if configured) should appear above/below ---")