OpenTelemetry Vertex AI Instrumentation

0.58.0 · active · verified Thu Apr 09

This library provides OpenTelemetry instrumentation for Google Cloud's Vertex AI, allowing developers to trace prompts, completions, and embeddings from Vertex AI models. It integrates with the official Vertex AI Python client library. The current version is 0.58.0, with frequent releases, often multiple times a month, reflecting active development.

Install
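
The instrumentation is published on PyPI under the name used in the quickstart below. A typical install also pulls in the OpenTelemetry SDK and the Vertex AI SDK, which the quickstart assumes are present:

```shell
# Install the Vertex AI instrumentation package
pip install opentelemetry-instrumentation-vertexai

# The quickstart additionally relies on the OpenTelemetry SDK
# and the official Vertex AI Python client (google-cloud-aiplatform)
pip install opentelemetry-sdk google-cloud-aiplatform
```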

Quickstart

This quickstart demonstrates how to set up a basic OpenTelemetry SDK with a console exporter and enable the `opentelemetry-instrumentation-vertexai` library. It then makes a call to a Vertex AI GenerativeModel. Traces capturing the LLM interaction will be printed to the console. Ensure your Google Cloud authentication is configured (e.g., via `gcloud auth application-default login` or `GOOGLE_APPLICATION_CREDENTIALS`).

import os
import vertexai
from vertexai.generative_models import GenerativeModel
from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor
from opentelemetry.instrumentation.vertexai import VertexAIInstrumentor

# Configure OpenTelemetry SDK
resource = Resource.create({"service.name": "vertexai-app-example"})
tracer_provider = TracerProvider(resource=resource)
tracer_provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(tracer_provider)

# Instrument Vertex AI
VertexAIInstrumentor().instrument()

# Initialize Vertex AI (ensure GOOGLE_APPLICATION_CREDENTIALS or gcloud auth is set)
# For a real application, consider explicit project/location via env vars or arguments
project_id = os.environ.get('GCP_PROJECT_ID', 'your-gcp-project-id')
location = os.environ.get('GCP_LOCATION', 'us-central1')
vertexai.init(project=project_id, location=location)

# Use Vertex AI GenerativeModel
model_name = "gemini-1.5-flash"
model = GenerativeModel(model_name)

# Note: GenerativeModel does not expose a public model-name attribute,
# so keep the name in a local variable for logging.
print(f"\n--- Invoking Vertex AI model ({model_name}) ---\n")
response = model.generate_content("Explain the importance of OpenTelemetry in AI observability.")

print("Response from model:")
for part in response.candidates[0].content.parts:
    print(part.text)
print("\n--- Traces should be visible in console output ---\n")
