OpenInference Semantic Conventions
OpenInference Semantic Conventions defines a standardized set of attribute and event names for capturing telemetry from AI/ML and LLM applications, building on OpenTelemetry. It provides a common vocabulary for observability across different AI frameworks. The current version is 0.1.28, and the package is part of an actively developed monorepo with frequent releases across its instrumentations.
Warnings
- breaking As a 0.x release, the OpenInference semantic conventions may undergo non-backward compatible changes. Attribute names or their expected values could change between minor versions.
- gotcha This library solely defines constants for semantic conventions. It must be used in conjunction with an OpenTelemetry SDK (e.g., `opentelemetry-sdk`) to actually capture and export traces. Installing this package alone will not emit any telemetry.
- gotcha It's crucial to identify which specific attribute constant applies to your data (e.g., `SpanAttributes.LLM_PROVIDER` for the hosting provider vs. `SpanAttributes.LLM_SYSTEM` for the model family). Using a similar-but-wrong attribute can lead to incorrect interpretation by observability platforms.
Install
-
pip install openinference-semantic-conventions
Imports
- SpanAttributes
from openinference.semconv.trace import SpanAttributes
- ResourceAttributes
from openinference.semconv.resource import ResourceAttributes
- ToolCallAttributes
from openinference.semconv.trace import ToolCallAttributes
Quickstart
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor
from openinference.semconv.trace import (
    OpenInferenceSpanKindValues,
    SpanAttributes,
    ToolCallAttributes,
)

# Configure an OpenTelemetry tracer that prints spans to the console (for demonstration)
provider = TracerProvider()
provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)
tracer = trace.get_tracer(__name__)

# Example: tracing an LLM call with OpenInference Semantic Conventions
with tracer.start_as_current_span("my-llm-generation") as span:
    span.set_attribute(
        SpanAttributes.OPENINFERENCE_SPAN_KIND, OpenInferenceSpanKindValues.LLM.value
    )
    span.set_attribute(SpanAttributes.LLM_MODEL_NAME, "gpt-4")
    span.set_attribute(SpanAttributes.LLM_PROVIDER, "openai")
    span.set_attribute(SpanAttributes.INPUT_VALUE, "What is the capital of France?")
    span.set_attribute(SpanAttributes.OUTPUT_VALUE, "Paris.")
    span.set_attribute(SpanAttributes.LLM_TOKEN_COUNT_TOTAL, 5)
    span.add_event(
        "tool_call",
        attributes={
            ToolCallAttributes.TOOL_CALL_FUNCTION_NAME: "search_engine",
            ToolCallAttributes.TOOL_CALL_FUNCTION_ARGUMENTS_JSON: '{"query": "capital of France"}',
        },
    )
print("OpenInference-compliant trace emitted to the console.")