{"id":2159,"library":"opentelemetry-instrumentation-langchain","title":"OpenTelemetry Langchain Instrumentation","description":"This library provides OpenTelemetry instrumentation for applications built with LangChain, enabling comprehensive tracing of LLM interactions and associated components. It is part of the OpenLLMetry project, which extends OpenTelemetry for enhanced LLM observability. The project maintains a rapid release cadence, with frequent updates often driven by advancements in OpenTelemetry's Generative AI semantic conventions.","status":"active","version":"0.58.0","language":"en","source_language":"en","source_url":"https://github.com/traceloop/openllmetry/tree/main/packages/opentelemetry-instrumentation-langchain","tags":["opentelemetry","langchain","llm","observability","tracing","ai"],"install":[{"cmd":"pip install opentelemetry-instrumentation-langchain","lang":"bash","label":"Install package"}],"dependencies":[{"reason":"Core dependency for LangChain applications, which this package instruments.","package":"langchain-core"},{"reason":"OpenTelemetry API for basic tracing constructs.","package":"opentelemetry-api"},{"reason":"OpenTelemetry SDK for configuring tracing providers, processors, and exporters.","package":"opentelemetry-sdk"},{"reason":"Defines standard attribute names and values for OpenTelemetry spans, especially for Generative AI.","package":"opentelemetry-semantic-conventions"},{"reason":"Used for function wrapping and instrumentation.","package":"wrapt"}],"imports":[{"symbol":"LangchainInstrumentor","correct":"from opentelemetry.instrumentation.langchain import LangchainInstrumentor"}],"quickstart":{"code":"import os\nfrom opentelemetry import trace\nfrom opentelemetry.sdk.resources import Resource\nfrom opentelemetry.sdk.trace import TracerProvider\nfrom opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor\nfrom opentelemetry.instrumentation.langchain import LangchainInstrumentor\n\nfrom langchain_openai import ChatOpenAI\nfrom langchain_core.prompts import ChatPromptTemplate\nfrom langchain_core.output_parsers import StrOutputParser\n\n# 1. Configure the OpenTelemetry SDK (required for any OTel instrumentation)\n# Set up a basic console exporter for demonstration\nresource = Resource.create({\"service.name\": \"my-langchain-app\"})\ntracer_provider = TracerProvider(resource=resource)\ntracer_provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))\ntrace.set_tracer_provider(tracer_provider)\n\n# 2. Instrument LangChain\nLangchainInstrumentor().instrument()\n\n# Optional: Disable sensitive content logging if needed (e.g., for production)\n# os.environ['TRACELOOP_TRACE_CONTENT'] = 'false'\n\n# 3. Use LangChain as usual\nos.environ['OPENAI_API_KEY'] = os.environ.get('OPENAI_API_KEY', 'sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx')  # Replace with an actual key or secure handling\n\nllm = ChatOpenAI(temperature=0)\nprompt = ChatPromptTemplate.from_messages([\n    (\"system\", \"You are a helpful AI assistant.\"),\n    (\"user\", \"Tell me a short story about {topic}.\")\n])\noutput_parser = StrOutputParser()\n\nchain = prompt | llm | output_parser\n\nif __name__ == \"__main__\":\n    print(\"Running LangChain application...\")\n    response = chain.invoke({\"topic\": \"a cat detective\"})\n    print(\"\\nLangChain Response:\")\n    print(response)\n    print(\"\\nCheck console output for OpenTelemetry traces.\")","lang":"python","description":"This quickstart demonstrates how to instrument a basic LangChain application using `opentelemetry-instrumentation-langchain`. It includes the necessary OpenTelemetry SDK setup with a `ConsoleSpanExporter` that prints traces to the console, followed by the `LangchainInstrumentor().instrument()` call. A simple LangChain Expression Language (LCEL) chain is then invoked to generate a story, and its execution is automatically traced by OpenTelemetry."},"warnings":[{"fix":"Review the release notes and the official OpenTelemetry Generative AI semantic conventions documentation when upgrading. You may need to update dashboards, queries, and custom processors that rely on specific span attributes. Setting `OTEL_SEMCONV_STABILITY_OPT_IN=latest` can ease migration during transitions.","message":"Frequent updates to the OpenTelemetry Generative AI semantic conventions can introduce breaking changes in span attribute names and trace structures. Recent versions (e.g., v0.53.0 and v0.55.0 onwards) specifically align with new semantic conventions.","severity":"breaking","affected_versions":">=0.53.0"},{"fix":"To disable content logging for privacy reasons, set the environment variable `TRACELOOP_TRACE_CONTENT` to `false`, e.g., `os.environ['TRACELOOP_TRACE_CONTENT'] = 'false'`.","message":"By default, this instrumentation logs sensitive data such as prompts, completions, and embeddings directly to span attributes. This provides visibility but may expose private user data.","severity":"gotcha","affected_versions":"All versions"},{"fix":"Configure the OpenTelemetry SDK before instrumenting. This involves setting up a `TracerProvider`, adding a `SpanProcessor` (e.g., `BatchSpanProcessor` for production), and configuring an appropriate `SpanExporter` (e.g., `OTLPSpanExporter` to send to a collector).","message":"The `LangchainInstrumentor().instrument()` call only enables the instrumentation. For traces to be collected and exported, a complete OpenTelemetry SDK setup (TracerProvider, SpanProcessor, SpanExporter) is required.","severity":"gotcha","affected_versions":"All versions"}],"env_vars":null,"last_verified":"2026-04-09T00:00:00.000Z","next_check":"2026-07-08T00:00:00.000Z"}