{"id":2637,"library":"opentelemetry-instrumentation-mcp","title":"OpenTelemetry MCP Instrumentation (OpenLLMetry)","description":"The `opentelemetry-instrumentation-mcp` package provides automatic OpenTelemetry instrumentation for a wide array of Large Language Models (LLMs), vector databases, and other AI frameworks. As part of the Traceloop `openllmetry` project, it acts as a meta-package, consolidating over 30 individual instrumentations for popular libraries such as OpenAI, LangChain, Anthropic, and Pinecone into a single installation. It is currently at version 0.58.0 and follows a rapid release cadence with frequent updates.","status":"active","version":"0.58.0","language":"en","source_language":"en","source_url":"https://github.com/traceloop/openllmetry/tree/main/packages/opentelemetry-instrumentation-mcp","tags":["opentelemetry","instrumentation","ai","llm","observability","tracing","llmops","traceloop","openllmetry"],"install":[{"cmd":"pip install opentelemetry-instrumentation-mcp traceloop-sdk","lang":"bash","label":"Install the Traceloop SDK and the MCP meta-instrumentation"},{"cmd":"pip install openai","lang":"bash","label":"Example: install an LLM client (e.g., OpenAI) to be instrumented"}],"dependencies":[{"reason":"Required to initialize and enable the instrumentations provided by this meta-package.","package":"traceloop-sdk"},{"reason":"Core OpenTelemetry API, typically a transitive dependency of traceloop-sdk.","package":"opentelemetry-api","optional":true},{"reason":"Core OpenTelemetry SDK, typically a transitive dependency of traceloop-sdk.","package":"opentelemetry-sdk","optional":true},{"reason":"An example of a library that this package instruments; many other LLM/AI framework clients are instrumented automatically once installed.","package":"openai","optional":true}],"imports":[{"note":"The `opentelemetry-instrumentation-mcp` package is a meta-package that installs numerous individual AI/LLM instrumentations. Calling `Traceloop.init()` from the Traceloop SDK enables all installed OpenLLMetry instrumentations.","symbol":"Traceloop","correct":"from traceloop.sdk import Traceloop"}],"quickstart":{"code":"import os\nfrom traceloop.sdk import Traceloop\nfrom openai import OpenAI\n\n# Initialize the Traceloop SDK to enable all installed instrumentations\nTraceloop.init()\n\n# Ensure OPENAI_API_KEY is set in your environment\nopenai_api_key = os.environ.get(\"OPENAI_API_KEY\", \"\")\nif not openai_api_key:\n    print(\"Warning: OPENAI_API_KEY not set. Skipping OpenAI call.\")\nelse:\n    print(\"OpenAI API key found. Making a sample call...\")\n    client = OpenAI(api_key=openai_api_key)\n    try:\n        completion = client.chat.completions.create(\n            model=\"gpt-3.5-turbo\",\n            messages=[{\"role\": \"user\", \"content\": \"What is OpenTelemetry?\"}],\n        )\n        print(f\"OpenAI response: {completion.choices[0].message.content[:50]}...\")\n        print(\"Traces for this call should appear in your configured OpenTelemetry backend.\")\n    except Exception as e:\n        print(f\"Error during OpenAI call: {e}\")","lang":"python","description":"This quickstart initializes the Traceloop SDK with `Traceloop.init()` after installing the `opentelemetry-instrumentation-mcp` meta-package, then makes a sample OpenAI call that is traced automatically. Ensure `OPENAI_API_KEY` is set as an environment variable and an OpenTelemetry exporter is configured (e.g., via `OTEL_EXPORTER_OTLP_ENDPOINT`)."},"warnings":[{"fix":"Regularly update `opentelemetry-instrumentation-mcp` and `traceloop-sdk` to the latest versions, and adjust downstream monitoring queries or dashboards to reflect new semantic-convention attribute names as they evolve.","message":"Frequent updates to the OpenTelemetry GenAI Semantic Conventions may change span and attribute names; this is common in a rapidly evolving ecosystem.","severity":"breaking","affected_versions":"Versions prior to 0.58.0; the v0.55.0 and v0.58.0 release notes specifically mention GenAI semconv updates. Refer to the release notes for exact attribute changes."},{"fix":"If you only need instrumentation for a specific LLM or AI framework (e.g., OpenAI), consider installing only `opentelemetry-instrumentation-openai` (plus `traceloop-sdk`) instead of the `mcp` meta-package to keep your dependency footprint smaller.","message":"This package (`opentelemetry-instrumentation-mcp`) is a meta-package: installing it pulls in a large number of individual `opentelemetry-instrumentation-*` packages, which can produce a large dependency tree and version conflicts with other libraries in your project.","severity":"gotcha","affected_versions":"All versions."},{"fix":"Ensure your project uses the `pinecone` package (typically >=3.0.0) instead of `pinecone-client` if you want Pinecone calls to be instrumented, and update your `requirements.txt` accordingly.","message":"Instrumentation for Pinecone switched from the deprecated `pinecone-client` package to the `pinecone` package.","severity":"breaking","affected_versions":"Prior to 0.53.0; versions 0.53.0 and later use the new `pinecone` package."}],"env_vars":null,"last_verified":"2026-04-10T00:00:00.000Z","next_check":"2026-07-09T00:00:00.000Z"}