{"id":2646,"library":"opentelemetry-instrumentation-together","title":"OpenTelemetry Together AI Instrumentation","description":"This library provides OpenTelemetry instrumentation for applications interacting with Together AI's endpoints. It is part of the OpenLLMetry project, which extends OpenTelemetry with AI-related instrumentations to capture LLM-specific data like prompts, completions, and token usage. The library is actively maintained on a rapid release cadence; version 0.58.0 was released on 2026-04-09.","status":"active","version":"0.58.0","language":"en","source_language":"en","source_url":"https://github.com/traceloop/openllmetry/tree/main/packages/opentelemetry-instrumentation-together","tags":["opentelemetry","togetherai","ai","llm","instrumentation","tracing","observability","openllmetry"],"install":[{"cmd":"pip install opentelemetry-instrumentation-together","lang":"bash","label":"Install latest version"}],"dependencies":[{"reason":"Required Python version.","package":"python","version":">=3.10, <4","optional":false},{"reason":"Core OpenTelemetry API for instrumentation.","package":"opentelemetry-api","optional":false},{"reason":"Core OpenTelemetry SDK for trace processing and export.","package":"opentelemetry-sdk","optional":false},{"reason":"The official Together AI client library, which is instrumented.","package":"together","optional":false}],"imports":[{"note":"The primary class to instrument Together AI calls.","symbol":"TogetherAiInstrumentor","correct":"from opentelemetry.instrumentation.together import TogetherAiInstrumentor"}],"quickstart":{"code":"import os\nimport together\nfrom opentelemetry import trace\nfrom opentelemetry.sdk.trace import TracerProvider\nfrom opentelemetry.sdk.trace.export import ConsoleSpanExporter, BatchSpanProcessor\nfrom opentelemetry.instrumentation.together import TogetherAiInstrumentor\n\n# Configure OpenTelemetry SDK\nprovider = TracerProvider()\nprocessor = 
BatchSpanProcessor(ConsoleSpanExporter())\nprovider.add_span_processor(processor)\ntrace.set_tracer_provider(provider)\n\n# Set your Together AI API key (replace with your actual key or an environment variable)\n# For demonstration, we'll use a placeholder and mock the API call.\nos.environ.setdefault('TOGETHER_API_KEY', 'sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx')\n\n# --- Mock the Together AI completion endpoint for a runnable example ---\n# In a real application, skip this block and call the API directly; for\n# tests, prefer 'unittest.mock' or 'pytest-mock'. Patching the class method\n# BEFORE instrumenting lets the instrumentor wrap the mock, so spans are\n# still emitted.\nfrom together.resources.completions import Completions\n\nclass MockUsage:\n    prompt_tokens = 10\n    completion_tokens = 12\n    total_tokens = 22\n\nclass MockChoice:\n    def __init__(self, text=\"Mocked Together AI completion.\", logprobs=None, finish_reason=\"stop\"):\n        self.text = text\n        self.logprobs = logprobs\n        self.finish_reason = finish_reason\n\nclass MockCompletion:\n    def __init__(self, choices=None):\n        self.choices = choices or [MockChoice()]\n        self.model = \"togethercomputer/llama-2-7b-chat\"\n        self.usage = MockUsage()\n\noriginal_create = Completions.create\ndef mock_create(self, *args, **kwargs):\n    print(\"Mocking Completions.create...\")\n    # Simulate a delay for realism in tracing\n    import time\n    time.sleep(0.1)\n    return MockCompletion()\nCompletions.create = mock_create\n\n# Instrument Together AI (wraps the patched method)\nTogetherAiInstrumentor().instrument()\n\ntry:\n    print(\"Making a (mocked) Together AI call...\")\n    # Example Together AI call (this will be traced)\n    client = together.Together()\n    response = client.completions.create(\n        prompt=\"Tell me a short story about a brave knight.\",\n        model=\"togethercomputer/llama-2-7b-chat\",\n        max_tokens=50\n    )\n    print(f\"Response: {response.choices[0].text}\")\nfinally:\n    # Restore the original method after the example\n    Completions.create = original_create\n    # Ensure traces are flushed for the console exporter\n    provider.shutdown()\n\nprint(\"Check console for OpenTelemetry traces.\")","lang":"python","description":"This 
quickstart demonstrates how to set up OpenTelemetry to trace calls made to Together AI. It initializes a basic OpenTelemetry `TracerProvider` with a `ConsoleSpanExporter` that prints traces directly to the console. The `TogetherAiInstrumentor` is then called to automatically instrument the Together AI client library. A mocked Together AI completion call is included to make the example runnable without requiring an actual API key; in a real scenario, you would provide your `TOGETHER_API_KEY` and make genuine API requests."},"warnings":[{"fix":"Refer to the OpenTelemetry documentation on semantic convention stability for migration guidance. You may need to update your instrumentation library, adjust custom attributes, or use the `OTEL_SEMCONV_STABILITY_OPT_IN` environment variable (e.g., `OTEL_SEMCONV_STABILITY_OPT_IN=http/dup` for HTTP changes) during migration to emit both old and new conventions temporarily.","message":"Frequent updates to OpenTelemetry GenAI semantic conventions (e.g., in versions 0.53.0 to 0.58.0) may introduce breaking changes to span attribute names and structures. Ensure your observability backend and custom dashboards are compatible with the latest semantic conventions.","severity":"breaking","affected_versions":">=0.53.0"},{"fix":"To disable the logging of sensitive content, set the environment variable `TRACELOOP_TRACE_CONTENT` to `false`. Example: `export TRACELOOP_TRACE_CONTENT=false`.","message":"By default, this instrumentation logs sensitive data such as prompts, completions, and embeddings to span attributes. This aids debugging visibility but can pose a privacy risk and significantly increase trace size.","severity":"gotcha","affected_versions":"All versions"},{"fix":"Ensure `opentelemetry-sdk` is installed and that you have configured a `TracerProvider`, added at least one `SpanProcessor` (e.g., `BatchSpanProcessor`), and a `SpanExporter` (e.g., `OTLPSpanExporter`, `ConsoleSpanExporter`). 
Refer to OpenTelemetry's official Python SDK documentation for a complete setup guide.","message":"This library is an OpenTelemetry *instrumentation* and requires a full OpenTelemetry SDK setup (including a `TracerProvider`, `SpanProcessor`, and `SpanExporter`) to actually collect and export traces. Without proper SDK configuration, the instrumentation will run but produce no visible telemetry.","severity":"gotcha","affected_versions":"All versions"}],"env_vars":null,"last_verified":"2026-04-10T00:00:00.000Z","next_check":"2026-07-09T00:00:00.000Z"}