{"id":8286,"library":"livekit-plugins-anthropic","title":"LiveKit Anthropic Plugin","description":"livekit-plugins-anthropic is an Agent Framework plugin for integrating Anthropic's Claude family of Large Language Models (LLMs) with LiveKit Agents. It enables developers to use Claude APIs as an LLM provider for building real-time voice AI agents, supporting both text-based conversations and vision input capabilities. The current version is 1.5.4, with releases tied to the livekit-agents framework's active development and rapid feature additions.","status":"active","version":"1.5.4","language":"en","source_language":"en","source_url":"https://github.com/livekit/agents","tags":["ai","audio","livekit","realtime","video","voice","webrtc","llm","anthropic","agent","claude"],"install":[{"cmd":"pip install livekit-plugins-anthropic","lang":"bash","label":"Install latest version"}],"dependencies":[{"reason":"Core framework for building LiveKit agents; this is a plugin for it.","package":"livekit-agents","optional":false},{"reason":"Official Anthropic Python client library, required for API interaction.","package":"anthropic","optional":false}],"imports":[{"symbol":"LLM","correct":"from livekit.plugins import anthropic\nllm_instance = anthropic.LLM(...)"}],"quickstart":{"code":"import os\nfrom livekit.agents import AgentSession, JobContext\nfrom livekit.plugins import anthropic\n\n# Set your Anthropic API key as an environment variable\n# os.environ[\"ANTHROPIC_API_KEY\"] = \"sk-your-anthropic-key\"\n\nasync def my_agent(ctx: JobContext):\n    # Ensure ANTHROPIC_API_KEY is set in your environment or passed directly\n    anthropic_api_key = os.environ.get('ANTHROPIC_API_KEY', '')\n    if not anthropic_api_key:\n        raise ValueError(\"ANTHROPIC_API_KEY environment variable is not set.\")\n\n    print(\"Starting agent session with Anthropic LLM...\")\n    session = AgentSession(\n        llm=anthropic.LLM(\n            model=\"claude-3-5-sonnet-20241022\",\n            
api_key=anthropic_api_key  # Can be omitted if the env var is set\n        )\n    )\n\n    # In a real agent, you'd process audio/text input and generate replies.\n    # This is a minimal example to show LLM instantiation.\n    print(\"Agent session created with the Anthropic Claude LLM.\")\n    # Example of a simple chat interaction (requires a started session for full functionality):\n    # from livekit.agents.llm import ChatContext\n    # chat_ctx = ChatContext()\n    # chat_ctx.add_message(role=\"user\", content=\"Hello, what can you do?\")\n    # async with session.llm.chat(chat_ctx=chat_ctx) as stream:\n    #     async for chunk in stream:\n    #         if chunk.delta and chunk.delta.content:\n    #             print(chunk.delta.content, end=\"\")\n\n    await session.aclose()\n    print(\"Agent session closed.\")\n\n# To run this, register the entrypoint with a LiveKit agent worker:\n# if __name__ == \"__main__\":\n#     from livekit.agents import WorkerOptions, cli\n#     cli.run_app(WorkerOptions(entrypoint_fnc=my_agent))\n","lang":"python","description":"This quickstart demonstrates how to initialize the Anthropic LLM within a LiveKit AgentSession. It highlights the essential `livekit.plugins.anthropic.LLM` class and the requirement for the `ANTHROPIC_API_KEY` environment variable. A full LiveKit agent typically processes real-time audio/text, but this snippet focuses on the Anthropic LLM setup."},"warnings":[{"fix":"To disable preemptive generation, initialize AgentSession with `AgentSession(preemptive_generation=False)`.","message":"Starting with livekit-agents 1.5.0, preemptive generation is enabled by default. 
This changes when LLM and TTS inference begins relative to the end of a user's turn, potentially altering latency characteristics.","severity":"breaking","affected_versions":">=1.5.0"},{"fix":"Upgrade to livekit-plugins-anthropic version 1.4.5 or newer to ensure correct handling of Claude 4.6+ models.","message":"Older versions of livekit-plugins-anthropic (before 1.4.5) might experience issues with 'trailing assistant turns' when using Claude 4.6+ models, potentially leading to incorrect responses.","severity":"gotcha","affected_versions":"<1.4.5"},{"fix":"When enabling extended thinking, set `max_tokens` strictly greater than `budget_tokens` in your LLM configuration. When using prompt caching, refer to Anthropic's documentation for minimum cacheable block sizes.","message":"When using Anthropic's extended thinking, the `max_tokens` parameter must be strictly greater than the thinking `budget_tokens`, or the request is rejected as a configuration error. Separately, when using prompt caching, very short system prompts or tool lists may fall below Anthropic's minimum cacheable block size and will not be cached.","severity":"gotcha","affected_versions":"All"},{"fix":"Carefully validate tool outputs from the Anthropic LLM. Monitor GitHub issues for updates on `strict` tool schema support in livekit-plugins-anthropic. If strict validation is critical, implement manual validation or consider alternative tool invocation strategies.","message":"The Anthropic plugin may not generate tool schemas with strict mode constraints, even though Anthropic's API supports a `strict` field on tool definitions for guaranteed schema conformance. 
This could lead to the model returning incompatible types or missing required fields.","severity":"gotcha","affected_versions":"<=1.5.4 (per a March 2026 GitHub issue)"}],"env_vars":null,"last_verified":"2026-04-16T00:00:00.000Z","next_check":"2026-07-15T00:00:00.000Z","problems":[{"fix":"Set the `ANTHROPIC_API_KEY` environment variable (e.g., `export ANTHROPIC_API_KEY='sk-your-key'`) or pass it directly when initializing the LLM: `anthropic.LLM(api_key='sk-your-key', ...)`.","cause":"The Anthropic API key has not been provided, either through the `api_key` parameter in `anthropic.LLM` or by setting the `ANTHROPIC_API_KEY` environment variable.","error":"ValueError: Anthropic API key is required, either as argument or set ANTHROPIC_API_KEY environmental variable"},{"fix":"Review Anthropic's API rate limit documentation for your specific usage tier. Implement retry logic with exponential backoff in your agent, or consider upgrading your Anthropic plan if consistently higher throughput is needed.","cause":"Your Anthropic API usage has exceeded the rate limits for your account's usage tier, measured in requests per minute, tokens per minute, or tokens per day.","error":"anthropic.APIStatusError: Your account has hit a rate limit. (HTTP status: 429)"},{"fix":"Upgrade to `livekit-plugins-anthropic` 1.5.x or newer; this issue was addressed in recent releases.","cause":"This was a known bug in earlier versions of livekit-plugins-anthropic (specifically observed before version 1.5.x) when used with `voice_assistant` agents and function calls.","error":"The LLM repeats the exact same stream text output after triggering a function call."}]}