LiveKit Anthropic Plugin

1.5.4 · active · verified Thu Apr 16

livekit-plugins-anthropic is a plugin for the LiveKit Agents framework that integrates Anthropic's Claude family of large language models (LLMs). It lets developers use the Claude API as the LLM provider when building real-time voice AI agents, supporting both text-based conversations and vision input. The current version is 1.5.4; releases track the livekit-agents framework's active development and rapid feature additions.

Common errors

Warnings

Install
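The plugin is distributed on PyPI under the package name used throughout this page; a typical installation (assuming pip and a supported Python version) looks like:

```shell
pip install livekit-plugins-anthropic
```

This also pulls in the core livekit-agents framework as a dependency.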

Imports
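The plugin is exposed under the shared livekit.plugins namespace, which is the import form used in the quickstart below:

```python
from livekit.plugins import anthropic
```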

Quickstart

This quickstart demonstrates how to initialize the Anthropic LLM within a LiveKit AgentSession. It highlights the essential `livekit.plugins.anthropic.LLM` class and the requirement for the `ANTHROPIC_API_KEY` environment variable. A full LiveKit agent typically processes real-time audio/text, but this snippet focuses on the Anthropic LLM setup.

import os
from livekit.agents import AgentSession, JobContext
from livekit.plugins import anthropic

# Set your Anthropic API key as an environment variable
# os.environ["ANTHROPIC_API_KEY"] = "sk-your-anthropic-key"

async def my_agent(ctx: JobContext):
    # Ensure ANTHROPIC_API_KEY is set in your environment or passed directly
    anthropic_api_key = os.environ.get('ANTHROPIC_API_KEY', '')
    if not anthropic_api_key:
        raise ValueError("ANTHROPIC_API_KEY environment variable is not set.")

    print("Starting agent session with Anthropic LLM...")
    llm = anthropic.LLM(
        model="claude-3-5-sonnet-20241022",
        api_key=anthropic_api_key,  # can be omitted if ANTHROPIC_API_KEY is set
    )
    session = AgentSession(llm=llm)

    # In a real agent, you'd process audio/text input and generate replies.
    # This is a minimal example to show LLM instantiation.
    print("Agent session created with model: claude-3-5-sonnet-20241022")
    # Example of a direct chat call (the livekit-agents v1 ChatContext API):
    # from livekit.agents.llm import ChatContext
    # chat_ctx = ChatContext()
    # chat_ctx.add_message(role="user", content="Hello, what can you do?")
    # stream = session.llm.chat(chat_ctx=chat_ctx)
    # async for chunk in stream:
    #     if chunk.delta and chunk.delta.content:
    #         print(chunk.delta.content)

    await session.aclose()
    print("Agent session stopped.")

# To run this, integrate it with a LiveKit server via the Agents CLI, e.g.:
# if __name__ == "__main__":
#     from livekit.agents import cli, WorkerOptions
#     cli.run_app(WorkerOptions(entrypoint_fnc=my_agent))
