Anthropic LLM Integration for LlamaIndex
The `llama-index-llms-anthropic` package integrates Anthropic's Claude models into the LlamaIndex framework. Anthropic is an AI research company focused on developing advanced language models with an emphasis on safety and alignment, notably the Claude series. This integration lets LlamaIndex applications use Anthropic's models for various LLM operations. The current package version is `0.11.2`; like the rest of LlamaIndex, it follows a rapid release cadence with frequent updates.
Warnings
- breaking LlamaIndex Core v0.10/v0.11 introduced significant breaking changes. `ServiceContext` was completely removed in favor of `Settings`, and `LLMPredictor` was deprecated. Code written for older LlamaIndex versions (pre-0.10) relying on these abstractions will break.
- gotcha Ensure your `ANTHROPIC_API_KEY` is correctly configured. The `Anthropic` class primarily looks for the `ANTHROPIC_API_KEY` environment variable. If it's not set or invalid, API calls will fail with authentication errors.
- gotcha For accurate token counting with Anthropic models (especially newer Claude 3 models), it is crucial to explicitly set the LlamaIndex global tokenizer to the Anthropic tokenizer. The default LlamaIndex tokenizer (often `tiktoken`) will lead to incorrect token counts and can cause context overflow errors or unexpected behavior.
- gotcha Dependency conflicts, especially with the underlying `anthropic` Python client library, can arise when combining `llama-index-llms-anthropic` with other `llama-index` integrations (e.g., `llama-index-multi-modal-llms-anthropic`). Older versions of certain integrations might have strict, incompatible `anthropic` client version requirements.
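The API-key gotcha above can be caught at startup rather than deep inside a query. A minimal sketch using only the standard library; `require_anthropic_key` is a hypothetical helper name, not part of the package:

```python
import os

def require_anthropic_key() -> str:
    """Return the Anthropic API key, failing fast with a clear error if unset."""
    key = os.environ.get("ANTHROPIC_API_KEY", "").strip()
    if not key:
        raise RuntimeError(
            "ANTHROPIC_API_KEY is not set; export it before constructing Anthropic()."
        )
    return key
```

Calling this before constructing the `Anthropic` LLM surfaces a clear configuration error instead of an authentication failure mid-request.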
Install
pip install llama-index-llms-anthropic llama-index
Imports
- Anthropic
from llama_index.llms.anthropic import Anthropic
- Settings
from llama_index.core import Settings
- ChatMessage
from llama_index.core.llms import ChatMessage
Quickstart
import os
from llama_index.llms.anthropic import Anthropic
from llama_index.core import Settings
# Provide your API key here, or set ANTHROPIC_API_KEY in your shell beforehand
os.environ.setdefault("ANTHROPIC_API_KEY", "YOUR_ANTHROPIC_API_KEY")
# Initialize the Anthropic LLM
llm = Anthropic(model="claude-3-opus-20240229")
# Set the tokenizer for accurate token counting (important for Anthropic models)
Settings.tokenizer = llm.tokenizer
# Make a completion call
resp = llm.complete("What is the capital of France?")
print(resp)