{"id":6705,"library":"llama-index-llms-anthropic","title":"Anthropic LLM Integration for LlamaIndex","description":"The `llama-index-llms-anthropic` package integrates Anthropic's Claude models into the LlamaIndex framework. Anthropic is an AI research company, best known for the Claude model series, that prioritizes safety and alignment. This integration lets LlamaIndex applications use Anthropic models for completion, chat, and other LLM operations. The current version is `0.11.2`; the package follows LlamaIndex's frequent release cadence, so expect regular updates.","status":"active","version":"0.11.2","language":"en","source_language":"en","source_url":"https://github.com/run-llama/llama_index/tree/main/llama-index-integrations/llms/llama-index-llms-anthropic","tags":["LLM","Anthropic","LlamaIndex","AI","Integration"],"install":[{"cmd":"pip install llama-index-llms-anthropic llama-index","lang":"bash","label":"Install package and LlamaIndex core"}],"dependencies":[{"reason":"Required for core LlamaIndex functionality such as `Settings` and `ChatMessage`.","package":"llama-index-core","optional":false},{"reason":"The underlying Python client library for the Anthropic API.","package":"anthropic","optional":false},{"reason":"Python version compatibility.","package":"python","optional":false}],"imports":[{"symbol":"Anthropic","correct":"from llama_index.llms.anthropic import Anthropic"},{"note":"ServiceContext was deprecated in LlamaIndex v0.10 and removed in v0.11; use Settings instead.","wrong":"from llama_index.core.service_context import ServiceContext","symbol":"Settings","correct":"from llama_index.core import Settings"},{"note":"ChatMessage is a core LlamaIndex object, not specific to the Anthropic integration.","wrong":"from llama_index.llms.anthropic import ChatMessage","symbol":"ChatMessage","correct":"from llama_index.core.llms import ChatMessage"}],"quickstart":{"code":"import os\nfrom llama_index.llms.anthropic import Anthropic\nfrom llama_index.core import Settings\n\n# Read the API key from the environment; replace the placeholder with your own key\nos.environ[\"ANTHROPIC_API_KEY\"] = os.environ.get(\"ANTHROPIC_API_KEY\", \"YOUR_ANTHROPIC_API_KEY\")\n\n# Initialize the Anthropic LLM\nllm = Anthropic(model=\"claude-3-opus-20240229\")\n\n# Set the global tokenizer for accurate token counting (important for Anthropic models)\nSettings.tokenizer = llm.tokenizer\n\n# Make a completion call\nresp = llm.complete(\"What is the capital of France?\")\nprint(resp)\n","lang":"python","description":"This quickstart initializes the Anthropic LLM with a Claude 3 Opus model, reads the Anthropic API key from an environment variable, configures the LlamaIndex global tokenizer for accurate token counting, and makes a basic text completion call."},"warnings":[{"fix":"Migrate from `ServiceContext` to `Settings` (`from llama_index.core import Settings`) and replace `LLMPredictor` usage with direct use of `LLM` classes or other LlamaIndex components. Consult the LlamaIndex migration guides.","message":"LlamaIndex Core v0.10/v0.11 introduced significant breaking changes: `ServiceContext` was removed in favor of `Settings`, and `LLMPredictor` was deprecated. Code written for pre-0.10 LlamaIndex versions that relies on these abstractions will break.","severity":"breaking","affected_versions":"LlamaIndex Core <0.10"},{"fix":"Set the `ANTHROPIC_API_KEY` environment variable or pass the `api_key` argument explicitly when initializing the `Anthropic` class: `llm = Anthropic(api_key=\"sk-...\")`.","message":"Ensure your `ANTHROPIC_API_KEY` is correctly configured. The `Anthropic` class looks for the `ANTHROPIC_API_KEY` environment variable by default; if it is unset or invalid, API calls fail with authentication errors.","severity":"gotcha","affected_versions":"All versions"},{"fix":"After initializing your `Anthropic` LLM instance, assign its tokenizer to the global settings: `Settings.tokenizer = llm.tokenizer`.","message":"For accurate token counting with Anthropic models (especially Claude 3 models), explicitly set the LlamaIndex global tokenizer to the Anthropic tokenizer. The default LlamaIndex tokenizer (often `tiktoken`) produces incorrect token counts for Claude models, which can cause context overflow errors or unexpected behavior.","severity":"gotcha","affected_versions":"All versions"},{"fix":"Check the version requirements of all `llama-index` packages and their dependencies. Upgrade all `llama-index-*` packages to their latest compatible versions, or pin the `anthropic` client version explicitly, to resolve conflicts.","message":"Dependency conflicts with the underlying `anthropic` Python client can arise when combining `llama-index-llms-anthropic` with other `llama-index` integrations (e.g., `llama-index-multi-modal-llms-anthropic`). Older versions of some integrations pin strict, mutually incompatible `anthropic` client versions.","severity":"gotcha","affected_versions":"All versions, especially when combining with other `llama-index` integrations."}],"env_vars":null,"last_verified":"2026-04-15T00:00:00.000Z","next_check":"2026-07-14T00:00:00.000Z","problems":[]}