LlamaIndex Bedrock Converse LLM
The `llama-index-llms-bedrock-converse` library provides an integration for LlamaIndex with Amazon Bedrock's Converse API. LlamaIndex is an open-source framework for building LLM applications over custom data. This integration allows users to leverage various Amazon Bedrock models, such as Claude, Command, and Mistral Large, with native support for function calling and streaming, offering a unified and recommended approach to using Bedrock LLMs within LlamaIndex applications.
Warnings
- deprecated The older `llama-index-llms-bedrock` package is deprecated in favor of `llama-index-llms-bedrock-converse`. The Converse API is the recommended way to use Bedrock LLMs with LlamaIndex.
- gotcha When using an AWS Application Inference Profile ARN, the `model` argument provided to `BedrockConverse` must accurately match the underlying model referenced by the profile. `BedrockConverse` does not perform this validation, and a mismatch can lead to undefined behavior.
- gotcha AWS authentication and region configuration are critical. Errors like 'Invalid payload!' can occur if the ECS task or deployment environment lacks the necessary IAM permissions (`bedrock:InvokeModel`), has incorrect region settings, or runs in an improperly configured VPC (missing internet access or a VPC endpoint for Bedrock).
- breaking LlamaIndex v0.10.0 introduced a significant packaging refactor, splitting integrations (like this one) into separate PyPI packages. The `ServiceContext` abstraction was deprecated and later fully removed in v0.11, requiring direct specification of LLM and embedding arguments or use of the new `Settings` object. `LLMPredictor` was also deprecated.
- breaking A regression in `llama-index-llms-bedrock-converse` versions around `0.12.11` to `0.14.x` can cause `ValidationException` when using 'extended thinking' and tools. This is due to changes in how `reasoningContent` (specifically `signature`) is handled in API responses, where `reasoningContent["text"]` is expected but not always present.
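Since the `ServiceContext` removal noted above forces a migration, here is a minimal sketch of the v0.11+ configuration style using the global `Settings` object; the model ID and region are illustrative assumptions:

```python
# Sketch of post-v0.11 configuration: the global Settings object replaces
# the removed ServiceContext. Model ID and region here are placeholders.
from llama_index.core import Settings
from llama_index.llms.bedrock_converse import BedrockConverse

Settings.llm = BedrockConverse(
    model="anthropic.claude-3-sonnet-20240229-v1:0",
    region_name="us-east-1",
)
# Components that previously took a ServiceContext now read from Settings,
# or accept an explicit llm=... keyword argument instead.
```

Passing `llm=...` directly to a component overrides the global `Settings.llm` for that component only.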
Install
pip install llama-index-llms-bedrock-converse
Imports
- BedrockConverse
from llama_index.llms.bedrock_converse import BedrockConverse
Quickstart
import os
from llama_index.llms.bedrock_converse import BedrockConverse
# Ensure your AWS credentials and region are set via environment variables
# or AWS CLI configuration (e.g., AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_REGION)
# Replace with a model that supports the Converse API, e.g., 'anthropic.claude-3-haiku-20240307-v1:0'
# or 'anthropic.claude-3-sonnet-20240229-v1:0'
llm = BedrockConverse(
    model="anthropic.claude-3-sonnet-20240229-v1:0",
    region_name=os.environ.get("AWS_REGION", "us-east-1"),
    # profile_name="default",  # uncomment and set if using a named AWS profile
    # aws_access_key_id=os.environ.get("AWS_ACCESS_KEY_ID", ""),
    # aws_secret_access_key=os.environ.get("AWS_SECRET_ACCESS_KEY", ""),
    # aws_session_token=os.environ.get("AWS_SESSION_TOKEN", ""),
)
response = llm.complete("Tell me a short story about a brave knight.")
print(response.text)
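The integration also supports streaming, as noted in the overview. A minimal streaming sketch, assuming the same AWS credential and region setup as the quickstart (this calls a live Bedrock endpoint, so it requires valid credentials and model access):

```python
# Streaming sketch: stream_complete yields partial responses as they
# arrive. Each chunk exposes the incremental text in .delta and the
# accumulated text so far in .text. Model ID is a placeholder.
from llama_index.llms.bedrock_converse import BedrockConverse

llm = BedrockConverse(
    model="anthropic.claude-3-sonnet-20240229-v1:0",
    region_name="us-east-1",
)

for chunk in llm.stream_complete("Tell me a short story about a brave knight."):
    print(chunk.delta, end="", flush=True)
print()
```

`stream_chat` works analogously for multi-turn conversations built from `ChatMessage` objects.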