LlamaIndex Bedrock Converse LLM

0.14.5 · active · verified Wed Apr 15

The `llama-index-llms-bedrock-converse` library integrates LlamaIndex with Amazon Bedrock's Converse API. LlamaIndex is an open-source framework for building LLM applications over custom data. The integration exposes Bedrock-hosted models such as Anthropic Claude, Cohere Command, and Mistral Large, with native support for function calling and streaming, and is the recommended way to use Bedrock LLMs in LlamaIndex applications.

Warnings

Install
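The package is published on PyPI; a typical installation (assuming `pip`) looks like:

```shell
pip install llama-index-llms-bedrock-converse
```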

Imports

Quickstart

Initializes the `BedrockConverse` LLM and performs a simple text completion. You need AWS credentials (a profile, access keys, or environment variables) and the ID of a Bedrock model that supports the Converse API.

import os
from llama_index.llms.bedrock_converse import BedrockConverse

# Ensure your AWS credentials and region are set via environment variables
# or AWS CLI configuration (e.g., AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_REGION)

# Replace with a model that supports the Converse API, e.g., 'anthropic.claude-3-haiku-20240307-v1:0'
# or 'anthropic.claude-3-sonnet-20240229-v1:0'
llm = BedrockConverse(
    model="anthropic.claude-3-sonnet-20240229-v1:0",
    region_name=os.environ.get("AWS_REGION", "us-east-1"),
    # profile_name="default",  # uncomment and set if using an AWS profile
    # aws_access_key_id=os.environ.get("AWS_ACCESS_KEY_ID", ""),
    # aws_secret_access_key=os.environ.get("AWS_SECRET_ACCESS_KEY", ""),
    # aws_session_token=os.environ.get("AWS_SESSION_TOKEN", ""),
)

response = llm.complete("Tell me a short story about a brave knight.")
print(response.text)
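Streaming and multi-turn chat follow the standard LlamaIndex LLM interface. A sketch reusing the `llm` object from the quickstart (this still requires valid AWS credentials and access to the chosen Bedrock model):

```python
from llama_index.core.llms import ChatMessage

# Stream tokens as they are generated instead of waiting for the full response
for chunk in llm.stream_complete("Tell me a short story about a brave knight."):
    print(chunk.delta, end="", flush=True)
print()

# Multi-turn chat with explicit roles
messages = [
    ChatMessage(role="system", content="You are a terse storyteller."),
    ChatMessage(role="user", content="One sentence about a brave knight."),
]
chat_response = llm.chat(messages)
print(chat_response.message.content)
```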
