LlamaIndex Bedrock Embeddings Integration

0.8.0 · active · verified Thu Apr 16

The `llama-index-embeddings-bedrock` library integrates Amazon Bedrock embedding models into the LlamaIndex framework, letting developers generate text embeddings with Bedrock-hosted models such as Amazon Titan and Cohere Embed. The library is actively maintained, with version 0.8.0 released on March 12, 2026, and receives frequent updates to track changes in LlamaIndex core and the Bedrock API.


Install
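The integration ships as its own PyPI package; the package name below is taken from the library name stated above:

```shell
pip install llama-index-embeddings-bedrock
```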

Imports

Quickstart

This quickstart demonstrates how to initialize the `BedrockEmbedding` class, configure AWS credentials and region, and generate an embedding for a given text. It highlights using environment variables for sensitive information.

import os
from llama_index.embeddings.bedrock import BedrockEmbedding

# Configure AWS credentials and region via environment variables or explicitly
# os.environ['AWS_ACCESS_KEY_ID'] = 'YOUR_ACCESS_KEY'
# os.environ['AWS_SECRET_ACCESS_KEY'] = 'YOUR_SECRET_KEY'
# os.environ['AWS_DEFAULT_REGION'] = 'us-east-1'  # standard AWS SDK region variable

# Initialize the embedding model
embed_model = BedrockEmbedding(
    model_name="cohere.embed-english-v3",  # example model; see list_supported_models()
    region_name=os.environ.get('AWS_DEFAULT_REGION', 'us-east-1'),
    # Optionally, pass credentials directly or via profile_name
    aws_access_key_id=os.environ.get('AWS_ACCESS_KEY_ID'),
    aws_secret_access_key=os.environ.get('AWS_SECRET_ACCESS_KEY'),
    # profile_name='my-aws-profile'
)

# Get a single embedding
text = "Hello, world! This is a test document."
embedding = embed_model.get_text_embedding(text)
print(f"Embedding length: {len(embedding)}")
print(f"First 5 embedding values: {embedding[:5]}")

# List supported models
# supported_models = BedrockEmbedding.list_supported_models()
# print("Supported models:", supported_models)
