LangChain AWS Integrations

1.4.3 · active · verified Thu Apr 09

langchain-aws provides a robust integration layer connecting AWS services like Amazon Bedrock, Amazon SageMaker, and Amazon Comprehend with the LangChain framework. It simplifies access to AWS-powered LLMs, embeddings, and vector stores within LangChain applications. The library is actively maintained with frequent releases, often bi-weekly or monthly, to support new AWS features and LangChain capabilities. The current version is 1.4.3.

Warnings

Install
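
The package is published on PyPI under the same name as the import prefix:

```shell
pip install -U langchain-aws
```
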

Imports
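
The classes used on this page are re-exported at the top level of `langchain_aws`. The guarded import below is a representative subset and degrades gracefully when the package is not installed:

```python
# Representative top-level imports from langchain-aws. The try/except lets
# the snippet run even in an environment without the package.
try:
    from langchain_aws import (
        ChatBedrock,        # chat models served by Amazon Bedrock
        BedrockLLM,         # text-completion models on Bedrock
        BedrockEmbeddings,  # Bedrock embedding models
    )
    HAS_LANGCHAIN_AWS = True
except ImportError:
    HAS_LANGCHAIN_AWS = False

print(f"langchain-aws importable: {HAS_LANGCHAIN_AWS}")
```
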

Quickstart

This quickstart demonstrates how to initialize `ChatBedrock` and use it to invoke a large language model. It reads the `AWS_REGION_NAME` environment variable (falling back to `us-east-1`) and requires valid AWS credentials configured for `boto3`, with access to the requested model enabled in Amazon Bedrock.

import os
from langchain_aws.chat_models import ChatBedrock
from langchain_core.messages import HumanMessage

# Ensure AWS_REGION_NAME environment variable is set
# and AWS credentials are configured (e.g., via ~/.aws/credentials or env vars)

region = os.environ.get("AWS_REGION_NAME", "us-east-1")
model_id = "anthropic.claude-3-sonnet-20240229-v1:0" # Example Bedrock model ID

try:
    llm = ChatBedrock(
        model_id=model_id,
        region_name=region
    )

    messages = [
        HumanMessage(
            content="Tell me a short story about a brave knight and a wise dragon."
        )
    ]

    print(f"Invoking {model_id} in {region}...")
    response = llm.invoke(messages)
    print("\n--- Response ---")
    print(response.content)

except Exception as e:
    print(f"An error occurred: {e}")
    print("Please ensure AWS_REGION_NAME is set and AWS credentials are configured.")
    print("Example: export AWS_REGION_NAME=us-east-1")
    print("See https://boto3.amazonaws.com/v1/documentation/api/latest/guide/credentials.html")
