LangChain AWS Integrations
langchain-aws provides a robust integration layer connecting AWS services like Amazon Bedrock, Amazon SageMaker, and Amazon Comprehend with the LangChain framework. It simplifies access to AWS-powered LLMs, embeddings, and vector stores within LangChain applications. The library is actively maintained with frequent releases, often bi-weekly or monthly, to support new AWS features and LangChain capabilities. The current version is 1.4.3.
Warnings
- breaking The `langchain-aws` library was split from the main `langchain` package in late 2023 / early 2024. All AWS-specific integrations (e.g., Bedrock, SageMaker) now reside in `langchain-aws` and require explicit installation and updated import paths.
- gotcha Proper configuration of AWS credentials and region is essential. `langchain-aws` relies on `boto3`'s default credential chain. Misconfiguration can lead to `ClientError` (e.g., 'NotAuthorizedException', 'UnrecognizedClientException').
- gotcha Bedrock model IDs are specific and may vary by AWS region or API availability. Using an incorrect or unavailable model ID will result in runtime errors from Bedrock.
- gotcha LangChain differentiates between chat models (`ChatBedrock`) and text completion models (`BedrockLLM`). Their interfaces and expected input/output formats (e.g., `HumanMessage` for chat, string for LLM) differ. Using the wrong class or input format can cause unexpected behavior or errors.
Install
pip install langchain-aws
Imports
- ChatBedrock
from langchain_aws.chat_models import ChatBedrock
- BedrockLLM
from langchain_aws.llms import BedrockLLM
- BedrockEmbeddings
from langchain_aws.embeddings import BedrockEmbeddings
Quickstart
import os
from langchain_aws.chat_models import ChatBedrock
from langchain_core.messages import HumanMessage
# Ensure AWS_REGION_NAME environment variable is set
# and AWS credentials are configured (e.g., via ~/.aws/credentials or env vars)
region = os.environ.get("AWS_REGION_NAME", "us-east-1")
model_id = "anthropic.claude-3-sonnet-20240229-v1:0" # Example Bedrock model ID
try:
    llm = ChatBedrock(
        model_id=model_id,
        region_name=region
    )
    messages = [
        HumanMessage(
            content="Tell me a short story about a brave knight and a wise dragon."
        )
    ]
    print(f"Invoking {model_id} in {region}...")
    response = llm.invoke(messages)
    print("\n--- Response ---")
    print(response.content)
except Exception as e:
    print(f"An error occurred: {e}")
    print("Please ensure AWS_REGION_NAME is set and AWS credentials are configured.")
    print("Example: export AWS_REGION_NAME=us-east-1")
    print("See https://boto3.amazonaws.com/v1/documentation/api/latest/guide/credentials.html")