LangChain Azure AI

1.2.2 · active · verified Thu Apr 16

langchain-azure-ai is an integration package that provides first-class support for Azure AI Foundry capabilities within the LangChain and LangGraph ecosystems. The current release is 1.2.2, and the package is actively maintained with frequent updates and patches.

Install

pip install -U langchain-azure-ai
Imports

from azure.identity import DefaultAzureCredential
from langchain_azure_ai.chat_models import AzureAIOpenAIApiChatModel
from langchain_core.messages import HumanMessage, SystemMessage
Quickstart

This quickstart shows how to initialize `AzureAIOpenAIApiChatModel` and run a chat completion. It supports both API key and Azure Entra ID (via `DefaultAzureCredential`) authentication, reading configuration from environment variables.

import os
from azure.identity import DefaultAzureCredential
from langchain_azure_ai.chat_models import AzureAIOpenAIApiChatModel
from langchain_core.messages import HumanMessage, SystemMessage

# Ensure these environment variables are set:
# os.environ["AZURE_AI_PROJECT_ENDPOINT"] = "https://{your-resource-name}.services.ai.azure.com/openai/v1"
# os.environ["AZURE_AI_API_KEY"] = "your-api-key"

# Prefer an API key from the environment; fall back to Entra ID
# authentication via DefaultAzureCredential when no key is set.
api_key = os.environ.get("AZURE_AI_API_KEY")
credential = api_key if api_key else DefaultAzureCredential()

model = AzureAIOpenAIApiChatModel(
    endpoint=os.environ.get("AZURE_AI_PROJECT_ENDPOINT", "https://your-resource-name.services.ai.azure.com/openai/v1"),
    credential=credential,
    model="gpt-4o",  # your deployed model (deployment) name
)

messages = [
    SystemMessage(content="You are a helpful translator. Translate the following from English into Italian."),
    HumanMessage(content="I love programming.")
]

response = model.invoke(messages)
print(response.content)
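The key-or-credential fallback used above can be factored into a small helper. A minimal sketch in plain Python; `resolve_credential` is a hypothetical helper (not part of langchain-azure-ai), and the lambda stands in for `DefaultAzureCredential` so the example runs without the Azure SDK:

```python
import os

def resolve_credential(env_var, make_default):
    """Return the API key from the environment if set; otherwise build a
    default credential lazily by calling make_default()."""
    api_key = os.environ.get(env_var)
    return api_key if api_key is not None else make_default()

# Demonstration with a stand-in factory instead of DefaultAzureCredential:
os.environ["DEMO_AZURE_AI_API_KEY"] = "secret-123"
print(resolve_credential("DEMO_AZURE_AI_API_KEY", lambda: "entra-id-credential"))  # secret-123

del os.environ["DEMO_AZURE_AI_API_KEY"]
print(resolve_credential("DEMO_AZURE_AI_API_KEY", lambda: "entra-id-credential"))  # entra-id-credential
```

Passing the factory as a callable keeps credential construction lazy: `DefaultAzureCredential` probes several authentication sources and is comparatively expensive, so it is only built when no key is present.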
