LangChain Azure AI
langchain-azure-ai is an integration package that provides first-class support for Azure AI Foundry capabilities within the LangChain and LangGraph ecosystems. It currently ships at version 1.2.2 and is actively maintained, with frequent releases and patches.
Common errors
- `ModuleNotFoundError: No module named 'langchain_community'`
  cause: Some Azure-related components, particularly document loaders like `AzureAIDocumentIntelligenceLoader`, were moved to `langchain-community`.
  fix: Install the `langchain-community` package: `pip install langchain-community`. Update your import statements to `from langchain_community.document_loaders import AzureAIDocumentIntelligenceLoader`.
- `ClientAuthenticationError: (None) Unauthorized. Access token is missing, invalid, audience is incorrect...`
  cause: Incorrect or expired Azure credentials, or the assigned identity lacks the necessary permissions (e.g., the 'Azure OpenAI User' or 'Cognitive Services User' role).
  fix: Verify your Azure CLI login (`az login`) and ensure `DefaultAzureCredential` is correctly configured. Check that the service principal or user identity has the required Azure RBAC roles on your Azure AI resource. Also confirm that `endpoint` and `api_version` are correct.
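The fix above boils down to a simple rule: use an explicit API key when one is provided, otherwise fall back to Entra ID via `DefaultAzureCredential`. The `pick_credential` helper below is a hypothetical sketch of that rule, not part of langchain-azure-ai:

```python
import os

def pick_credential(env=os.environ):
    """Hypothetical helper: prefer an explicit API key, else Entra ID."""
    api_key = env.get("AZURE_AI_API_KEY")
    if api_key:
        return api_key  # pass the key string straight through as credential="..."
    # Imported lazily so this sketch runs even without azure-identity installed.
    from azure.identity import DefaultAzureCredential
    return DefaultAzureCredential()

# With a key set, the key itself is used as the credential.
assert pick_credential({"AZURE_AI_API_KEY": "test-key"}) == "test-key"
```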
- `openai.error.InvalidRequestError: Resource not found`
  cause: The Azure OpenAI endpoint or model deployment name is incorrect or missing. This can also happen if `project_endpoint` is used incorrectly or includes extra paths like `/models/chat/completions`.
  fix: Double-check your `endpoint` URL and `model` name against your Azure AI Studio deployments. Ensure the endpoint does not include extraneous paths. If using `project_endpoint`, only provide the root endpoint (e.g., `https://<project-name>.services.ai.azure.com/`).
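A quick pre-flight check for this error is to reduce whatever URL you have on hand to its root before passing it as `project_endpoint`. The `root_endpoint` helper below is an illustrative sketch, not a langchain-azure-ai API:

```python
from urllib.parse import urlparse

def root_endpoint(url: str) -> str:
    """Illustrative helper: keep only the scheme and host, dropping
    extraneous paths such as /models/chat/completions."""
    parts = urlparse(url)
    return f"{parts.scheme}://{parts.netloc}/"

print(root_endpoint(
    "https://my-project.services.ai.azure.com/models/chat/completions"
))  # -> https://my-project.services.ai.azure.com/
```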
- `ValueError: Unsupported response_format {'type': 'json_schema', 'json_schema': {...}}`
  cause: The `AzureAIChatCompletionsModel` (from the deprecated inference SDK) does not fully support OpenAI-style JSON schema structured output.
  fix: Migrate to `langchain_azure_ai.chat_models.AzureAIOpenAIApiChatModel`, which offers better support for OpenAI-compatible APIs and structured outputs. Alternatively, use `langchain-openai`, which treats Azure-compatible endpoints as OpenAI endpoints for structured-output features.
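Before migrating, you can guard against this error by inspecting the `response_format` payload you are about to send. The check below is a hypothetical pre-flight sketch, not library code:

```python
def is_json_schema_format(response_format) -> bool:
    """Return True for OpenAI-style json_schema payloads, which the
    deprecated AzureAIChatCompletionsModel does not fully support."""
    return (
        isinstance(response_format, dict)
        and response_format.get("type") == "json_schema"
    )

fmt = {"type": "json_schema", "json_schema": {"name": "Answer"}}
if is_json_schema_format(fmt):
    print("Use AzureAIOpenAIApiChatModel (or langchain-openai) instead.")
```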
Warnings
- breaking The `AzureAIChatCompletionsModel` and `AzureAIEmbeddingsModel` classes (using Azure AI Inference SDK) have been deprecated in favor of OpenAI-compatible APIs. They now reside under the `inference` sub-namespace and require the `langchain-azure-ai[v1]` extra.
- breaking The `project_connection_string` parameter for creating `AzureAIEmbeddingsModel` and `AzureAIChatCompletionsModel` is no longer supported. Use `project_endpoint` instead.
- deprecated The `AzureAIInferenceTracer` class has been removed. It is replaced by `AzureAIOpenTelemetryTracer` for better OpenTelemetry support and new GenAI semantic conventions.
- deprecated Creating agents using Foundry Agents V1 (e.g., `AgentServiceFactory`) has been deprecated in favor of the V2 implementation.
- gotcha The `init_chat_model("azure_ai:...")` convenience function requires `langchain>=1.2.13` for full functionality, especially for streaming.
- gotcha Authentication with a `project_endpoint` (set via `AZURE_AI_PROJECT_ENDPOINT` environment variable) strictly requires Microsoft Entra ID for authentication and the 'Azure AI User' role on the project.
Install
```shell
pip install -U langchain-azure-ai
pip install -U "langchain-azure-ai[tools]"
pip install -U "langchain-azure-ai[opentelemetry]"  # OpenTelemetry tracing
pip install -U "langchain-azure-ai[v1]"             # deprecated inference classes
```
Imports
- `AzureAIOpenAIApiChatModel`: `from langchain_azure_ai.chat_models import AzureAIOpenAIApiChatModel`
- `AzureAIChatCompletionsModel` (deprecated): `from langchain_azure_ai.chat_models import AzureAIChatCompletionsModel` or, explicitly, `from langchain_azure_ai.chat_models.inference import AzureAIChatCompletionsModel`
- `DefaultAzureCredential`: `from azure.identity import DefaultAzureCredential`
Quickstart
```python
import os

from azure.identity import DefaultAzureCredential
from langchain_azure_ai.chat_models import AzureAIOpenAIApiChatModel
from langchain_core.messages import HumanMessage, SystemMessage

# Ensure these environment variables are set:
# os.environ["AZURE_AI_PROJECT_ENDPOINT"] = "https://{your-resource-name}.services.ai.azure.com"
# os.environ["AZURE_AI_API_KEY"] = "your-api-key"

# Use DefaultAzureCredential for Entra ID authentication with project_endpoint;
# for API key authentication, pass the key string directly as `credential`.
credential = os.environ.get("AZURE_AI_API_KEY") or DefaultAzureCredential()

model = AzureAIOpenAIApiChatModel(
    endpoint=os.environ.get(
        "AZURE_AI_PROJECT_ENDPOINT",
        "https://your-resource-name.services.ai.azure.com/openai/v1",
    ),
    credential=credential,
    model="gpt-4o",  # your deployed model name
)

messages = [
    SystemMessage(content="You are a helpful translator. Translate the following from English into Italian."),
    HumanMessage(content="I love programming."),
]

response = model.invoke(messages)
print(response.content)
```