Azure AI Foundry Integration for Microsoft Agent Framework
The `agent-framework-azure-ai` library provides an integration layer between the Microsoft Agent Framework and various Azure AI services (e.g., Azure OpenAI, Azure AI Search, Azure Speech, Azure Vision). It lets developers incorporate these Azure capabilities into their agent-based applications. Currently at version `1.0.0rc6`, its release cadence tracks the evolving Microsoft Agent Framework and Azure AI services, with new release candidates appearing as features stabilize.
Warnings
- breaking This library is currently in Release Candidate (RC) status (`1.0.0rc6`). APIs, module paths, and behaviors are subject to significant change without backward compatibility guarantees until a stable `1.0.0` release. Users should expect frequent updates and potential breaking changes.
- gotcha This library is an extension for the `agent-framework`. It *requires* `agent-framework` to be installed and understood. Core concepts like `Agent`, `LLM`, `Tool` come from the parent framework, not this library.
- gotcha Proper configuration of Azure AI service credentials (endpoint, API key, deployment name, etc.) via environment variables or explicit parameters is critical. Misconfigurations often lead to authentication errors or incorrect service routing.
- breaking The library explicitly pins `pydantic<2` in its `install_requires`. Using `pydantic>=2` in your project alongside this library will cause dependency conflicts and likely runtime errors.
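The `pydantic<2` pin above can be enforced early with a small version guard instead of failing at runtime. The helper below is a hedged, stdlib-only sketch; the function name and the startup-check pattern are illustrative, not part of this library:

```python
def pydantic_compatible(version: str) -> bool:
    """Return True if a pydantic version string satisfies the library's `pydantic<2` pin.

    Illustrative helper only -- the library itself does not ship this function.
    """
    major = int(version.split(".")[0])
    return major < 2
```

At startup you might call `assert pydantic_compatible(pydantic.VERSION)` (pydantic v1 exposes its version as `pydantic.VERSION`) before constructing any services, so a v2 install fails fast with a clear message rather than with obscure validation errors.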
Install
- pip install agent-framework-azure-ai
Imports
- AzureOpenAIService
from agent_framework_azure_ai.azure_open_ai_service import AzureOpenAIService
- AzureAISearchService
from agent_framework_azure_ai.ai_search_service import AzureAISearchService
- AzureSpeechService
from agent_framework_azure_ai.azure_speech_service import AzureSpeechService
- AzureVisionService
from agent_framework_azure_ai.azure_vision_service import AzureVisionService
- Agent
from agent_framework.core.agent import Agent
Quickstart
import os
from agent_framework.core.agent import Agent
from agent_framework.core.model import ModelConfig
from agent_framework_azure_ai.azure_open_ai_service import AzureOpenAIService
# Ensure environment variables are set for Azure OpenAI
azure_openai_endpoint = os.environ.get("AZURE_OPENAI_ENDPOINT", "https://YOUR_AZURE_OPENAI_ENDPOINT.openai.azure.com/")
azure_openai_api_key = os.environ.get("AZURE_OPENAI_API_KEY", "YOUR_AZURE_OPENAI_API_KEY")
azure_openai_deployment_name = os.environ.get("AZURE_OPENAI_DEPLOYMENT_NAME", "YOUR_DEPLOYMENT_NAME")
if ("YOUR_AZURE_OPENAI" in azure_openai_endpoint
        or "YOUR_AZURE_OPENAI" in azure_openai_api_key
        or "YOUR_DEPLOYMENT_NAME" in azure_openai_deployment_name):
    print("Please set AZURE_OPENAI_ENDPOINT, AZURE_OPENAI_API_KEY, and AZURE_OPENAI_DEPLOYMENT_NAME environment variables.")
    print("Example: export AZURE_OPENAI_ENDPOINT='https://your-resource.openai.azure.com/'")
    print("Example: export AZURE_OPENAI_API_KEY='...'")
    print("Example: export AZURE_OPENAI_DEPLOYMENT_NAME='gpt-4o'")
    raise SystemExit(1)
# 1. Initialize the Azure OpenAI Service as an LLM
azure_llm_service = AzureOpenAIService(
    azure_endpoint=azure_openai_endpoint,
    api_key=azure_openai_api_key,
    deployment_name=azure_openai_deployment_name,
    model_config=ModelConfig(max_tokens=256),  # Optional: configure LLM behavior
)
# 2. Create an Agent and provide it the Azure LLM
agent = Agent(name="AzureAIAgent").with_llm(azure_llm_service)
# 3. Use the agent to get a response
print(f"Agent '{agent.name}' is ready to chat...")
response = agent.chat("What is the capital of France?")
print(f"Agent response: {response.output}")
# Example with a simple tool (if using other Azure AI services as tools)
# from agent_framework.core.tool import Tool
#
# class MyAzureAISearchTool(Tool):
#     def call(self, query: str) -> str:
#         # Placeholder for actual AzureAISearchService logic
#         return f"Searching Azure AI Search for: {query}"
#
# agent.with_tool(MyAzureAISearchTool())
# response_with_tool = agent.chat("Find documents about renewable energy.")
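The placeholder check in the quickstart can be factored into a reusable, testable helper. The sketch below is a stdlib-only illustration; the function name and the reliance on the `YOUR_` placeholder convention are assumptions drawn from this quickstart, not an API of the library:

```python
import os

# The three settings the quickstart requires (illustrative list, not library API)
REQUIRED_SETTINGS = (
    "AZURE_OPENAI_ENDPOINT",
    "AZURE_OPENAI_API_KEY",
    "AZURE_OPENAI_DEPLOYMENT_NAME",
)

def missing_azure_settings(env=None):
    """Return names of required settings that are unset, empty, or still placeholders."""
    env = os.environ if env is None else env
    return [
        name
        for name in REQUIRED_SETTINGS
        if not env.get(name) or "YOUR_" in env[name]
    ]
```

Calling `missing_azure_settings()` before service construction and raising `SystemExit` with the returned names gives the same fail-fast behavior as the inline check, but keeps the quickstart body focused on the agent wiring.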