Azure AI Foundry Integration for Microsoft Agent Framework

1.0.0rc6 · active · verified Sun Apr 12

The `agent-framework-azure-ai` library integrates various Azure AI services (e.g., Azure OpenAI, Azure AI Search, Azure Speech, Azure Vision) with the Microsoft Agent Framework, letting developers incorporate these Azure capabilities into their agent-based applications. Currently at version `1.0.0rc6`, its release cadence is tied to the evolving Microsoft Agent Framework and Azure AI services, with new release candidates emerging as features stabilize.

Warnings

Install
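The package name given above suggests a standard pip install. Note that because the current version is a release candidate (`1.0.0rc6`), pip will skip it unless pre-releases are allowed; the `--pre` flag (or an explicit version pin) handles that. This is a sketch based on the package name stated in the introduction, not a command confirmed by the source:

```shell
# Install the Azure AI integration package; --pre is needed
# because only release-candidate versions are published so far.
pip install --pre agent-framework-azure-ai

# Alternatively, pin the exact release candidate:
pip install agent-framework-azure-ai==1.0.0rc6
```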

Imports

The quickstart below relies on the following imports:

import os
from agent_framework.core.agent import Agent
from agent_framework.core.model import ModelConfig
from agent_framework_azure_ai.azure_open_ai_service import AzureOpenAIService

Quickstart

This quickstart demonstrates how to initialize the `AzureOpenAIService` provided by the library and use it as the LLM for a Microsoft Agent Framework agent. It requires valid Azure OpenAI credentials to be set as environment variables.

import os
from agent_framework.core.agent import Agent
from agent_framework.core.model import ModelConfig
from agent_framework_azure_ai.azure_open_ai_service import AzureOpenAIService

# Ensure environment variables are set for Azure OpenAI
azure_openai_endpoint = os.environ.get("AZURE_OPENAI_ENDPOINT", "https://YOUR_AZURE_OPENAI_ENDPOINT.openai.azure.com/")
azure_openai_api_key = os.environ.get("AZURE_OPENAI_API_KEY", "YOUR_AZURE_OPENAI_API_KEY")
azure_openai_deployment_name = os.environ.get("AZURE_OPENAI_DEPLOYMENT_NAME", "YOUR_DEPLOYMENT_NAME")

if any(placeholder in value
       for placeholder in ("YOUR_AZURE_OPENAI", "YOUR_DEPLOYMENT_NAME")
       for value in (azure_openai_endpoint, azure_openai_api_key, azure_openai_deployment_name)):
    print("Please set the AZURE_OPENAI_ENDPOINT, AZURE_OPENAI_API_KEY, and AZURE_OPENAI_DEPLOYMENT_NAME environment variables.")
    print("Example: export AZURE_OPENAI_ENDPOINT='https://your-resource.openai.azure.com/'")
    print("Example: export AZURE_OPENAI_API_KEY='...'")
    print("Example: export AZURE_OPENAI_DEPLOYMENT_NAME='gpt-4o'")
    raise SystemExit(1)

# 1. Initialize the Azure OpenAI Service as an LLM
azure_llm_service = AzureOpenAIService(
    azure_endpoint=azure_openai_endpoint,
    api_key=azure_openai_api_key,
    deployment_name=azure_openai_deployment_name,
    model_config=ModelConfig(max_tokens=256) # Optional: Configure LLM behavior
)

# 2. Create an Agent and attach the Azure LLM to it
agent = Agent(name="AzureAIAgent").with_llm(azure_llm_service)

# 3. Use the agent to get a response
print(f"Agent '{agent.name}' is ready to chat...")
response = agent.chat("What is the capital of France?")
print(f"Agent response: {response.output}")

# Example with a simple tool (if using other Azure AI services as tools)
# from agent_framework.core.tool import Tool
# class MyAzureAISearchTool(Tool):
#     def call(self, query: str) -> str:
#         # Placeholder for actual Azure AISearchService logic
#         return f"Searching Azure AI Search for: {query}"
# agent.with_tool(MyAzureAISearchTool())
# response_with_tool = agent.chat("Find documents about renewable energy.")
