Azure AI Agent Server Core
Azure AI Agent Server Core (`azure-ai-agentserver-core`) provides foundational utilities and a host framework for deploying Azure AI Hosted Agents. It lets developers package their agents as containers and run them in the cloud, where clients interact with them through the `azure-ai-projects` SDK. The library is part of the broader Azure SDK for Python; its current version, 2.0.0b1, marks it as an actively developed beta release.
Common errors
- ModuleNotFoundError: No module named 'agent_framework.azure'
  cause: Attempting to import an old `agent_framework` pattern (e.g., `AzureAIProjectAgentProvider`) from a previous architectural iteration that is no longer included in the leaner core package.
  fix: Refactor your agent code to use `FoundryCBAgent` from `azure.ai.agentserver.core` and follow the current patterns for defining and hosting agents, which assume agents are pre-configured in Azure AI Foundry. Consult the latest quickstart and migration guides.
- Agent fails to run locally or in the cloud with generic configuration errors or authentication failures.
  cause: Incorrect or missing environment variables for the Azure AI Foundry project endpoint or authentication credentials (e.g., `AZURE_CLIENT_ID`, `AZURE_TENANT_ID`, `AZURE_CLIENT_SECRET`), or insufficient role-based access control (RBAC) permissions for the agent's identity in Azure.
  fix: Verify that all required environment variables (e.g., `AZURE_AI_PROJECT_ENDPOINT`) are set correctly. Ensure the identity running the agent (local developer or deployed service principal/managed identity) has the 'Azure AI Developer' role, or a custom equivalent, on the Azure AI Foundry project resource.
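To fail fast on missing configuration before the agent starts, you can validate the environment up front. A minimal sketch; the variable names listed are illustrative and your deployment may require a different set:

```python
import os

def check_required_env(names):
    """Return the subset of environment variable names that are unset or empty."""
    return [n for n in names if not os.environ.get(n)]

# Illustrative list; adjust to what your agent actually needs.
REQUIRED = ["AZURE_AI_PROJECT_ENDPOINT", "AZURE_CLIENT_ID", "AZURE_TENANT_ID"]

missing = check_required_env(REQUIRED)
if missing:
    print(f"Warning: missing environment variables: {', '.join(missing)}")
```

Logging (or exiting) on missing variables at startup gives a clearer signal than the generic authentication failures described above.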
Warnings
- breaking The Agent Framework, which `azure-ai-agentserver-core` is part of, underwent a significant architectural shift leading up to its 1.0.0 release. Provider-specific implementations (e.g., for OpenAI or Azure AI Projects directly creating agents) were moved out of the core package into dedicated, lightweight packages.
- gotcha This library (`azure-ai-agentserver-core`) is primarily a *host framework* for your agent's logic. To interact with or manage agents hosted in Azure AI Foundry (e.g., sending messages, creating threads), you typically use the `azure-ai-projects` client library or `azure-ai-agents` SDK.
- gotcha As a beta package (indicated by `b1` in the version), the API of `azure-ai-agentserver-core` is subject to change. Breaking changes may occur in future beta or stable releases without the deprecation warnings typical of stable versions, so pin the exact version in your dependencies.
Install
-
pip install azure-ai-agentserver-core
Imports
- FoundryCBAgent
from azure.ai.agentserver.core import FoundryCBAgent
- CreateResponse
from azure.ai.agentserver.core.models import CreateResponse
- Response as OpenAIResponse
from azure.ai.agentserver.core.models import Response as OpenAIResponse
- ItemContentOutputText
from azure.ai.agentserver.core.models.projects import ItemContentOutputText
- ResponsesAssistantMessageItemResource
from azure.ai.agentserver.core.models.projects import ResponsesAssistantMessageItemResource
- ResponseTextDeltaEvent
from azure.ai.agentserver.core.models.projects import ResponseTextDeltaEvent
- ResponseTextDoneEvent
from azure.ai.agentserver.core.models.projects import ResponseTextDoneEvent
Quickstart
import datetime
from azure.ai.agentserver.core import FoundryCBAgent
from azure.ai.agentserver.core.models import (
    CreateResponse,
    Response as OpenAIResponse,
)
from azure.ai.agentserver.core.models.projects import (
    ItemContentOutputText,
    ResponsesAssistantMessageItemResource,
    ResponseTextDeltaEvent,
    ResponseTextDoneEvent,
)


def stream_events(text: str):
    """Yield a delta event per token, then a final done event with the full text."""
    assembled = ""
    tokens = text.split(" ")
    for i, token in enumerate(tokens):
        # Keep the trailing space on every piece except the last.
        piece = token if i == len(tokens) - 1 else token + " "
        assembled += piece
        yield ResponseTextDeltaEvent(delta=piece)
    yield ResponseTextDoneEvent(text=assembled)


async def agent_run(request_body: CreateResponse):
    print(f"Agent request received: {request_body.agent}")
    if request_body.stream:
        # Streaming mode: return a generator of delta/done events.
        return stream_events("I am a mock agent with no intelligence in stream mode.")
    # Non-streaming mode: build a complete assistant response.
    output_content = [
        ItemContentOutputText(
            text="I am a mock agent with no intelligence, responding to your request.",
            annotations=[],
        )
    ]
    return OpenAIResponse(
        metadata={},
        temperature=0.0,
        top_p=0.0,
        user="mock_user",
        id="mock_id",
        created_at=datetime.datetime.now(),
        output=[
            ResponsesAssistantMessageItemResource(
                status="completed",
                content=output_content,
            )
        ],
    )


my_agent = FoundryCBAgent()
my_agent.agent_run = agent_run

if __name__ == "__main__":
    # For cloud deployment, set the project and credential environment variables,
    # e.g. AZURE_AI_PROJECT_ENDPOINT, AZURE_CLIENT_ID, AZURE_TENANT_ID,
    # AZURE_CLIENT_SECRET. A simple mock agent may run locally without them.
    print("Starting Azure AI Agent Server Core mock agent...")
    my_agent.run()
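The token-chunking behavior of `stream_events` above can be illustrated without any SDK types. A pure-Python sketch of the same logic, where every piece except the last keeps its trailing space so that concatenating the pieces reproduces the original text:

```python
def chunk_tokens(text: str):
    """Split text on spaces; all pieces except the last keep a trailing space,
    so joining the pieces restores the original string exactly."""
    tokens = text.split(" ")
    return [t if i == len(tokens) - 1 else t + " " for i, t in enumerate(tokens)]

pieces = chunk_tokens("hello streaming world")
print(pieces)           # ['hello ', 'streaming ', 'world']
print("".join(pieces))  # hello streaming world
```

This round-trip property is what lets the `ResponseTextDoneEvent` carry the fully assembled text at the end of the stream.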