Foundry Local Integration for Microsoft Agent Framework

1.0.0b260409 · active · verified Thu Apr 16

agent-framework-foundry-local is a Python package that provides an integration client for the Microsoft Agent Framework to run large language models (LLMs) locally using Foundry Local. It lets developers build and orchestrate AI agents that use local inference, with no cloud API keys required. The package is currently in beta, while the broader Microsoft Agent Framework recently reached version 1.0.0, indicating active development and a rapid release cadence across the ecosystem.

Install

pip install agent-framework-foundry-local

Imports

import asyncio
from agent_framework import Agent
from agent_framework.foundry import FoundryLocalClient

Quickstart

This quickstart demonstrates how to create and run a simple AI agent using FoundryLocalClient to connect to a locally hosted LLM via the Foundry Local runtime. Before running it, ensure the Foundry Local CLI is installed, the service is running (`foundry service start`), and the specified model (e.g., `phi-4-mini`) has been downloaded locally.
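A typical setup sequence with the Foundry Local CLI might look like the sketch below. `foundry service start` is the command referenced above; the `foundry model download` and `foundry model list` subcommands are assumptions about the CLI's verbs, so verify them against `foundry --help` on your machine.

```shell
# Start the local inference service (referenced in the quickstart above)
foundry service start

# Fetch the model used by the quickstart (subcommand name assumed; check 'foundry --help')
foundry model download phi-4-mini

# Confirm the model is available locally (subcommand name assumed)
foundry model list
```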

import asyncio
from agent_framework import Agent
from agent_framework.foundry import FoundryLocalClient

async def main():
    # Pass the model explicitly, or set it via an environment variable,
    # e.g. FOUNDRY_LOCAL_MODEL=phi-4-mini, before launching.
    # Ensure the Foundry Local CLI is installed and 'foundry service start' has been run.

    # Requires the 'phi-4-mini' model to be downloaded via the Foundry Local CLI.
    client = FoundryLocalClient(model='phi-4-mini')
    agent = Agent(
        client=client,
        name='LocalAssistant',
        instructions='You are a helpful local assistant that runs entirely on my machine.'
    )

    print("Local agent created. Asking a question...")
    result = await agent.run("Tell me a fun fact about Python.")
    print(f"Agent response: {result}")

if __name__ == '__main__':
    asyncio.run(main())
