{"id":6957,"library":"agent-framework-ollama","title":"Ollama Integration for Microsoft Agent Framework","description":"agent-framework-ollama provides an adapter to integrate Ollama language models with the Microsoft Agent Framework. It enables agents to use local or remote Ollama instances for generating responses and processing natural language. Because the package is in beta (current version 1.0.0b260409), its API is subject to frequent change, and it is part of an active development effort within the broader Microsoft Agent Framework initiative.","status":"active","version":"1.0.0b260409","language":"en","source_language":"en","source_url":"https://github.com/microsoft/agent-framework/tree/main/python/packages/ollama","tags":["LLM","agent","ollama","microsoft","AI","framework"],"install":[{"cmd":"pip install agent-framework-ollama","lang":"bash","label":"Install latest beta"}],"dependencies":[{"reason":"This library provides an adapter for the core agent_framework. It is a mandatory dependency for creating and running agents that use Ollama.","package":"agent-framework"}],"imports":[{"symbol":"OllamaAdapter","correct":"from agent_framework_ollama import OllamaAdapter"}],"quickstart":{"code":"import asyncio\nimport os\nfrom agent_framework.agent import Agent\nfrom agent_framework.user import User\nfrom agent_framework_ollama import OllamaAdapter\n\nasync def main():\n    # Prerequisites:\n    # 1. The Ollama server must be running (e.g., `ollama serve` in your terminal)\n    # 2. 
The desired model must be pulled (e.g., `ollama pull llama3`)\n\n    # Configure Ollama connection using environment variables or defaults.\n    # Note: because a default model is supplied here, no missing-variable\n    # check is needed; override via OLLAMA_MODEL to use a different model.\n    ollama_base_url = os.environ.get(\"OLLAMA_BASE_URL\", \"http://localhost:11434\")\n    ollama_model = os.environ.get(\"OLLAMA_MODEL\", \"llama3\")\n\n    print(f\"Attempting to connect to Ollama at {ollama_base_url} with model: {ollama_model}\")\n\n    try:\n        # Initialize the OllamaAdapter\n        ollama_adapter = OllamaAdapter(\n            model=ollama_model,\n            base_url=ollama_base_url,\n        )\n\n        # Create an Agent using the OllamaAdapter\n        agent = Agent(\n            name=\"OllamaAgent\",\n            adapter=ollama_adapter,\n            # Optional: Add hooks to log messages as they are processed\n            pre_process_messages=lambda msgs: print(f\"\\n[OllamaAgent received]: {[m.content for m in msgs if m.content]}\"),\n            post_process_messages=lambda msgs: print(f\"[OllamaAgent sent]: {[m.content for m in msgs if m.content]}\")\n        )\n\n        # Create a User to interact with the agent\n        user = User(\n            name=\"User\",\n            connect_to=agent\n        )\n\n        print(\"\\n--- Starting conversation with OllamaAgent ---\")\n        await user.send(\"Hello, OllamaAgent! 
Please introduce yourself and tell me what you can do.\")\n        # In a more complex application, you'd manage message history or await a specific response.\n        # For this quickstart, we observe the interaction via the print hooks.\n        print(\"\\n--- Conversation complete ---\")\n\n    except Exception as e:\n        print(f\"\\nAn error occurred during agent execution: {e}\")\n        print(\"Please ensure your Ollama server is running, the model is pulled, and the base_url is correct.\")\n\nif __name__ == \"__main__\":\n    asyncio.run(main())\n","lang":"python","description":"This quickstart demonstrates how to set up an `OllamaAdapter` and integrate it with an `Agent` from the `agent-framework`. It assumes an Ollama server is running locally (or at `OLLAMA_BASE_URL`) and the specified model (default: `llama3`) is available. The interaction flow is logged to the console via `pre_process_messages` and `post_process_messages` hooks, providing visibility into the agent's communication."},"warnings":[{"fix":"Regularly check the official GitHub repository's `README.md` and release notes for breaking changes upon upgrade. Pin your dependency versions to specific beta releases (e.g., `agent-framework-ollama==1.0.0b260409`) if stability is critical.","message":"As a beta package, the API of agent-framework-ollama is subject to frequent changes without strict adherence to semantic versioning. Methods, classes, and configurations may be renamed, altered, or removed in minor or even patch versions.","severity":"breaking","affected_versions":"All beta versions (1.0.0b*)"},{"fix":"Verify that your Ollama server is active and accessible at the `base_url` configured (default: `http://localhost:11434`). Ensure the specified `model` (e.g., `llama3`) has been pulled using `ollama pull <model_name>` via the Ollama CLI.","message":"The functionality of agent-framework-ollama relies on a running Ollama server instance and pre-downloaded models. 
Network connectivity issues, an incorrect `base_url`, or missing models are common causes of runtime errors.","severity":"gotcha","affected_versions":"All versions"},{"fix":"When upgrading, upgrade `agent-framework` and `agent-framework-ollama` together. Consult the `agent-framework` repository and release notes for any specific dependency requirements or version compatibility matrices.","message":"This library is tightly coupled with `agent-framework`, which is also in beta. Errors can arise when the installed versions of `agent-framework-ollama` and `agent-framework` are mismatched, especially if `agent-framework` itself introduces breaking changes or requires specific underlying dependencies.","severity":"gotcha","affected_versions":"All beta versions"}],"env_vars":null,"last_verified":"2026-04-16T00:00:00.000Z","next_check":"2026-07-15T00:00:00.000Z","problems":[{"fix":"Install the package using pip: `pip install agent-framework-ollama`","cause":"The 'agent-framework-ollama' package has not been installed in your Python environment.","error":"ModuleNotFoundError: No module named 'agent_framework_ollama'"},{"fix":"Start your Ollama server (e.g., `ollama serve`). Verify the `base_url` parameter in `OllamaAdapter` (or `OLLAMA_BASE_URL` env var) matches your Ollama instance, and check for any firewall restrictions.","cause":"The Ollama server is not running, is not accessible at the specified `base_url`, or is blocked by a firewall.","error":"httpx.ConnectError: Connection refused"},{"fix":"Pull the required model using the Ollama CLI: `ollama pull your-model-name` (replace 'your-model-name' with the actual model you intend to use, e.g., 'llama3'). Ensure the model name in your code matches the pulled model.","cause":"The specified Ollama model is not available on your running Ollama server or its name is misspelled. 
The server returned an error indicating the model needs to be pulled.","error":"ollama.ResponseError: {'error': \"model 'your-model-name' not found, try `ollama pull your-model-name`\"}"}]}