AG2: Open-Source AgentOS for AI Agents
AG2 (formerly AutoGen) is an open-source programming framework for building AI agents and facilitating cooperation among multiple agents to solve tasks. It provides fundamental building blocks to create, deploy, and manage AI agents, supporting various LLMs, tool use, autonomous and human-in-the-loop workflows, and multi-agent conversation patterns. The current version is 0.11.5, and the project maintains a rapid release cadence with frequent updates and new features.
Warnings
- breaking AG2 is transitioning from its original framework (`autogen.agentchat`) to a redesigned beta framework (`autogen.beta`), which will become the official v1.0. Expect deprecations and architectural changes in the upcoming minor versions (v0.12, v0.13, v0.14) before the beta stabilizes at v1.0. Users should plan for migration.
- deprecated The `GPTAssistantAgent` class is deprecated as of v0.12 and will be removed in v0.14. Similarly, the `Swarm` orchestration pattern (and related functions such as `initiate_swarm_chat()`) has been deprecated since v0.9 in favor of the new `GroupChat` pattern.
- gotcha LLM provider packages are not installed by default with `pip install ag2`. Users must explicitly install them as extras (e.g., `pip install "ag2[openai]"`) for their chosen LLM provider to function.
- gotcha The `LLMConfig` object uses `deepcopy` internally to prevent unintended modifications. If `llm_config` contains custom objects that cannot be deep-copied and do not implement a `__deepcopy__` method, the copy will raise a `TypeError`.
- security As of v0.11.4, `ShellExecutor` now uses `shell=False` with `shlex.split` to prevent shell command injection vulnerabilities. Previously, users might have inadvertently created insecure execution environments.
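The deepcopy gotcha above is reproducible with the standard library alone. This sketch does not touch AG2 at all; it only illustrates the class of object (anything unpicklable that lacks a `__deepcopy__`, such as a `threading.Lock`) that would trip an internal `deepcopy`. The `CustomClient` class is a hypothetical stand-in, not an AG2 type:

```python
import copy
import threading

class CustomClient:
    """Hypothetical stand-in for a custom object stashed in llm_config."""
    def __init__(self):
        # Locks cannot be pickled and define no __deepcopy__,
        # so deep-copying any object holding one fails.
        self._lock = threading.Lock()

client = CustomClient()
try:
    copy.deepcopy(client)  # the kind of defensive copy LLMConfig performs
except TypeError as err:
    print(f"TypeError: {err}")
```

A common workaround is to give such objects a `__deepcopy__` method (e.g. one that returns `self` for shared, stateful resources) before placing them in `llm_config`.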
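The security fix above rests on standard-library behavior that is easy to verify in isolation. This hedged sketch uses only `shlex` (no AG2 APIs) to show why tokenizing with `shlex.split` and executing with `shell=False` defuses shell metacharacters:

```python
import shlex

# Under shell=True this string would run TWO commands:
# `echo hi` followed by `rm -rf /tmp/x`.
malicious = 'echo hi; rm -rf /tmp/x'

# shlex.split tokenizes without interpreting shell metacharacters:
argv = shlex.split(malicious)
print(argv)  # ['echo', 'hi;', 'rm', '-rf', '/tmp/x']
```

Passed to `subprocess.run(argv, shell=False)`, the tokens `hi;`, `rm`, `-rf`, and `/tmp/x` become literal arguments to `echo`; no shell ever parses the `;`, so no second command is spawned.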
Install
- pip install "ag2[openai]"
- pip install "ag2[gemini,anthropic]"
- pip install ag2
Imports
- ConversableAgent
from autogen import ConversableAgent
- LLMConfig
from autogen import LLMConfig
- GroupChat
from autogen import GroupChat
- GroupChatManager
from autogen import GroupChatManager
- Agent (beta)
from autogen.beta import Agent
Quickstart
import os
from autogen import ConversableAgent, LLMConfig

# Ensure your OpenAI API key is set as an environment variable,
# for example: export OPENAI_API_KEY="YOUR_API_KEY"
openai_api_key = os.environ.get("OPENAI_API_KEY", "")
if not openai_api_key:
    raise SystemExit("Error: OPENAI_API_KEY environment variable is not set.")

llm_config = LLMConfig(
    api_type="openai",
    model="gpt-5-nano",
    api_key=openai_api_key,
)

# Create a poetic AI assistant
my_agent = ConversableAgent(
    name="helpful_agent",
    system_message="You are a poetic AI assistant, respond in rhyme.",
    llm_config=llm_config,
)

# Run the agent with a prompt
response = my_agent.run(
    message="In one sentence, what's the big deal about AI?",
    max_turns=1,  # limit turns for a quick, non-interactive example
    user_input=False,  # disable human input for automatic execution
)

# Print the agent's final response
if response.chat_history:
    print(response.chat_history[-1]["content"])
else:
    print("No response generated.")