{"id":10567,"library":"beeai-framework","title":"BeeAI Framework","description":"The BeeAI Framework is a TypeScript-first library (also available in Python) designed for building production-ready multi-agent systems leveraging large language models (LLMs). It provides a modular architecture encompassing core components like agents, backend functionalities for interacting with various AI models (chat, embedding, tool calling), a flexible prompt templating system, diverse memory types, and a robust tool ecosystem. The current stable TypeScript version is `0.1.28`. The project maintains a rapid release cadence, with frequent minor version updates across both TypeScript and Python branches, often incorporating security fixes and dependency updates to keep pace with the fast-evolving AI landscape. Key differentiators include its explicit focus on production readiness, extensive modularity for customizable agent workflows, and broad support for numerous third-party AI SDKs (such as OpenAI, Anthropic, Google Vertex via `@ai-sdk/*`), vector databases (like Qdrant, Milvus), and other LLM-related tools, as evidenced by its comprehensive list of peer dependencies. 
The project is currently in 'Beta' status, indicating active development and potential for API evolution, but also a commitment to ongoing feature enhancement and stability improvements.","status":"active","version":"0.1.28","language":"javascript","source_language":"en","source_url":"https://github.com/i-am-bee/beeai-framework","tags":["javascript","BeeAI Framework","LLM Agent Framework","Bee Agent Framework","NodeJS Agent Framework","typescript"],"install":[{"cmd":"npm install beeai-framework","lang":"bash","label":"npm"},{"cmd":"yarn add beeai-framework","lang":"bash","label":"yarn"},{"cmd":"pnpm add beeai-framework","lang":"bash","label":"pnpm"}],"dependencies":[{"reason":"Integration with Agent-to-Agent communication SDK.","package":"@a2a-js/sdk","optional":true},{"reason":"Provider for Amazon Bedrock LLM integration.","package":"@ai-sdk/amazon-bedrock","optional":true},{"reason":"Provider for Anthropic LLM integration.","package":"@ai-sdk/anthropic","optional":true},{"reason":"Provider for Azure OpenAI LLM integration.","package":"@ai-sdk/azure","optional":true},{"reason":"Provider for Google Vertex AI LLM integration.","package":"@ai-sdk/google-vertex","optional":true},{"reason":"Provider for Groq LLM integration.","package":"@ai-sdk/groq","optional":true},{"reason":"Provider for OpenAI LLM integration.","package":"@ai-sdk/openai","optional":true},{"reason":"AWS SDK client for Bedrock runtime interactions.","package":"@aws-sdk/client-bedrock-runtime","optional":true},{"reason":"Client for Elasticsearch interactions, likely for vector stores or search.","package":"@elastic/elasticsearch","optional":true},{"reason":"Google Custom Search API client, for tool integrations.","package":"@googleapis/customsearch","optional":true},{"reason":"Community integrations for LangChain, offering additional tools/components.","package":"@langchain/community","optional":true},{"reason":"Core LangChain functionalities, potentially for compatibility or advanced 
features.","package":"@langchain/core","optional":true},{"reason":"SDK for Model Context Protocol integration.","package":"@modelcontextprotocol/sdk","optional":true},{"reason":"Client for Qdrant vector database integration.","package":"@qdrant/js-client-rest","optional":true},{"reason":"Client for Milvus vector database integration.","package":"@zilliz/milvus2-sdk-node","optional":true},{"reason":"Web framework, likely for building API backends that host agents.","package":"express","optional":true},{"reason":"Provider for Ollama local LLM integration.","package":"ollama-ai-provider-v2","optional":true},{"reason":"ORM for database interactions, potentially for memory or state persistence.","package":"sequelize","optional":true},{"reason":"YAML parsing/serialization, possibly for configuration or prompt templates.","package":"yaml","optional":true}],"imports":[{"note":"The framework primarily uses ES Modules. CommonJS require() syntax is not supported for core modules.","wrong":"const Agent = require('beeai-framework').Agent","symbol":"Agent","correct":"import { Agent } from 'beeai-framework'"},{"note":"Most core components are named exports directly from the main package. Specific sub-paths are generally not needed for common symbols.","wrong":"import ChatModel from 'beeai-framework/backend'","symbol":"ChatModel","correct":"import { ChatModel } from 'beeai-framework'"},{"note":"As a TypeScript-first library, it's designed for static analysis and ESM usage. 
Ensure your project is configured for ES Modules.","wrong":"const { Tool } = require('beeai-framework')","symbol":"Tool","correct":"import { Tool } from 'beeai-framework'"},{"note":"Use `Message`, not `MessageType`, as the interface for chat messages.","wrong":"import { MessageType } from 'beeai-framework'","symbol":"Message","correct":"import { Message } from 'beeai-framework'"}],"quickstart":{"code":"import \"dotenv/config\";\nimport { Agent, ChatModel, Tool, Message } from 'beeai-framework';\n\n// Define a simple tool that the agent can use.\n// NOTE: this is an illustrative sketch -- verify the exact Tool subclassing API\n// (schema format, method names) against the docs for your installed version.\nclass MyCalculatorTool extends Tool {\n  constructor() {\n    super({\n      name: \"calculator\",\n      description: \"A simple calculator tool that can add two numbers.\",\n      parameters: {\n        type: \"object\",\n        properties: {\n          a: { type: \"number\", description: \"The first number\" },\n          b: { type: \"number\", description: \"The second number\" }\n        },\n        required: [\"a\", \"b\"]\n      }\n    });\n  }\n\n  async execute(params: { a: number; b: number }): Promise<any> {\n    return { result: params.a + params.b };\n  }\n}\n\nasync function runAgent() {\n  // Initialize a chat model (e.g., using OpenAI via the @ai-sdk/openai peer dep).\n  // Ensure OPENAI_API_KEY is set in your .env file.\n  const chatModel = new ChatModel({\n    model: process.env.OPENAI_MODEL ?? 'gpt-4o',\n    provider: 'openai' // This would depend on the specific @ai-sdk integration\n  });\n\n  // Instantiate the agent with the chat model and available tools\n  const agent = new Agent({\n    model: chatModel,\n    tools: [new MyCalculatorTool()] // Add our custom tool\n    // Other agent configuration such as memory, system prompts, etc.\n  });\n\n  console.log(\"Agent initialized. 
Asking it to perform a calculation...\");\n\n  // Interact with the agent\n  const response = await agent.run([\n    { role: \"user\", content: \"What is 5 plus 3?\" }\n  ]);\n\n  console.log(\"Agent's response:\", response.messages[response.messages.length - 1].content);\n}\n\nrunAgent().catch(console.error);\n","lang":"typescript","description":"Demonstrates initializing an agent with a chat model and a custom tool, then running a simple calculation query. The constructor signatures shown are illustrative; verify them against the documentation for your installed version."},"warnings":[{"fix":"Review changelogs carefully for each update, especially for minor version bumps. Consider pinning exact versions in production environments or using Dependabot for automated checks.","message":"The BeeAI Framework is currently in 'Beta' status. This indicates that APIs may change, and breaking changes might occur in minor versions as the project evolves towards a stable `1.0` release.","severity":"gotcha","affected_versions":">=0.1.0"},{"fix":"If upgrading to `0.1.27` or later, review your `ChatModel` and other backend-related configurations to align with the Vercel AI SDK v6 API. Consult the relevant `@ai-sdk/*` package documentation for specific provider changes.","message":"The framework migrated its backend to Vercel AI SDK v6, which introduced significant changes to how AI model interactions and configurations are handled.","severity":"breaking","affected_versions":">=0.1.27"},{"fix":"Ensure all peer dependencies are explicitly installed in your project and their versions satisfy the ranges specified by `beeai-framework`. Use `npm install --legacy-peer-deps` as a temporary workaround if resolution fails (note that Yarn Classic only warns about unmet peer dependencies), and carefully manage your dependency tree.","message":"The framework has a large number of peer dependencies for various LLM providers, vector stores, and utility libraries. 
Mismatched versions of these peer dependencies can lead to runtime errors or unexpected behavior.","severity":"gotcha","affected_versions":">=0.1.0"},{"fix":"Always update to the latest available minor version to ensure you benefit from the most recent security patches. Regularly scan your project dependencies for vulnerabilities.","message":"Frequent updates often include fixes for security vulnerabilities (`CVEs`) in underlying dependencies. While beneficial, this rapid pace means older versions might contain unpatched security issues.","severity":"gotcha","affected_versions":"<0.1.28 (TypeScript), <0.1.79 (Python)"}],"env_vars":null,"last_verified":"2026-04-19T00:00:00.000Z","next_check":"2026-07-18T00:00:00.000Z","problems":[{"fix":"Ensure you are using `import { ChatModel } from 'beeai-framework';` and that your `tsconfig.json` (for TypeScript) or `package.json` (for Node.js) is configured for ES Modules (`\"type\": \"module\"`).","cause":"Attempting to use `ChatModel` (or similar core classes) without proper ES Module import or with incorrect CommonJS syntax.","error":"TypeError: chatModel is not a constructor"},{"fix":"Set the required API key (e.g., `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, `GROQ_API_KEY`) as an environment variable in your `.env` file or directly in your deployment environment.","cause":"An LLM provider (e.g., OpenAI, Anthropic, Groq) was initialized without the necessary API key configured in the environment variables.","error":"Error: Missing API key for OpenAI. Please set the OPENAI_API_KEY environment variable."},{"fix":"Manually inspect the conflicting packages and their required versions. Try to update all related packages to their latest compatible versions or explicitly install peer dependencies to resolve the conflict. 
Use `npm install --legacy-peer-deps` as a temporary workaround if no direct resolution is found.","cause":"Conflict in peer dependency versions between `beeai-framework` and other packages in your project, or between different versions of its own peer dependencies.","error":"npm ERR! ERESOLVE unable to resolve dependency tree"},{"fix":"Double-check the `execute` method in your custom `Tool` implementation. Ensure it is `async` and always returns a `Promise<any>`. Verify that the `parameters` definition matches the arguments expected by `execute`.","cause":"The `execute` method of a custom `Tool` class was not correctly implemented or is returning a non-promise where a promise is expected, or an argument mismatch occurred.","error":"Error: Tool 'calculator' failed to execute: [...] is not a function"}],"ecosystem":"npm"}