BeeAI Framework
The BeeAI Framework is a TypeScript-first library (also available in Python) for building production-ready multi-agent systems on top of large language models (LLMs). Its modular architecture covers core agent abstractions, backend functionality for interacting with AI models (chat, embedding, tool calling), a flexible prompt-templating system, several memory types, and a robust tool ecosystem. The current stable TypeScript release is `0.1.28`, and the project maintains a rapid release cadence across both the TypeScript and Python branches, with frequent minor versions that incorporate security fixes and dependency updates to keep pace with the fast-evolving AI landscape.

Key differentiators include an explicit focus on production readiness, extensive modularity for customizable agent workflows, and broad support for third-party AI SDKs (such as OpenAI, Anthropic, and Google Vertex via `@ai-sdk/*`), vector databases (like Qdrant and Milvus), and other LLM-related tools, as reflected in its comprehensive list of peer dependencies. The project is currently in Beta status: APIs may still evolve, but active development brings ongoing feature enhancements and stability improvements.
Common errors
- `TypeError: chatModel is not a constructor`
  - Cause: `ChatModel` (or a similar core class) was used without a proper ES Module import, or with incorrect CommonJS syntax.
  - Fix: Use `import { ChatModel } from 'beeai-framework';` and make sure your `tsconfig.json` (for TypeScript) or `package.json` (for Node.js) is configured for ES Modules (`"type": "module"`).
- `Error: Missing API key for OpenAI. Please set the OPENAI_API_KEY environment variable.`
  - Cause: An LLM provider (e.g., OpenAI, Anthropic, Groq) was initialized without the necessary API key configured in the environment variables.
  - Fix: Set the required API key (e.g., `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, `GROQ_API_KEY`) in your `.env` file or directly in your deployment environment.
- `npm ERR! ERESOLVE unable to resolve dependency tree`
  - Cause: Peer dependency versions conflict between `beeai-framework` and other packages in your project, or between different versions of its own peer dependencies.
  - Fix: Inspect the conflicting packages and their required version ranges, then update all related packages to their latest compatible versions or install the peer dependencies explicitly. As a temporary workaround, use `npm install --legacy-peer-deps`.
- `Error: Tool 'calculator' failed to execute: [...] is not a function`
  - Cause: The `execute` method of a custom `Tool` class was implemented incorrectly: it returns a non-promise where a promise is expected, or its arguments do not match the declared parameters.
  - Fix: Double-check the `execute` method in your custom `Tool` implementation. Ensure it is `async` and always returns a `Promise`. Verify that the `parameters` definition matches the arguments `execute` expects.
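The last error above usually comes down to the `execute` contract: the method must be `async` (so it always returns a `Promise`), and the argument shape it reads must match the declared `parameters` schema exactly. A minimal self-contained sketch of that contract (the `ToolBase` class here is a hypothetical stand-in for the framework's `Tool`, used only to illustrate the shape):

```typescript
// Hypothetical stand-in for the framework's Tool base class,
// used only to illustrate the execute() contract.
abstract class ToolBase {
  constructor(
    public readonly name: string,
    public readonly parameters: object // JSON-schema-style parameter description
  ) {}
  // execute must be async: the agent loop awaits its Promise.
  abstract execute(params: Record<string, unknown>): Promise<unknown>;
}

class CalculatorTool extends ToolBase {
  constructor() {
    super("calculator", {
      type: "object",
      properties: {
        a: { type: "number" },
        b: { type: "number" },
      },
      required: ["a", "b"], // must list exactly the arguments execute() reads
    });
  }

  // async => always returns a Promise, never a bare value or undefined
  async execute(params: { a: number; b: number }): Promise<{ result: number }> {
    return { result: params.a + params.b };
  }
}
```

If `execute` reads a parameter that is not declared in `parameters` (or vice versa), the model may call the tool with arguments the method does not expect, which surfaces as the "is not a function" / argument-mismatch failure.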
Warnings
- gotcha: The BeeAI Framework is currently in Beta status. APIs may change, and breaking changes can land in minor versions as the project evolves toward a stable `1.0` release.
- breaking: The framework migrated its backend to the Vercel AI SDK v6, which significantly changed how AI model interactions and configurations are handled.
- gotcha: The framework declares a large number of peer dependencies for LLM providers, vector stores, and utility libraries. Mismatched versions of these peer dependencies can cause runtime errors or unexpected behavior.
- gotcha: Frequent updates often include fixes for security vulnerabilities (CVEs) in underlying dependencies. While beneficial, this rapid pace means older versions may contain unpatched security issues.
Install
- `npm install beeai-framework`
- `yarn add beeai-framework`
- `pnpm add beeai-framework`
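Since the providers listed under Common errors read their credentials from environment variables, a quick post-install sanity check that the keys you need are actually set can save a confusing startup failure (the variable names below are the conventional ones; adjust for your provider). A small self-contained helper, not part of the framework itself:

```typescript
// Returns the subset of required environment variable names that are unset or empty.
function missingEnvKeys(required: string[]): string[] {
  return required.filter((name) => {
    const value = process.env[name];
    return value === undefined || value.trim() === "";
  });
}

// Check whichever providers your agents will use.
const missing = missingEnvKeys(["OPENAI_API_KEY"]);
if (missing.length > 0) {
  console.warn(`Missing API keys: ${missing.join(", ")}. Set them in .env before running agents.`);
}
```

Run this once at process startup (before constructing any models) so a missing key fails fast with a clear message rather than deep inside a provider call.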
Imports
- Agent
  - `const Agent = require('beeai-framework').Agent`
  - `import { Agent } from 'beeai-framework'`
- ChatModel
  - `import ChatModel from 'beeai-framework/backend'`
  - `import { ChatModel } from 'beeai-framework'`
- Tool
  - `const { Tool } = require('beeai-framework')`
  - `import { Tool } from 'beeai-framework'`
- Message
  - `import { Message } from 'beeai-framework'`
  - `import { MessageType } from 'beeai-framework'`
Quickstart
```typescript
import "dotenv/config";
import { Agent, ChatModel, Tool, Message } from 'beeai-framework';

// Define a simple tool that the agent can use
class MyCalculatorTool extends Tool {
  constructor() {
    super({
      name: "calculator",
      description: "A simple calculator tool that can add two numbers.",
      parameters: {
        type: "object",
        properties: {
          a: { type: "number", description: "The first number" },
          b: { type: "number", description: "The second number" }
        },
        required: ["a", "b"]
      }
    });
  }

  async execute(params: { a: number; b: number }): Promise<any> {
    return { result: params.a + params.b };
  }
}

async function runAgent() {
  // Initialize a chat model (e.g., using OpenAI via the @ai-sdk/openai peer dependency).
  // Ensure OPENAI_API_KEY is set in your .env file.
  const chatModel = new ChatModel({
    model: process.env.OPENAI_MODEL ?? 'gpt-4o',
    provider: 'openai' // depends on the specific @ai-sdk integration
  });

  // Instantiate the agent with the chat model and the available tools
  const agent = new Agent({
    model: chatModel,
    tools: [new MyCalculatorTool()] // add our custom tool
    // other agent configuration: memory, system prompts, etc.
  });

  console.log("Agent initialized. Asking it to perform a calculation...");

  // Interact with the agent
  const response = await agent.run([
    { role: "user", content: "What is 5 plus 3?" }
  ]);

  console.log("Agent's response:", response.messages[response.messages.length - 1].content);
}

runAgent().catch(console.error);
```