HONEYSTAX TERMINAL v1.0

beeai-framework

Agent

Build production-ready AI agents in both Python and TypeScript.

Copy the install, test the workflow, then decide if it earns a permanent slot.

3,229
Why now: Moving now

Fresh repo activity plus visible builder pull. This is the kind of tool people test before it turns obvious.

Decision: High-conviction move


Trial cost: Medium lift

Reasonable to try, but it will take more than a quick skim to get real signal.

Risk: 14/100

GitHub health 87/100; no security policy. Fresh repo activity and a manageable issue load keep the risk controlled.

What You Are Adopting

AI Agent: Universal
Model: Multiple
Build Time: Minutes

Test This In Your Stack

One command in · Clean rollback · Low commitment
Sandboxed: Installs to ~/.claude, isolated from your projects. One command to remove.

Fastest way to find out if beeai-framework belongs in your setup.

Copy the install command, run a real test, and back it out cleanly if it slows you down.

Try now
git clone https://github.com/i-am-bee/beeai-framework ~/.claude/agents/beeai-framework

Run this first. You will know quickly if the workflow earns a permanent slot.

Back out
rm -rf ~/.claude/agents/beeai-framework

No messy cleanup loop. If it misses, remove it and keep moving.

Install Location

~/
└─ .claude/
    ├─ commands/
    ├─ agents/
    │   └─ beeai-framework/ ← installs here
    └─ settings.json

About

Build production-ready AI agents in both Python and TypeScript. An open-source agent for the AI coding ecosystem.

README

BeeAI Framework

Build production-ready multi-agent systems in Python or TypeScript.

Documentation · Python library · TypeScript library · Apache 2.0 · Join our Discord · LF AI & Data · Follow on Bluesky

Latest updates

Date Language Update Description
2025/08/25 Python 🚀 ACP is now part of A2A under the Linux Foundation! 👉 Learn more
2025/06/03 Python Release experimental Requirement Agent.
2025/05/15 Python New protocol integrations: ACP and MCP.
2025/02/19 Python Launched Python library alpha. See getting started guide.
2025/02/07 TypeScript Introduced Backend module to simplify working with AI services (chat, embedding).
2025/01/28 TypeScript Added support for DeepSeek R1, check out the Competitive Analysis Workflow example.
2025/01/09 TypeScript Introduced Workflows, a way of building multi-agent systems. Added support for Model Context Protocol.
2024/12/09 TypeScript Added support for LLaMa 3.3. See multi-agent workflow example using watsonx or explore other available providers.
2024/11/21 TypeScript Added an experimental Streamlit agent.

For a full changelog, see our releases page.


What is BeeAI Framework?

BeeAI Framework is a comprehensive toolkit for building intelligent, autonomous agents and multi-agent systems. It provides everything you need to create agents that can reason, take actions, and collaborate to solve complex problems.
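To make the "collaborate" part concrete, the handoff idea can be sketched in plain Python: a main agent delegates each question to a matching specialist. The function names and the keyword routing below are illustrative only, not the framework's API (in BeeAI the LLM chooses a HandoffTool rather than matching keywords).

```python
# Illustrative sketch of the handoff pattern, not the BeeAI API:
# a "main agent" delegates sub-questions to specialist callables.

def knowledge_agent(question: str) -> str:
    return f"[knowledge] answering: {question}"

def weather_agent(question: str) -> str:
    return f"[weather] forecast for: {question}"

SPECIALISTS = {"weather": weather_agent, "knowledge": knowledge_agent}

def main_agent(question: str) -> str:
    # Naive routing: pick a specialist by keyword. A real agent
    # would let the LLM decide which handoff tool to invoke.
    name = "weather" if "weather" in question.lower() else "knowledge"
    return SPECIALISTS[name](question)

print(main_agent("What is the weather in Rome?"))
```

The framework's RequirementAgent plus HandoffTool (shown in the Quickstart below) plays the same role, with the routing decision made by the model instead of hard-coded rules.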

Tip

Get started quickly with the beeai-framework-py-starter [Python] or beeai-framework-ts-starter [TypeScript] template.

Key Features

Feature Description
🤖 Requirement Agent Create predictable, controlled behavior across different LLMs by setting rules the agent must follow.
🤖 Agents Create intelligent agents that can reason, act, and adapt
🔌 Backend Connect to any LLM provider with unified interfaces
🔧 Tools Extend agents with built-in tools (web search, weather, code execution, and more) or custom tools
🔍 RAG Build retrieval-augmented generation systems with vector stores and document processing
📝 Templates Build dynamic prompts with enhanced Mustache syntax
🧠 Memory Manage conversation history with built-in memory strategies
📊 Observability Monitor agent behavior with events, logging, and robust error handling
🚀 Serve Host agents in servers with support for multiple protocols such as A2A and MCP
💾 Cache Optimize performance and reduce costs with intelligent caching
💿 Serialization Save and load agent state for persistence across sessions
🔄 Workflows Orchestrate multi-agent systems with complex execution flows
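As one concrete illustration of the Memory row above, a sliding-window strategy can be sketched in a few lines of plain Python. This is an illustrative strategy only, not the framework's Memory implementation or class names:

```python
# Illustrative sliding-window conversation memory: keep only the
# most recent messages, dropping the oldest once the window is full.
from collections import deque

class SlidingWindowMemory:
    def __init__(self, max_messages: int = 4) -> None:
        self.messages: deque = deque(maxlen=max_messages)

    def add(self, role: str, text: str) -> None:
        self.messages.append((role, text))

    def history(self) -> list[tuple[str, str]]:
        return list(self.messages)

mem = SlidingWindowMemory(max_messages=2)
mem.add("user", "hi")
mem.add("assistant", "hello")
mem.add("user", "weather?")
print(mem.history())  # the oldest message has been dropped
```

Other common strategies (summarization, token-budget trimming) follow the same shape: bound what is replayed to the model on each turn.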

Quickstart

Installation

To install the Python library:

pip install beeai-framework

To install the TypeScript library:

npm install beeai-framework

Multi-Agent Example

import asyncio

from beeai_framework.agents.requirement import RequirementAgent
from beeai_framework.agents.requirement.requirements.conditional import ConditionalRequirement
from beeai_framework.backend import ChatModel
from beeai_framework.errors import FrameworkError
from beeai_framework.middleware.trajectory import GlobalTrajectoryMiddleware
from beeai_framework.tools import Tool
from beeai_framework.tools.handoff import HandoffTool
from beeai_framework.tools.search.wikipedia import WikipediaTool
from beeai_framework.tools.think import ThinkTool
from beeai_framework.tools.weather import OpenMeteoTool


async def main() -> None:
    knowledge_agent = RequirementAgent(
        llm=ChatModel.from_name("ollama:granite3.3:8b"),
        tools=[ThinkTool(), WikipediaTool()],
        requirements=[ConditionalRequirement(ThinkTool, force_at_step=1)],
        role="Knowledge Specialist",
        instructions="Provide answers to general questions about the world.",
    )

    weather_agent = RequirementAgent(
        llm=ChatModel.from_name("ollama:granite3.3:8b"),
        tools=[OpenMeteoTool()],
        role="Weather Specialist",
        instructions="Provide weather forecast for a given destination.",
    )

    main_agent = RequirementAgent(
        name="MainAgent",
        llm=ChatModel.from_name("ollama:granite3.3:8b"),
        tools=[
            ThinkTool(),
            HandoffTool(
                knowledge_agent,
                name="KnowledgeLookup",
                description="Consult the Knowledge Agent for general questions.",
            ),
            HandoffTool(
                weather_agent,
                name="WeatherLookup",
                description="Consult the Weather Agent for forecasts.",
            ),
        ],
        requirements=[ConditionalRequirement(ThinkTool, force_at_step=1)],
        # Log all tool calls to the console for easier debugging
        middlewares=[GlobalTrajectoryMiddleware(included=[Tool])],
    )

    question = "If I travel to Rome next weekend, what should I expect in terms of weather, and also tell me one famous historical landmark there?"
    print(f"User: {question}")

    try:
        response = await main_agent.run(question, expected_output="Helpful and clear response.")
        print("Agent:", response.last_message.text)
    except FrameworkError as err:
        print("Error:", err.explain())


if __name__ == "__main__":
    asyncio.run(main())

Source: python/examples/agents/experimental/requirement/handoff.py
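The GlobalTrajectoryMiddleware line in the example logs every tool call for debugging. The general idea can be sketched as a plain-Python wrapper; the names here are hypothetical and this is not the framework's middleware API:

```python
# Illustrative tool-call logging, similar in spirit to trajectory
# middleware: wrap a tool so each invocation and result is printed.
import functools

def log_tool_calls(tool_fn):
    @functools.wraps(tool_fn)
    def wrapper(*args, **kwargs):
        result = tool_fn(*args, **kwargs)
        print(f"{tool_fn.__name__}{args} -> {result!r}")
        return result
    return wrapper

@log_tool_calls
def wikipedia_lookup(topic: str) -> str:
    # Stand-in for a real tool such as WikipediaTool.
    return f"summary of {topic}"

wikipedia_lookup("Rome")
```

In the framework, passing `middlewares=[GlobalTrajectoryMiddleware(included=[Tool])]` achieves this for every tool the agent invokes, without wrapping each one by hand.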

Running the example

Note

To run this example, make sure you have installed Ollama and downloaded the granite3.3:8b model.

To run projects, use:

python [project_name].py

Explore more in our examples for Python and TypeScript.
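The `ConditionalRequirement(ThinkTool, force_at_step=1)` lines in the example force the agent to think before acting. The gist of forcing a tool at a given step can be sketched in plain Python; this is an illustrative mock of the behavior, not how the framework implements requirements:

```python
# Illustrative: force a specific tool to run at a given step,
# mimicking ConditionalRequirement(SomeTool, force_at_step=1).
def run_with_requirement(planned_steps, forced_tool, force_at_step):
    executed = []
    for step, tool in enumerate(planned_steps, start=1):
        if step == force_at_step:
            tool = forced_tool  # the requirement overrides the plan
        executed.append(tool)
    return executed

plan = ["search", "answer"]
print(run_with_requirement(plan, "think", force_at_step=1))
# "think" replaces the planned first step
```

Rules like this are what make RequirementAgent behavior predictable across different LLMs: the model plans freely, but the requirement constrains which tool actually runs.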


Contribution guidelines

BeeAI framework is open-source and we ❤️ contributions.

To help build BeeAI, take a look at our:

  • Python contribution guidelines
  • TypeScript contribution guidelines

Bugs

We use GitHub Issues to manage bugs. Before filing a new issue, please check to make sure it hasn't already been logged. 🙏

Code of conduct

This project and everyone participating in it are governed by the Code of Conduct. By participating, you are expected to uphold this code. Please read the full text so that you know which actions may or may not be tolerated.

Legal notice

All content in these repositories including code has been provided by IBM under the associated open source software license and IBM is under no obligation to provide enhancements, updates, or support. IBM developers produced this code as an open source project (not as an IBM product), and IBM makes no assertions as to the level of quality nor security, and will not be maintaining this code going forward.

Maintainers

For information about maintainers, see MAINTAINERS.md.

Contributors

Special thanks to our contributors for helping us improve BeeAI framework.

Contributors list

Developed by contributors to the BeeAI project, this initiative is part of the Linux Foundation AI & Data program. Its development follows open, collaborative, and community-driven practices.

Tech Stack

Python · TypeScript · Go · Ollama · LLM

Installation

To install the Python library: pip install beeai-framework
To install the TypeScript library: npm install beeai-framework



Active · Last commit 5d ago
8 open issues
Submitted August 23, 2024
