BeeAI platform is an open platform to help you discover, run, and compose AI agents from any framework. This tutorial demonstrates how to integrate BeeAI platform agents with the BeeAI Framework using the `BeeAIPlatformAgent` class.

The BeeAI platform is an open agent platform, while the BeeAI framework is an SDK for developing agents in Python or TypeScript.
Prerequisites

- BeeAI platform installed and running locally
- BeeAI Framework installed with `pip install beeai-framework`
- BeeAI Platform extension installed with `pip install 'beeai-framework[beeai-platform]'`
- Project setup:
  - Create the project directory: `mkdir beeai-remote-agent && cd beeai-remote-agent`
  - Set up a Python virtual environment: `python -m venv venv && source venv/bin/activate`
  - Create the agent module: `mkdir my_agents && touch my_agents/remote_agent.py`
The `BeeAIPlatformAgent` class allows you to connect to any agent hosted on the BeeAI platform. This means you can interact with agents built in any framework!

Use `BeeAIPlatformAgent` when:

- You're connecting specifically to the BeeAI Platform services.
- You want forward compatibility with the BeeAI Platform, no matter which protocol it is based on.

Here's a simple example that uses the built-in `chat` agent:
```python
import asyncio
import sys
import traceback

from beeai_framework.adapters.beeai_platform.agents import BeeAIPlatformAgent
from beeai_framework.errors import FrameworkError
from beeai_framework.memory.unconstrained_memory import UnconstrainedMemory
from examples.helpers.io import ConsoleReader


async def main() -> None:
    reader = ConsoleReader()

    agents = await BeeAIPlatformAgent.from_platform(url="http://127.0.0.1:8333", memory=UnconstrainedMemory())
    agent_name = "Granite chat agent"
    try:
        agent = next(agent for agent in agents if agent.name == agent_name)
    except StopIteration:
        raise ValueError(f"Agent with name `{agent_name}` not found") from None

    for prompt in reader:
        # Run the agent and observe events
        response = await agent.run(prompt).on(
            "update",
            lambda data, event: (reader.write("Agent 🤖 (debug) : ", data)),
        )
        reader.write("Agent 🤖 : ", response.last_message.text)


if __name__ == "__main__":
    try:
        asyncio.run(main())
    except FrameworkError as e:
        traceback.print_exc()
        sys.exit(e.explain())
```
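The chained `agent.run(prompt).on("update", ...)` call above attaches an event handler before the run is awaited. To make that pattern concrete, here is a stdlib-only sketch of the idea; `Run` and `fake_agent_logic` are hypothetical stand-ins for illustration, not the framework's actual implementation:

```python
import asyncio


class Run:
    """Hypothetical sketch: an awaitable run that emits events to handlers."""

    def __init__(self, logic):
        self._logic = logic              # async function that receives this Run
        self._handlers: dict[str, list] = {}

    def on(self, event, handler):
        self._handlers.setdefault(event, []).append(handler)
        return self                      # enables `agent.run(...).on(...)` chaining

    def emit(self, event, data):
        for handler in self._handlers.get(event, []):
            handler(data, event)

    def __await__(self):
        # Awaiting the Run drives the underlying logic coroutine.
        return self._logic(self).__await__()


async def fake_agent_logic(run: Run) -> str:
    run.emit("update", "looking up an answer...")  # debug event seen by handlers
    return "final answer"


async def main() -> list[str]:
    events: list[str] = []
    result = await Run(fake_agent_logic).on("update", lambda data, event: events.append(data))
    return [result, *events]


print(asyncio.run(main()))  # ['final answer', 'looking up an answer...']
```

Because `on()` returns the run itself, handlers can be registered fluently before the single `await` drives the agent and delivers events.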
Usage in Workflow

You can compose multiple BeeAI platform agents into advanced workflows using the BeeAI framework's workflow capabilities. This example demonstrates a research and content creation pipeline:

In this example, the `GPT Researcher` agent researches a topic, and the `Podcast creator` agent takes the research report and produces a podcast transcript.

You can adjust or expand this pattern to orchestrate more complex multi-agent workflows.
```python
import asyncio
import sys
import traceback

from pydantic import BaseModel

from beeai_framework.adapters.beeai_platform import BeeAIPlatformAgent
from beeai_framework.errors import FrameworkError
from beeai_framework.memory.unconstrained_memory import UnconstrainedMemory
from beeai_framework.workflows import Workflow
from examples.helpers.io import ConsoleReader


async def main() -> None:
    reader = ConsoleReader()

    class State(BaseModel):
        topic: str
        research: str | None = None
        output: str | None = None

    agents = await BeeAIPlatformAgent.from_platform(url="http://127.0.0.1:8333", memory=UnconstrainedMemory())

    async def research(state: State) -> None:
        try:
            research_agent = next(agent for agent in agents if agent.name == "GPT Researcher")
        except StopIteration:
            raise ValueError("Agent 'GPT Researcher' not found") from None

        # Run the agent and observe events
        response = await research_agent.run(state.topic).on(
            "update",
            lambda data, _: (reader.write("Agent 🤖 (debug) : ", data)),
        )
        state.research = response.last_message.text

    async def podcast(state: State) -> None:
        try:
            podcast_agent = next(agent for agent in agents if agent.name == "Podcast creator")
        except StopIteration:
            raise ValueError("Agent 'Podcast creator' not found") from None

        # Run the agent and observe events
        response = await podcast_agent.run(state.research or "").on(
            "update",
            lambda data, _: (reader.write("Agent 🤖 (debug) : ", data)),
        )
        state.output = response.last_message.text

    # Define the structure of the workflow graph
    workflow = Workflow(State)
    workflow.add_step("research", research)
    workflow.add_step("podcast", podcast)

    # Execute the workflow
    result = await workflow.run(State(topic="Connemara"))

    print("\n*********************")
    print("Topic: ", result.state.topic)
    print("Research: ", result.state.research)
    print("Output: ", result.state.output)


if __name__ == "__main__":
    try:
        asyncio.run(main())
    except FrameworkError as e:
        traceback.print_exc()
        sys.exit(e.explain())
```
Source: python/examples/workflows/remote.py
Usage as a Server

The `BeeAIPlatformServer` class exposes agents built with the BeeAI Framework as an ACP server. The server automatically registers with the platform, allowing you to access and use the agents directly from the platform.

Note: The BeeAI platform supports only one agent per server.
```python
from beeai_framework.adapters.beeai_platform.backend.chat import BeeAIPlatformChatModel
from beeai_framework.adapters.beeai_platform.serve.server import BeeAIPlatformMemoryManager, BeeAIPlatformServer
from beeai_framework.agents.requirement import RequirementAgent
from beeai_framework.backend import ChatModelParameters
from beeai_framework.memory import UnconstrainedMemory
from beeai_framework.middleware.trajectory import GlobalTrajectoryMiddleware
from beeai_framework.tools.search.duckduckgo import DuckDuckGoSearchTool
from beeai_framework.tools.weather import OpenMeteoTool

try:
    from beeai_sdk.a2a.extensions.ui.agent_detail import AgentDetail
except ModuleNotFoundError as e:
    raise ModuleNotFoundError(
        "Optional module [beeai-platform] not found.\nRun 'pip install \"beeai-framework[beeai-platform]\"' to install."
    ) from e


def main() -> None:
    agent = RequirementAgent(
        llm=BeeAIPlatformChatModel(
            preferred_models=["openai:gpt-4o", "ollama:llama3.1:8b"], parameters=ChatModelParameters(stream=True)
        ),
        tools=[DuckDuckGoSearchTool(), OpenMeteoTool()],
        memory=UnconstrainedMemory(),
        middlewares=[GlobalTrajectoryMiddleware()],
    )

    # Runs an HTTP server that registers to the BeeAI platform
    server = BeeAIPlatformServer(config={"configure_telemetry": False}, memory_manager=BeeAIPlatformMemoryManager())
    server.register(
        agent,
        name="Framework chat agent",
        description="Simple chat agent",  # (optional)
        detail=AgentDetail(interaction_mode="multi-turn"),  # default is multi-turn (optional)
    )
    server.serve()


if __name__ == "__main__":
    main()

# run: beeai agent run chat_agent
```