MCPServer allows you to expose your components (Tools, Agents, Chat Models, Runnables) to external systems that support the Model Context Protocol (MCP) standard, enabling seamless integration with LLM tool ecosystems.

Key benefits:
- Fast setup with minimal configuration
- Support for multiple transport options
- Register multiple tools on a single server
- Custom server settings and instructions
```python
from beeai_framework.adapters.mcp.serve.server import MCPServer, MCPServerConfig, MCPSettings
from beeai_framework.tools import tool
from beeai_framework.tools.types import StringToolOutput
from beeai_framework.tools.weather.openmeteo import OpenMeteoTool


@tool
def reverse_tool(word: str) -> StringToolOutput:
    """A tool that reverses a word"""
    return StringToolOutput(result=word[::-1])


def main() -> None:
    """Create an MCP server with custom config, register ReverseTool and OpenMeteoTool to the MCP server and run it."""
    config = MCPServerConfig(transport="streamable-http", settings=MCPSettings(port=8001))  # optional
    server = MCPServer(config=config)
    server.register_many([reverse_tool, OpenMeteoTool()])
    server.serve()


if __name__ == "__main__":
    main()
```
You can also register agents, chat models, or any other runnable.
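For example, registering an agent together with its chat model might look like the sketch below. This is a minimal, hedged example rather than official documentation: the `ReActAgent`, `ChatModel.from_name`, and `UnconstrainedMemory` import paths, the `"ollama:granite3.3:8b"` model identifier, and the default-transport behavior are assumptions that may differ across framework versions and environments.

```python
from beeai_framework.adapters.mcp.serve.server import MCPServer
from beeai_framework.agents.react import ReActAgent
from beeai_framework.backend.chat import ChatModel
from beeai_framework.memory.unconstrained_memory import UnconstrainedMemory
from beeai_framework.tools.weather.openmeteo import OpenMeteoTool


def main() -> None:
    # Assumed model identifier; substitute any provider/model available in your environment.
    llm = ChatModel.from_name("ollama:granite3.3:8b")

    # A simple agent that can answer weather questions using OpenMeteoTool.
    agent = ReActAgent(llm=llm, tools=[OpenMeteoTool()], memory=UnconstrainedMemory())

    # No config passed: the server falls back to its default transport
    # (assumed here; check MCPServerConfig defaults for your version).
    server = MCPServer()
    server.register_many([agent, llm])  # expose both the agent and the chat model
    server.serve()


if __name__ == "__main__":
    main()
```

Registered components are exposed to any MCP-compatible client over the configured transport, in the same way as the tools in the earlier example.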