mcp-framework
Rust MCP framework for building AI agents
claude mcp add --transport stdio koki7o-mcp-framework cargo run \
  --env OPENAI_API_KEY="optional (set in .env if using OpenAI)" \
  --env ANTHROPIC_API_KEY="optional (set in .env if using Claude/Anthropic)"
How to use
The Rust implementation of the MCP Framework provides a production-ready environment for building AI agents that communicate with MCP servers. It includes an MCP client/server architecture, multi-LLM integration (Anthropic/Claude and OpenAI), and a Web Inspector for debugging and testing tools. The server lets you register custom tools that agents can invoke over the MCP protocol, manage conversations, and run agents that orchestrate multiple LLMs and tools in parallel. To get started, install Rust, clone the repository, and run the provided examples to see how a single-tool server, a tool-rich server with an inspector, and browser-automation flows work with both Claude and OpenAI backends.
How to install
Prerequisites:
- Rust toolchain (Rust 1.70+)
- Optional OpenAI or Anthropic API keys for demonstrations
- Install Rust (if not already installed):
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
- Clone the repository and navigate into it:
git clone https://github.com/koki7o/mcp-framework
cd mcp-framework
- Set up environment variables (optional but recommended):
# Copy example env and edit keys as needed
cp .env.example .env
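A filled-in .env might look like the following; both keys are optional, and the values shown are placeholders, not real key formats guaranteed by either provider:

```
# .env — set only the provider(s) you plan to use
ANTHROPIC_API_KEY=your-anthropic-key-here
OPENAI_API_KEY=your-openai-key-here
```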
- Build and run the server (minimal):
# Run the default server (one tool)
cargo run
- Run a server with tools (examples) or with the inspector UI:
# Server with 8 tools and Inspector UI
cargo run --example server_with_tools
- If you want to expose other tools or start from a custom server, follow the server example in the docs: implement ToolHandler, register your tools, then run the binary with cargo as shown above.
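The register-and-invoke flow can be sketched in plain Rust. This is a hypothetical stand-in, not the framework's actual API: the real ToolHandler trait, its method signatures, and the registration calls in mcp-framework may differ, so treat every name below (ShoutTool, ToolRegistry, handle, invoke) as illustrative.

```rust
use std::collections::HashMap;

// Hypothetical stand-in for the framework's ToolHandler trait;
// the real trait's shape may differ.
trait ToolHandler {
    fn name(&self) -> &str;
    fn handle(&self, input: &str) -> Result<String, String>;
}

// Example tool: echoes its input back, uppercased.
struct ShoutTool;

impl ToolHandler for ShoutTool {
    fn name(&self) -> &str {
        "shout"
    }
    fn handle(&self, input: &str) -> Result<String, String> {
        Ok(input.to_uppercase())
    }
}

// A toy registry mirroring how a server might map tool names to handlers.
struct ToolRegistry {
    tools: HashMap<String, Box<dyn ToolHandler>>,
}

impl ToolRegistry {
    fn new() -> Self {
        Self { tools: HashMap::new() }
    }
    fn register(&mut self, tool: Box<dyn ToolHandler>) {
        self.tools.insert(tool.name().to_string(), tool);
    }
    fn invoke(&self, name: &str, input: &str) -> Result<String, String> {
        self.tools
            .get(name)
            .ok_or_else(|| format!("unknown tool: {name}"))?
            .handle(input)
    }
}

fn main() {
    let mut registry = ToolRegistry::new();
    registry.register(Box::new(ShoutTool));
    println!("{}", registry.invoke("shout", "hello mcp").unwrap());
}
```

The trait-object registry keeps dispatch dynamic, which is the usual way a server exposes a heterogeneous set of tools behind one lookup-by-name call.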
Additional notes
Notes and tips:
- The MCP setup expects optional API keys for Claude/Anthropic or OpenAI; set ANTHROPIC_API_KEY and/or OPENAI_API_KEY in a .env file as recommended.
- The server supports an Inspector UI accessible at the provided local URL when using the examples that enable tooling and UI.
- If you modify environment variables, restart the server to pick up changes.
- Default commands assume you are running from the project root; use cargo run with appropriate flags if you launch from a different directory.
- This Rust implementation emphasizes zero-cost abstractions, an async runtime (Tokio), and safe concurrency; use a compatible nightly toolchain only if a particular example requires it.
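The safe-concurrency point above can be illustrated with the standard library alone. This sketch fans out work across threads and collects results over a channel; it deliberately uses std threads rather than the Tokio runtime so it compiles with no extra dependencies, and the tool-call payloads are made up for illustration:

```rust
use std::sync::mpsc;
use std::thread;

// Run several mock "tool calls" concurrently; ownership and Send bounds
// ensure at compile time that no data races are possible.
fn run_parallel(inputs: Vec<&'static str>) -> Vec<String> {
    let (tx, rx) = mpsc::channel();
    for input in inputs {
        let tx = tx.clone();
        thread::spawn(move || {
            // Stand-in for an actual tool invocation.
            tx.send(format!("result({input})")).unwrap();
        });
    }
    drop(tx); // close our sender so the receiver stops once workers finish
    rx.into_iter().collect()
}

fn main() {
    let mut results = run_parallel(vec!["a", "b"]);
    results.sort(); // thread completion order is nondeterministic
    println!("{results:?}");
}
```

In the framework itself the same fan-out would typically be expressed with Tokio tasks, but the compile-time guarantees against shared-state races are identical.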
Related MCP Servers
context-space
Ultimate Context Engineering Infrastructure, starting from MCPs and Integrations
mcp-probe
A Model Context Protocol (MCP) client library and debugging toolkit in Rust. This foundation provides both a production-ready SDK for building MCP integrations and the core architecture for an interactive debugger.
pluggedin-app
The Crossroads for AI Data Exchanges. A unified, self-hostable web interface for discovering, configuring, and managing Model Context Protocol (MCP) servers—bringing together AI tools, workspaces, prompts, and logs from multiple MCP sources (Claude, Cursor, etc.) under one roof.
MediaWiki
Model Context Protocol (MCP) Server to connect your AI with any MediaWiki
zerodha
Zerodha MCP Server & Client - AI Agent (w/Agno & w/Google ADK)
mindbridge
MindBridge is an AI orchestration MCP server that lets any app talk to any LLM — OpenAI, Anthropic, DeepSeek, Ollama, and more — through a single unified API. Route queries, compare models, get second opinions, and build smarter multi-LLM workflows.