maker-rs
Zero-error LLM execution via SPRT voting. A Rust library and MCP server implementing the MAKER algorithm for mathematically grounded error correction in long-horizon AI agent tasks. Research experiment based on arXiv:2511.09030.
```shell
claude mcp add --transport stdio zircote-maker-rs /path/to/maker-mcp \
  --env OPENAI_API_KEY="your-key" \
  --env ANTHROPIC_API_KEY="your-key"
```
How to use
MAKER-rs implements the MAKER (Massively decomposed Agentic processes with K-margin Error Reduction) framework as an MCP server. It aims for zero-error robustness on multi-step LLM tasks by running an error-corrected, SPRT-based voting process across microagents and validating candidate results with red-flag checks. The server is designed to be wired into an MCP configuration (for example, in Claude Code), where it receives prompts, delegates work to LLM providers, and returns a consensus answer vetted through the voting and validation workflow. Typical use involves launching the compiled maker-mcp binary and registering it in your MCP ecosystem with the appropriate API keys for your LLM providers. The repository shows how to build and run the server locally and how to reference it in an MCP config via the "command" path and environment variables.
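As a minimal sketch of that voting loop, here is a "first to lead by k" margin rule, a simplified stand-in for the SPRT-based vote (the function name and types are illustrative, not the crate's actual API):

```rust
use std::collections::HashMap;

// Minimal sketch of a "first to lead by k" vote, a simplified stand-in
// for SPRT-based voting. `samples` stands in for successive microagent
// answers that already passed red-flag checks. Returns the winning
// answer and how many samples were consumed.
fn k_margin_vote(samples: &[&str], k: usize) -> Option<(String, usize)> {
    let mut counts: HashMap<&str, usize> = HashMap::new();
    for (i, s) in samples.iter().enumerate() {
        *counts.entry(*s).or_insert(0) += 1;
        let mut tallies: Vec<usize> = counts.values().copied().collect();
        tallies.sort_unstable_by(|a, b| b.cmp(a));
        let best = tallies[0];
        let second = tallies.get(1).copied().unwrap_or(0);
        // Stop as soon as the leading answer is k votes ahead of the rest.
        if best >= second + k {
            let winner = counts.iter().find(|(_, c)| **c == best)?.0.to_string();
            return Some((winner, i + 1));
        }
    }
    None // the margin was never reached within the sample budget
}

fn main() {
    // With k = 2, "4" wins once it leads the runner-up by two votes.
    let samples = ["4", "5", "4", "4", "4"];
    if let Some((winner, used)) = k_margin_vote(&samples, 2) {
        println!("winner = {winner}, samples used = {used}");
    }
}
```

Stopping at a fixed margin rather than a fixed sample count is what keeps the expected cost low: easy steps resolve in a couple of samples, and only contested steps consume more.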
To use MAKER within an MCP setup, add the server to your MCP configuration under an entry such as "maker" and point the command at the built binary. Provide the necessary environment variables (e.g., OPENAI_API_KEY, ANTHROPIC_API_KEY) so the server can call your preferred LLM providers. When a task is sent to this server, it performs error-corrected voting across samples, applies red-flag validation, and returns a robust result along with the number of samples used. The README also includes example demos and validation tasks you can run locally to verify behavior before integrating the server into broader MCP flows.
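A minimal mcp_config sketch along those lines (the binary path and key values are placeholders; adjust them to your build location and providers):

```json
{
  "mcpServers": {
    "maker": {
      "command": "/path/to/maker-rs/target/release/maker-mcp",
      "env": {
        "OPENAI_API_KEY": "your-key",
        "ANTHROPIC_API_KEY": "your-key"
      }
    }
  }
}
```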
How to install
Prerequisites:
- Rust toolchain (rustc and cargo), version 1.75 or newer
- Git (to clone or fetch the repository)
- Access to an MCP-compatible environment (optional, for integration)
- Install Rust and Cargo
  - On macOS: install the Xcode command line tools, then run:
    curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
    source $HOME/.cargo/env
  - On Linux: install rustup via https://rustup.rs/, then run: rustup default stable
- Clone the repository
  - git clone https://github.com/zircote/maker-rs.git
  - cd maker-rs
- Build the MCP server (release)
  - cargo build --release
- Run the MCP server locally (example)
  - cargo run --bin maker-mcp
- Prepare the MCP configuration
  - See the mcp_config example in this document to wire the server into your MCP environment.
Runtime requirements:
- Network access to your LLM providers (via API keys)
- An MCP runner/orchestrator to load the server as an MCP endpoint
Additional notes
- MAKER is experimental and not recommended for production use. APIs and behavior may change as the research evolves.
- Ensure you provide valid API keys for OpenAI, Anthropic, or other supported providers; these keys should be kept secure and injected via environment variables in the MCP configuration (as shown in the example).
- The server is intended to be integrated into an MCP workflow where it receives tasks, runs SPRT-based voting across samples, and returns a validated result. If you encounter issues, verify that the binary path in the MCP config is correct and that the environment variables are accessible to the running process.
- For development and testing, follow the Quick Start in the README to run the demos (e.g., Hanoi and Arithmetic) and confirm the voting and validation loop behaves as expected before integrating the server into a broader MCP setup.
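To picture the red-flag validation step mentioned above, a pre-vote filter might look like the following sketch (the flag patterns are invented for illustration and are not the project's actual checks):

```rust
// Illustrative red-flag filter: reject a sample before it enters the
// vote if it shows a known failure signature. The signatures below are
// made up for this sketch; real checks are task-specific.
fn red_flagged(answer: &str) -> bool {
    let flags = ["i cannot", "as an ai"]; // hypothetical refusal markers
    let a = answer.trim().to_lowercase();
    a.is_empty() || flags.iter().any(|f| a.contains(f))
}

fn main() {
    let samples = ["42", "", "As an AI, I cannot answer."];
    // Only samples that pass the filter would be counted in the vote.
    let kept: Vec<&str> = samples.iter().copied().filter(|s| !red_flagged(s)).collect();
    println!("{kept:?}"); // the empty and refusal samples are dropped
}
```

Filtering before voting matters because a correlated failure mode (e.g., a refusal every model produces) could otherwise win the vote despite being wrong.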
Related MCP Servers
Wax
Sub-Millisecond RAG on Apple Silicon. No Server. No API. One File. Pure Swift
packt-netops-ai-workshop
🔧 Build Intelligent Networks with AI
ultrafast
High-performance, ergonomic Model Context Protocol (MCP) implementation in Rust
mode-manager
MCP Memory Agent Server - A VS Code chatmode and instruction manager with library integration
agentic-search
A Model Context Protocol (MCP) server that provides agentic search capabilities with support for vector search using Qdrant, full-text search using TiDB, or both combined.
limps
limps your Local Intelligent MCP Planning Server across AI assistants. No subscriptions, no cloud—run it locally. Version control your planning docs in git. No more context drift—one shared source of truth across Claude, Cursor, Copilot, and any MCP tool.