allman
High-performance MCP agent mail server — lock-free message routing, NRT search, Git audit trail. Built in Rust.
claude mcp add --transport stdio copyleftdev-allman docker run -p 8000:8000 -p 8001:8001 --env INDEX_PATH=allman_index --env REPO_ROOT=allman_repo --env RUST_LOG=allman=debug --env VLLM_PORT=8001 copyleftdev/allman
How to use
Allman exposes an MCP-compatible API at http://localhost:8000/mcp. It acts as a high-performance mail hub for agents: it coordinates agents, routes messages between them, and backs each inbox with full-text search. You can register agents, send messages between agents, fetch unread inbox items, and run full-text searches across all stored messages. The hot path for create_agent, send_message, and get_inbox is lock-free, keeping these common operations at microsecond latency. If you are using the Docker setup, make sure the container is running and that port 8000 is reachable from your client.
The MCP tools available are:
- create_agent: Register an agent with project_key, name_hint, program, and model.
- send_message: Deliver a message from one agent to another, including subject and body, plus project context.
- get_inbox: Retrieve and drain unread messages for a given agent.
- search_messages: Perform a full-text search across all indexed messages using Tantivy-backed search.
Clients can connect to the /mcp endpoint via JSON-RPC or a compatible MCP client library to invoke these tools, receive structured responses, and handle errors accordingly.
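As a minimal sketch of what such a JSON-RPC call might look like, the snippet below builds MCP `tools/call` envelopes for create_agent and send_message. The argument names are taken from the tool descriptions above, but the exact schema (including field names like `from`/`to`) is an assumption — verify it against the server's `tools/list` response before relying on it.

```python
import json

def mcp_tool_call(request_id, tool_name, arguments):
    """Construct a JSON-RPC 2.0 envelope for the MCP tools/call method."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

# Register an agent (parameter names from the create_agent description above).
create_req = mcp_tool_call(1, "create_agent", {
    "project_key": "demo-project",   # illustrative values
    "name_hint": "researcher",
    "program": "claude-code",
    "model": "claude-sonnet",
})

# Send a message between two agents ("from"/"to" keys are assumptions).
send_req = mcp_tool_call(2, "send_message", {
    "project_key": "demo-project",
    "from": "researcher",
    "to": "reviewer",
    "subject": "Draft ready",
    "body": "Please review the draft when you get a chance.",
})

payload = json.dumps(send_req)
# Once the server is up, POST `payload` to http://localhost:8000/mcp with
# Content-Type: application/json using any HTTP client.
```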
How to install
Prerequisites:
- Docker (recommended) or Rust toolchain for building from source
- Optional: NVIDIA GPU with nvidia-container-toolkit if you plan to run vLLM inference
Option A - Install with Docker (recommended):
- Pull and run the Allman Docker image (adjust image name if necessary):
docker pull copyleftdev/allman
docker compose up -d # if using docker-compose.yml configured for Allman
Or run a single container with port mappings:
docker run -d --name allman -p 8000:8000 -p 8001:8001 -e INDEX_PATH=allman_index -e REPO_ROOT=allman_repo -e RUST_LOG=allman=debug copyleftdev/allman
- Verify the service is listening on port 8000 (and on 8001 if you use vLLM).
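If you prefer Compose, a minimal docker-compose.yml equivalent to the docker run command above might look like the following. The image name, ports, and environment defaults come from this README; the container-side mount paths are illustrative assumptions — check the image's actual working directory before using them.

```yaml
services:
  allman:
    image: copyleftdev/allman
    ports:
      - "8000:8000"   # MCP API
      - "8001:8001"   # vLLM inference (optional)
    environment:
      INDEX_PATH: allman_index
      REPO_ROOT: allman_repo
      RUST_LOG: allman=debug
    volumes:
      # Persist the Tantivy index and Git audit trail across restarts.
      # Container paths below are assumptions, not confirmed by the README.
      - allman_index:/app/allman_index
      - allman_repo:/app/allman_repo
volumes:
  allman_index:
  allman_repo:
```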
Option B - Build from source (Rust):
- Prerequisites:
- Rust 1.75+ and Cargo
- Optional: Docker for containerized deployment
- Install and build:
git clone https://github.com/copyleftdev/allman.git
cd allman
cargo build --release
- Run locally:
cargo run --release
The server binds to 0.0.0.0:8000 by default.
- Configuration: Set environment variables as needed (INDEX_PATH, REPO_ROOT, RUST_LOG). You can also configure via a .env file matching the README’s guidance.
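A matching .env sketch, using the default values listed under Additional notes below (treat the exact variable set as an assumption and confirm against the repository's README):

```
INDEX_PATH=allman_index
REPO_ROOT=allman_repo
RUST_LOG=allman=debug
```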
Additional notes
Environment variables:
- INDEX_PATH: Tantivy index directory (default allman_index)
- REPO_ROOT: Git audit trail directory (default allman_repo)
- RUST_LOG: Log level filter (default allman=debug)
Networking:
- Exposed ports: 8000 for the MCP API, 8001 for vLLM inference (when used). If you’re running behind a reverse proxy, ensure proper routing and TLS termination as needed.
Persistence:
- Allman uses a batched indexing pipeline with a separate OS thread for Git commits. The hot path is lock-free for low-latency operations, but ensure the filesystem paths for the index and repo directories are writable by the process.
Common issues:
- If /mcp endpoint is unreachable, verify container is running and ports are mapped correctly.
- When using GPU inference, ensure the NVIDIA toolkit is installed and port 8001 is accessible to your vLLM instance.
- If index or repo directories are missing, create them or set INDEX_PATH/REPO_ROOT to valid writable paths.