arch
MCP server that gives code agents architectural context - ERD diagrams, code policies, API docs, service relationships, and best practices.
claude mcp add --transport stdio citizen4our-arch-mcp-server -- docker run -i --env DOCS_ROOT="/path/to/docs/root" arch-mcp-server
How to use
Arch MCP Server provides architectural context to AI coding agents by exposing a suite of tools that allow reading, discovering, and analyzing architectural and technical documentation. The server can retrieve individual documentation resources, list resources with filters, fetch all ADR documents, generate a project overview, and query agreements by language. Use these tools to surface C4 diagrams, ERDs, ADRs, OpenAPI specs, and backend/frontend documentation. The server communicates over HTTP and is intended to run as a separate process, so Cursor or other clients can send requests to the documented endpoints and receive structured JSON responses that integrate into your AI workflows.
Key capabilities include: get_resource_content to fetch specific docs via docs:// paths; get_docs_list to filter and paginate documentation across areas, languages, and categories; get_all_adr_documents to retrieve ADRs; get_project_overview to obtain a comprehensive, grouped snapshot of a project’s documentation; and get_agreements to find API contracts and technical specifications by programming language. These tools enable targeted discovery (e.g., architecture area with C4 diagrams, or OpenAPI specs for a service) and overview analytics for project teams.
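As an illustration of how a client might invoke one of these tools, the sketch below builds an MCP JSON-RPC `tools/call` request for get_docs_list. The tool name and the area/category/page/limit filters come from this page; the exact argument names accepted by the server are an assumption.

```python
import json

def tool_call(tool: str, arguments: dict, request_id: int = 1) -> dict:
    """Shape of a JSON-RPC 2.0 tools/call request in the MCP protocol."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    }

# List C4 diagrams in the architecture area, one page at a time
# (filter and pagination keys assumed from this page's examples).
request = tool_call("get_docs_list",
                    {"area": "architecture", "category": "c4",
                     "page": 1, "limit": 20})
print(json.dumps(request))
```

The same builder works for the other tools (get_all_adr_documents, get_agreements, etc.); only the tool name and arguments change.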
To use the server in your workflow, first start the MCP server, then configure Cursor or your integrating client with the arch-mcp server endpoint. Typical queries include requesting an overview for a project, reading a specific architecture document, or listing all OpenAPI specs for a particular area. The outcome is structured JSON suitable for downstream processing and for presenting insights to developers and architects.
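Concretely, such a query can be sketched as a plain HTTP POST. This is an illustrative client snippet, not part of the project: the endpoint URL matches the local address given in the integration notes, and the `project` argument name is an assumption.

```python
import json
import urllib.request

MCP_URL = "http://127.0.0.1:8010/mcp"  # assumed local endpoint (see integration notes)

def overview_request(project: str) -> dict:
    """JSON-RPC 2.0 body for a get_project_overview tool call."""
    return {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "tools/call",
        # "project" as the argument name is an assumption for illustration.
        "params": {"name": "get_project_overview",
                   "arguments": {"project": "arch-mcp-server"}},
    }

def call_server(body: dict, url: str = MCP_URL) -> dict:
    """POST the request and decode the structured JSON response."""
    req = urllib.request.Request(
        url,
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

print(json.dumps(overview_request("arch-mcp-server")))
```

The decoded response is the structured JSON mentioned above, ready for downstream processing.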
How to install
Prerequisites:
- Rust 1.85+ (required for the 2024 edition) and Cargo for building from source
- Optional: Docker and Docker Compose if you prefer running via containers
Install from source (recommended for development):
- Install Rust:
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
- Clone the repository (if not already):
git clone https://github.com/your-org/arch-mcp-server.git
cd arch-mcp-server
- Build in release mode:
cargo build --release
- Run the server locally (with a docs root path):
cargo run --release -- --docs-root ./example_docs/docs/content
Alternative: Run via Docker (if you have a prebuilt image arch-mcp-server):
docker compose up
Notes:
- The server exposes HTTP endpoints; ensure network accessibility from your client (Cursor or otherwise).
- If you modify docs, restart the server to refresh indexing.
Documentation resources like example_docs/docs/content must exist and be structured according to arch-mcp.toml mappings as described in the README.
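The arch-mcp.toml mappings in the README are the authority on the required layout; purely as an illustrative sketch (directory names guessed from the areas and categories used elsewhere on this page), a docs root might look like:

```
example_docs/docs/content/
├── architecture/   # C4 diagrams and ERDs
├── adr/            # architecture decision records
├── openapi/        # OpenAPI specs per service area
└── backend/        # backend/frontend docs and agreements
```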
Additional notes
Tips and common considerations:
- The MCP server uses HTTP transport; do not rely on stdio transport, as log output may interfere with the JSON protocol.
- Ensure the docs-root you point to contains the arch-mcp.toml mapping and the documentation repository structure expected by the server.
- For Cursor integration, set the mcp.json in your user profile with the arch-mcp server entry and the correct URL, either local (http://127.0.0.1:8010/mcp) or remote.
- Environment variables can be used to customize paths or runtime behaviour; keep them documented and consistent across environments.
- When filtering with get_docs_list, combine area, lang, and category as needed to locate the exact resources (e.g., area=architecture&category=c4 or area=openapi&category=activation).
- ADRs and OpenAPI specs can be large; use pagination (page/limit) to manage results in your tooling.
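For the Cursor integration mentioned above, a minimal mcp.json entry for the local case might look like the following (key names follow Cursor's mcp.json conventions; the server name "arch-mcp" is arbitrary):

```json
{
  "mcpServers": {
    "arch-mcp": {
      "url": "http://127.0.0.1:8010/mcp"
    }
  }
}
```

For a remote deployment, replace the URL with the server's reachable address.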
Related MCP Servers
goose
An open source, extensible AI agent that goes beyond code suggestions: install, execute, edit, and test with any LLM.
cunzhi
Say goodbye to premature AI termination; helps AI stay the course.
probe
AI-friendly semantic code search engine for large codebases. Combines ripgrep speed with tree-sitter AST parsing. Powers AI coding assistants with precise, context-aware code understanding.
mcp-center
A centralized platform for managing and connecting MCP servers. MCP Center provides a high-performance proxy service that enables seamless communication between MCP clients and multiple MCP servers.
backlog-rust
MCP server for Backlog, project management service.
perplexity-web-api
🔍 Perplexity AI MCP without API key