mcpr
Model Context Protocol (MCP) implementation in Rust
claude mcp add --transport stdio conikeec-mcpr cargo install mcpr --version 0.2.3 \
  --env RUST_LOG="info"
How to use
mcpr is a Rust implementation of the Model Context Protocol (MCP). It provides a high-level client and server architecture to connect AI assistants to data sources and tools via multiple transports, such as stdio and Server-Sent Events (SSE). This repository ships with a project generator, mock transports for testing, and a CLI to scaffold new MCP projects. Use mcpr to run MCP servers that expose tools and to build clients that can initialize, call tools, and shut down cleanly. The README emphasizes reliable MCP communication and provides ready-made examples for a high-level client, a high-level server, and generated projects that demonstrate both stdio and SSE transports.
To use mcpr, first install the crate, then install or build your server. The stdio transport communicates over standard input/output, which makes it well suited to local development and testing; the SSE transport enables web-oriented, server-driven interactions. The library also offers a CLI for generating servers and clients, a project generator for rapid scaffolding, and mock transports that simulate end-to-end MCP interactions during development and testing.
In practice, you would: (1) create a server with a set of tools, (2) start the server using one of the supported transports, and (3) build or run a client that calls the server's tools over MCP. The examples in the repository illustrate initializing a client, calling a tool by name, and shutting the client down gracefully, as well as configuring and running a server with a tool registry.
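The three steps above boil down to an exchange of JSON-RPC 2.0 messages, which is what MCP uses on the wire. The sketch below builds the raw messages a stdio client would write, one per line, using only the Rust standard library; the method names (`initialize`, `tools/call`) come from the MCP specification, while the tool name `hello` and its argument are purely illustrative and do not reflect mcpr's own API.

```rust
// Build the JSON-RPC 2.0 messages an MCP client sends over stdio.
// Method names follow the MCP spec; the "hello" tool is illustrative.

fn request(id: u64, method: &str, params: &str) -> String {
    format!(
        r#"{{"jsonrpc":"2.0","id":{},"method":"{}","params":{}}}"#,
        id, method, params
    )
}

fn main() {
    // 1. Initialize the session (protocolVersion per the MCP spec).
    let init = request(
        1,
        "initialize",
        r#"{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"demo-client","version":"0.1.0"}}"#,
    );

    // 2. Call a tool by name with JSON arguments.
    let call = request(
        2,
        "tools/call",
        r#"{"name":"hello","arguments":{"who":"world"}}"#,
    );

    // Each message is written as a single line to the server's stdin.
    println!("{}", init);
    println!("{}", call);
}
```

A real client such as mcpr's high-level API wraps this exchange behind typed methods, but the underlying traffic has this shape regardless of transport.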
How to install
Prerequisites:
- Rust and Cargo installed (https://www.rust-lang.org/tools/install)
- Optional: a macOS/Linux environment with build tooling; Windows users may prefer WSL or a shell with the Rust toolchain available
- Install the mcpr CLI globally (or build from source):
# Install the mcpr CLI from crates.io
cargo install mcpr
# Or build from source (after cloning the repo)
# git clone https://github.com/your-org/mcpr
# cd mcpr
# cargo install --path .
- Verify installation:
mcpr --version
- Add mcpr as a dependency to your project (if you’re consuming as a library):
# In your Cargo.toml
[dependencies]
mcpr = "0.2.3"
- Build and run a server example or your own server using cargo:
# Build the mcpr crate (for a generated project, run cargo build in its directory)
cargo build --package mcpr
# Run a typical server using the generated binary (example path)
./target/debug/my-mcpr-server
Optional: Use the project generator to scaffold new MCP projects with different transports (stdio or SSE) and then build/run them following the generated instructions in the project directory.
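Once a server is built, its job is to receive `tools/call` requests and route them to registered handlers. The sketch below shows the shape of that dispatch step with plain standard-library Rust; the `Tool` type, the handler map, and the error code handling are invented for illustration and are not mcpr's actual tool-registry API.

```rust
// Illustrative server-side dispatch: route a tools/call request to a
// registered handler and build the JSON-RPC response. Not mcpr's real
// registry API; the types here are invented for the sketch.

use std::collections::HashMap;

// A tool takes a raw argument string and returns its text result.
type Tool = fn(&str) -> String;

fn dispatch(tools: &HashMap<&str, Tool>, name: &str, args: &str, id: u64) -> String {
    match tools.get(name) {
        Some(tool) => format!(
            r#"{{"jsonrpc":"2.0","id":{},"result":{{"content":[{{"type":"text","text":"{}"}}]}}}}"#,
            id,
            tool(args)
        ),
        // -32601 is JSON-RPC's standard "method not found" code.
        None => format!(
            r#"{{"jsonrpc":"2.0","id":{},"error":{{"code":-32601,"message":"unknown tool"}}}}"#,
            id
        ),
    }
}

fn main() {
    let mut tools: HashMap<&str, Tool> = HashMap::new();
    tools.insert("echo", |args: &str| args.to_string());

    println!("{}", dispatch(&tools, "echo", "hi", 1));
    println!("{}", dispatch(&tools, "missing", "", 2));
}
```

A production server would parse incoming JSON properly and escape the result text; the point here is only the lookup-then-respond structure a tool registry gives you.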
Additional notes
Key notes:
- Version 0.2.0 was yanked due to issues with the SSE transport. Use 0.2.3 or later as recommended by the project.
- mcpr supports multiple transport options (stdio, SSE) and plans for WebSocket transport in the future.
- The repository includes a high-level client/server API, a CLI for scaffolding, a project generator, and mock transports for testing.
- When working with generated projects, you can switch between the local development version of mcpr and the crates.io release by editing Cargo.toml accordingly.
- If you’re using SSE in production, be mindful of the server URL and port configuration (default SSE server port is 8080 in examples).
- For testing, leverage the mock SSE transport to validate initialization, tool calls, and error handling before connecting to real data sources.
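The mock-transport idea in the last note can be illustrated with a minimal trait plus an in-memory loopback implementation. The `Transport` trait and `MockTransport` struct below are invented for this sketch and do not mirror mcpr's actual transport API; they only show why a mock is useful, since a test can inspect exactly what was sent without spawning a real stdio or SSE channel.

```rust
// Minimal sketch of a mock transport for testing: messages go into an
// in-memory queue instead of a real stdio or SSE channel. Trait and
// type names are illustrative, not mcpr's real API.

use std::collections::VecDeque;

trait Transport {
    fn send(&mut self, msg: &str);
    fn recv(&mut self) -> Option<String>;
}

/// Loopback transport: everything sent becomes immediately receivable,
/// letting a test drive initialization and tool calls without a server.
struct MockTransport {
    queue: VecDeque<String>,
}

impl MockTransport {
    fn new() -> Self {
        Self { queue: VecDeque::new() }
    }
}

impl Transport for MockTransport {
    fn send(&mut self, msg: &str) {
        self.queue.push_back(msg.to_string());
    }
    fn recv(&mut self) -> Option<String> {
        self.queue.pop_front()
    }
}

fn main() {
    let mut t = MockTransport::new();
    t.send(r#"{"jsonrpc":"2.0","id":1,"method":"initialize","params":{}}"#);

    // The test sees exactly what was sent, in order.
    assert!(t.recv().is_some());
    assert!(t.recv().is_none());
    println!("mock transport round-trip ok");
}
```

Because client code is written against the trait rather than a concrete channel, the same test can later swap in a real transport unchanged.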
Related MCP Servers
goose
an open source, extensible AI agent that goes beyond code suggestions - install, execute, edit, and test with any LLM
cunzhi
Stop AI from ending tasks prematurely and help it work more persistently
probe
AI-friendly semantic code search engine for large codebases. Combines ripgrep speed with tree-sitter AST parsing. Powers AI coding assistants with precise, context-aware code understanding.
mcp-center
A centralized platform for managing and connecting MCP servers. MCP Center provides a high-performance proxy service that enables seamless communication between MCP clients and multiple MCP servers.
backlog-rust
MCP server for Backlog, project management service.
perplexity-web-api
🔍 Perplexity AI MCP without API key