bookworm
MCP server for Rust documentation
claude mcp add --transport stdio dcdpr-bookworm -- cargo run --bin wrm-mcp
How to use
Bookworm is an MCP server that serves documentation for crates from docs.rs through a set of tools designed for large language models. The server runs locally and exposes tools such as crates_search to discover crates, crate_search_items to enumerate items within a crate (modules, structs, enums, functions, and so on), and crate_resource to fetch specific resources such as readmes, item listings, or source code for a given crate and version. You can use it with Claude or any other MCP client by adding the server to your mcpServers configuration and invoking its tools through the MCP protocol. The server is implemented in Rust; the wrm-mcp binary serves the documentation data and the tool endpoints for querying and retrieval.
To use the server with an MCP client, start it locally and point your client at it. The client can then issue tool requests: crates_search to discover crates matching a query, crate_search_items for detailed item metadata, and crate_resource to retrieve specific resources (readmes, item listings, or source files) for a selected crate and version. The tools return structured results and resource URIs that the client can fetch during a conversation, helping an LLM reason about crate documentation.
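Under the hood, MCP tool calls are JSON-RPC 2.0 messages. As an illustration, a crates_search call might be framed like the sketch below; the argument name "query" is an assumption here, so check the server's tools/list output for the actual input schema.

```python
import json

# Sketch of an MCP tools/call request for the crates_search tool.
# The "query" argument name is an assumption; consult the tool's
# declared input schema (via tools/list) for the real parameter names.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "crates_search",
        "arguments": {"query": "serde"},
    },
}

# The client writes one JSON-RPC message per line to the server's stdin.
print(json.dumps(request))
```

The server replies with a JSON-RPC response whose result carries the tool output (structured results and resource URIs).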
How to install
Prerequisites:
- Rust toolchain (stable) with cargo installed. Install via rustup if needed: curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
- Git (to clone the repository) if you are building from source
Installation steps:
- Clone the repository (or ensure you have the project checked out):
  git clone https://github.com/your-org/bookworm.git
  cd bookworm
- Build the wrm-mcp binary (from the crates/wrm-mcp path or the project root, depending on the repo layout):
  cargo build --release --bin wrm-mcp
- Run the MCP server locally:
  cargo run --bin wrm-mcp
- Verify the server is running by checking its logs, then configure your MCP client to point at it (e.g., http://localhost:PORT if the server listens on a network address). If the server requires a specific address or port, adjust your client configuration accordingly.
Note: If your environment uses a different workspace layout, navigate to the wrm-mcp package directory and run the cargo commands from there.
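The verification step can be sketched as a small stdio smoke test: spawn the binary with cargo run and send the standard MCP initialize request as the first message. This is a sketch, assuming a stdio transport with newline-delimited JSON-RPC framing; the protocol version string is only an example value.

```python
import json
import subprocess

def initialize_request(request_id: int = 1) -> str:
    """Build the MCP initialize request a client must send first.

    The protocolVersion value here is an example; use the version your
    client library targets.
    """
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            "protocolVersion": "2024-11-05",
            "capabilities": {},
            "clientInfo": {"name": "smoke-test", "version": "0.0.0"},
        },
    })

def smoke_test() -> str:
    """Spawn wrm-mcp over stdio and read its initialize response.

    Assumes the repository is checked out and builds; run from the
    project (or wrm-mcp package) directory.
    """
    proc = subprocess.Popen(
        ["cargo", "run", "--bin", "wrm-mcp"],
        stdin=subprocess.PIPE,
        stdout=subprocess.PIPE,
        text=True,
    )
    proc.stdin.write(initialize_request() + "\n")
    proc.stdin.flush()
    response = proc.stdout.readline()  # server's initialize response
    proc.terminate()
    return response
```

If the server answers the initialize request with a JSON-RPC result, it is up and ready for tool calls.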
Additional notes
- Ensure Rust toolchain is up to date to avoid build issues: rustup update
- If the binary fails to start due to port or address binding, check for existing processes using the same port and stop them before retrying.
- Some MCP clients may require explicit endpoint configuration or authentication; consult your client’s docs for how to point to the bookworm MCP server.
- The tooling exposed by the server (crates_search, crate_search_items, crate_resource, etc.) may return large payloads; consider streaming or paginating responses if supported by your client.
- For crate_resource, you can fetch readmes, item listings, and source resources via the documented URIs, enabling rich reasoning about crate contents within the LLM's context.
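In MCP terms, fetching such a resource is a resources/read request against one of the URIs the server advertises. A minimal sketch, assuming a hypothetical crate:// URI scheme (the real scheme comes from the server's resources/list response):

```python
import json

# Hypothetical resource URI for illustration only; the actual scheme and
# path layout are defined by the server's resource listings, so fetch
# them via resources/list first.
uri = "crate://serde/1.0.0/readme"

read_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "resources/read",
    "params": {"uri": uri},
}

print(json.dumps(read_request))
```

The response contains the resource contents (e.g., the readme text), which the client can place into the model's context.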
Related MCP Servers
rust-schema
A type-safe implementation of the official Model Context Protocol (MCP) schema in Rust.
turbovault
MCP server that transforms your Obsidian vault into an intelligent knowledge system
tsrs
Tushare Rust MCP server
mcp-loxone
An opinionated Model Context Protocol (MCP) server for controlling Loxone home automation systems.
ultrafast
High-performance, ergonomic Model Context Protocol (MCP) implementation in Rust
firecrawl-zed
Firecrawl MCP Server for Zed