
bookworm

MCP server for Rust documentation

Installation
Run this command in your terminal to add the MCP server to Claude Code:

claude mcp add --transport stdio dcdpr-bookworm cargo run --bin wrm-mcp

How to use

Bookworm is an MCP server that serves documentation data for crates from docs.rs through a set of tools designed for large language models. The server runs locally and exposes tools such as crates_search to discover crates, crate_search_items to enumerate the items within a crate (modules, structs, enums, functions, and so on), and crate_resource to fetch specific resources, such as readmes, item listings, or source code, for a given crate and version. You can integrate the server with Claude.ai or any other MCP client by adding it to your mcpServers configuration and invoking its tools through the MCP protocol. The server is implemented in Rust; the wrm-mcp binary serves the documentation data and the query and retrieval endpoints.
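As a sketch, a client-side mcpServers entry might look like the following. The exact file location and schema depend on your MCP client; the command and arguments below simply mirror the install command above, and the server name is arbitrary:

```json
{
  "mcpServers": {
    "dcdpr-bookworm": {
      "command": "cargo",
      "args": ["run", "--bin", "wrm-mcp"]
    }
  }
}
```

With this entry in place, the client spawns the wrm-mcp process itself and talks to it over stdio.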

To use the server with an MCP client, start it locally (or let the client launch it over the stdio transport) and issue tool requests: crates_search to discover crates matching a query, crate_search_items for detailed item metadata, and crate_resource to retrieve specific resources (readmes, item listings, or source files) for a selected crate and version. The tools return structured results and resource URIs that the client can fetch during conversation or reasoning steps, which helps an LLM reason about crate documentation.
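For illustration, a tool invocation in MCP is a JSON-RPC tools/call request like the one below. The argument name query is an assumption here; the actual parameter schema for each tool is advertised by the server in its tools/list response:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "crates_search",
    "arguments": { "query": "serde" }
  }
}
```

Most MCP clients construct these requests for you; this is only what crosses the wire between client and server.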

How to install

Prerequisites:

  • Rust toolchain (stable) with cargo installed. Install via rustup if needed: curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
  • Git (to clone the repository) if you are building from source

Installation steps:

  1. Clone the repository (or ensure you have the Rust project checked out):
     git clone https://github.com/your-org/bookworm.git
     cd bookworm

  2. Build the wrm-mcp binary (from the crates/wrm-mcp path or the project root, depending on repo structure): cargo build --release --bin wrm-mcp

  3. Run the MCP server locally: cargo run --bin wrm-mcp

  4. Verify the server started by checking its logs. As installed above, the server uses the stdio transport, so an MCP client typically launches the binary itself and communicates over standard input and output; no network port is needed. Configure your client with the same command and arguments. If your deployment instead exposes the server over a network transport, point the client at the appropriate address.

Note: If your environment uses a different workspace layout, navigate to the wrm-mcp package directory and run the cargo commands from there.

Additional notes


  • Ensure Rust toolchain is up to date to avoid build issues: rustup update
  • If the binary fails to start due to port or address binding, check for existing processes using the same port and stop them before retrying.
  • Some MCP clients may require explicit endpoint configuration or authentication; consult your client’s docs for how to point to the bookworm MCP server.
  • The tooling exposed by the server (crates_search, crate_search_items, crate_resource, etc.) may return large payloads; consider streaming or paginating responses if supported by your client.
  • For crate_resource, you can fetch readmes, item listings, and source resources via the documented URIs, enabling rich reasoning about crate contents within the LLM’s context.

