
rust-docs

🦀 Prevents outdated Rust code suggestions from AI assistants. This MCP server fetches current crate docs, uses embeddings/LLMs, and provides accurate context via a tool call.

Installation
Run the following command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio govcraft-rust-docs-mcp-server rustdocs_mcp_server <crate>@<version>
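For example, to register an instance that serves serde's documentation (using the version requirement shown later in this guide):

```
claude mcp add --transport stdio govcraft-rust-docs-mcp-server rustdocs_mcp_server "serde@^1.0"
```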

How to use

Rust Docs MCP Server runs a focused documentation tool for a single Rust crate. When started for a crate, it downloads the crate's HTML docs, generates embeddings, and exposes an MCP tool named query_rust_docs. You can use this to ask questions about the crate's API or usage, and get answers derived from the latest documentation context. Multiple distinct crate instances can run concurrently, each with its own cached data. The server communicates over stdio via the MCP protocol, and the crate name is exposed as a resource at crate://<crate_name> to help you verify which crate the server is configured for.

To use it, first start the server for the crate you care about (e.g., serde or tokio). The initial run for a given crate version and feature set will download docs, generate embeddings, and cache them for faster startup in subsequent sessions. After the initial setup, you can query the server with the query_rust_docs tool, providing natural-language questions about the crate. The tool will perform semantic search over the crate docs and return concise, context-aware answers generated from the retrieved documentation. Remember to provide your OpenAI API key via the OPENAI_API_KEY environment variable, as it powers both embedding generation and LLM-based summarization.

Typical use flow:

  • Start the server for a crate: rustdocs_mcp_server "crate@version" (optionally with --features for specific crate features).
  • Wait for the server to indicate it is ready (e.g., MCP Server listening on stdio).
  • Send questions through the query_rust_docs tool to retrieve relevant API information or usage guidance based on up-to-date docs.
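Under the hood, a query_rust_docs invocation is an ordinary MCP tools/call request sent over stdio. A sketch of what a client sends is below; the exact argument field name ("question") is an assumption, so consult the tool schema the server advertises:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "query_rust_docs",
    "arguments": {
      "question": "How do I derive Serialize for a struct with a borrowed field?"
    }
  }
}
```

MCP clients such as Claude Code construct this request for you; the sketch is only useful if you are wiring up your own client.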

How to install

Prerequisites:

  • A Rust toolchain (rustup, cargo) if building from source
  • OpenAI API key (OPENAI_API_KEY) with access to embedding and summarization models
  • Internet access for initial documentation download and embeddings

Install options

Option 1: Download a pre-built binary

  • Go to the project releases page for your OS and download the appropriate binary (rustdocs_mcp_server or rustdocs_mcp_server.exe).
  • Place the binary in a directory included in your PATH (e.g., /usr/local/bin or ~/bin).
  • Run the server for a crate using: rustdocs_mcp_server "crate@version" (optionally add -F feature1,feature2 to enable specific crate features)

Option 2: Build from source
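A minimal from-source build sketch, assuming a standard Cargo project layout (the repository URL is an assumption; check the project page for the canonical location):

```shell
# Clone the project and build an optimized binary (assumed repo URL)
git clone https://github.com/Govcraft/rust-docs-mcp-server.git
cd rust-docs-mcp-server
cargo build --release
# The binary lands at target/release/rustdocs_mcp_server;
# copy it somewhere on your PATH, e.g.:
install -m 755 target/release/rustdocs_mcp_server ~/bin/
```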

Running the server:

  • Set your OpenAI API key: export OPENAI_API_KEY="sk-..."
  • Start for a specific crate: rustdocs_mcp_server "serde@^1.0" # or another crate/version
  • The server will fetch docs, generate embeddings, cache them, and then start the MCP server.
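Putting those steps together, a first session might look like this (the --features value is illustrative, not required):

```shell
# Key is used for both embedding generation and LLM summarization
export OPENAI_API_KEY="sk-..."
# First run for this crate/version/feature set: downloads docs,
# generates embeddings, caches them, then starts serving on stdio
rustdocs_mcp_server --features derive "serde@^1.0"
```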

Additional notes

Notes and tips:

  • The first run for a given crate version (and feature set) may take longer due to downloading docs and generating embeddings. Subsequent runs will be faster thanks to the cache stored in the user data directory.
  • The server requires network access to fetch crate docs and interact with the OpenAI API.
  • If you change features for a crate, consider re-launching the server for that crate version/feature set to regenerate embeddings and refresh the cache.
  • The crate resource at crate://<crate_name> can help you verify which crate this server instance is configured for when building multi-crate workflows.
  • Ensure you have sufficient OpenAI API quota and monitor usage, as embeddings and LLM calls may incur costs.
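To double-check which crate a given instance is configured for, a client can read the crate resource with a standard MCP resources/read request (a sketch; serde is just an example URI):

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "resources/read",
  "params": { "uri": "crate://serde" }
}
```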
