rust-docs
🦀 Prevents outdated Rust code suggestions from AI assistants. This MCP server fetches current crate docs, uses embeddings/LLMs, and provides accurate context via a tool call.
claude mcp add --transport stdio govcraft-rust-docs-mcp-server rustdocs_mcp_server <crate>@<version>
How to use
Rust Docs MCP Server runs a focused documentation tool for a single Rust crate. When started for a crate, it downloads the crate's HTML docs, generates embeddings, and exposes an MCP tool named query_rust_docs. You can use this to ask questions about the crate's API or usage, and get answers derived from the latest documentation context. Multiple distinct crate instances can run concurrently, each with its own cached data. The server communicates over stdio via the MCP protocol, and the crate name is exposed as a resource at crate://<crate_name> to help you verify which crate the server is configured for.
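To make the stdio interaction concrete, here is a minimal sketch of the JSON-RPC messages an MCP client would send to this server. The tool name (query_rust_docs) and the crate:// resource URI come from the description above; the exact argument name ("question") and request shapes follow the generic MCP conventions and are assumptions, not confirmed against this server's schema.

```python
import json

def tool_call_request(request_id: int, question: str) -> str:
    """Build a tools/call request for the query_rust_docs tool.
    The argument name "question" is an assumption for illustration."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {
            "name": "query_rust_docs",
            "arguments": {"question": question},
        },
    })

def resource_read_request(request_id: int, crate_name: str) -> str:
    """Build a resources/read request for the crate://<crate_name> resource."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "resources/read",
        "params": {"uri": f"crate://{crate_name}"},
    })

# Example: ask about the crate's API, then verify which crate is being served.
print(tool_call_request(1, "How do I derive Serialize for an enum?"))
print(resource_read_request(2, "serde"))
```

In practice an MCP client library handles this framing for you; the sketch only shows what travels over stdio.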
To use it, first start the server for the crate you care about (e.g., serde or tokio). The initial run for a given crate version and feature set will download docs, generate embeddings, and cache them for faster startup in subsequent sessions. After the initial setup, you can query the server with the query_rust_docs tool, providing natural-language questions about the crate. The tool will perform semantic search over the crate docs and return concise, context-aware answers generated from the retrieved documentation. Remember to provide your OpenAI API key via the OPENAI_API_KEY environment variable, as it powers both embedding generation and LLM-based summarization.
Typical use flow:
- Start the server for a crate: rustdocs_mcp_server "crate@version" (optionally with --features for specific crate features).
- Wait for the server to indicate it is ready (e.g., MCP Server listening on stdio).
- Send questions through the query_rust_docs tool to retrieve relevant API information or usage guidance based on up-to-date docs.
How to install
Prerequisites:
- A Rust toolchain (rustup, cargo) if building from source
- OpenAI API key (OPENAI_API_KEY) with access to embedding and summarization models
- Internet access for initial documentation download and embeddings
Install options:
Option 1: Download a pre-built binary
- Go to the project releases page and download the appropriate binary for your OS (rustdocs_mcp_server or rustdocs_mcp_server.exe).
- Place the binary in a directory included in your PATH (e.g., /usr/local/bin or ~/bin).
- Run the server for a crate using: rustdocs_mcp_server "crate@version" # optionally add -F feature1,feature2 if needed
Option 2: Build from source
- Ensure Rust is installed: https://rustup.rs/
- Clone the repository and build:
  git clone https://github.com/Govcraft/rust-docs-mcp-server.git
  cd rust-docs-mcp-server
  cargo build --release
- The resulting binary will be at target/release/rustdocs_mcp_server. Move it into your PATH as desired.
Running the server:
- Set your OpenAI API key: export OPENAI_API_KEY="sk-..."
- Start for a specific crate: rustdocs_mcp_server "serde@^1.0" # or another crate/version
- The server will fetch docs, generate embeddings, cache them, and then start the MCP server.
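For MCP clients configured through a JSON file (for example, Claude Desktop's claude_desktop_config.json), an entry along these lines should work. The server name ("rust-docs-serde") and the crate spec are illustrative, and this assumes your client supports per-server environment variables:

```json
{
  "mcpServers": {
    "rust-docs-serde": {
      "command": "rustdocs_mcp_server",
      "args": ["serde@^1.0"],
      "env": {
        "OPENAI_API_KEY": "sk-..."
      }
    }
  }
}
```

Use one entry per crate if you want several crates available at once, since each server instance serves a single crate.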
Additional notes
Notes and tips:
- The first run for a given crate version (and feature set) may take longer due to downloading docs and generating embeddings. Subsequent runs will be faster thanks to the cache stored in the user data directory.
- The server requires network access to fetch crate docs and interact with the OpenAI API.
- If you change the features you need for a crate, restart the server with the new feature set. Embeddings are cached per crate version and feature combination, so a new combination will trigger a fresh documentation download and embedding pass.
- The crate resource at crate://<crate_name> can help you verify which crate this server instance is configured for when building multi-crate workflows.
- Ensure you have sufficient OpenAI API quota and monitor usage, as embeddings and LLM calls may incur costs.
Related MCP Servers
grepai
Semantic Search & Call Graphs for AI Agents (100% Local)
groundhog
Groundhog's primary purpose is to teach people how Cursor and all these other coding agents work under the hood. If you understand how these coding assistants work from first principles, then you can drive these tools harder (or perhaps make your own!).
claude-code-open
Open source AI coding platform with Web IDE, multi-agent system, 37+ tools, MCP protocol. MIT licensed.
go-utcp
Official Go implementation of the UTCP
vibe-check
Stop AI coding disasters before they cost you weeks. Real-time anti-pattern detection for vibe coders who love AI tools but need a safety net to avoid expensive overengineering traps.
mcp-ragex
MCP server for intelligent code search: semantic (RAG), symbolic (tree-sitter), and regex (ripgrep) search modes. Built for Claude Code and AI coding assistants.