rtfmbro
rtfmbro provides always-up-to-date, version-specific package documentation as context for coding agents. It is an alternative to context7.
```shell
claude mcp add --transport stdio marckrenn-rtfmbro-mcp -- \
  docker run -i \
  --env RTFMBRO_BASE_URL="https://rtfmbro.smolosoft.dev/mcp/" \
  --env RTFMBRO_CACHE_DIR="path/to/cache (optional)" \
  rtfmbro/mcp:latest
```
How to use
rtfmbro is an MCP server that provides real-time, version-aware package documentation. It exposes tools that let AI agents fetch and inspect documentation for exact package versions across ecosystems such as Python and npm. The server offers four primary tools:
- get_readme: retrieves the README for a given package version
- get_documentation_tree: generates a complete tree view of all documentation files
- read_files: fetches specific documentation files, with optional line ranges
- search_github_repositories: discovers repositories via the GitHub API
To use the server, connect via an MCP client and call the desired tool with the required parameters (package, version, ecosystem). The results give you precise, version-matched documentation taken directly from the repository, so your agents work with up-to-date, contextually accurate material.
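As an illustration, an MCP tool call is a JSON-RPC 2.0 message sent to the server over stdio. The sketch below builds a hypothetical tools/call request for get_readme; the argument names (package, version, ecosystem) follow the description above, but the exact payload shape is an assumption, not the server's documented schema.

```shell
# Sketch: construct a JSON-RPC 2.0 tools/call request for get_readme.
# The argument names below are assumptions based on the description above.
REQUEST='{"jsonrpc": "2.0", "id": 1, "method": "tools/call",
 "params": {"name": "get_readme",
            "arguments": {"package": "requests",
                          "version": "2.32.3",
                          "ecosystem": "pypi"}}}'
# An MCP client would pipe a message like this to the stdio server, e.g.:
#   printf '%s\n' "$REQUEST" | docker run -i rtfmbro/mcp:latest
printf '%s\n' "$REQUEST"
```

In practice your MCP client handles this framing for you; the sketch only shows what crosses the wire.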
How to install
Prerequisites:
- Docker installed and running on the host
- Access to pull the rtfmbro MCP image from Docker Hub (rtfmbro/mcp:latest)
Installation steps:
1. Install Docker:
   - macOS and Windows: install Docker Desktop from https://www.docker.com/products/docker-desktop
   - Linux (Debian/Ubuntu):
     sudo apt-get update
     sudo apt-get install docker.io
     sudo systemctl enable --now docker
2. Run the MCP server container:
   docker run -d --name rtfmbro-mcp -i rtfmbro/mcp:latest
3. Verify the server is running:
   docker ps | grep rtfmbro
4. Configure your MCP client to connect to the server. With the stdio transport the client launches the container itself; if your deployment exposes an HTTP endpoint instead, point the client at http://localhost:<port> or the URL provided by your deployment.
5. Optional: customize environment variables by passing -e VAR=value flags to the container at runtime.
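The steps above can be combined into a single invocation with a persistent cache. The environment variable names come from the install command earlier in this page; the mount point inside the container (/cache) is an assumption for illustration, not a documented path.

```shell
# Sketch: run the server with a host-mounted cache directory, assuming
# the container honors RTFMBRO_CACHE_DIR for its cache location.
# The in-container path /cache is an assumption, not documented here.
mkdir -p "$HOME/.cache/rtfmbro"
docker run -i \
  -v "$HOME/.cache/rtfmbro:/cache" \
  --env RTFMBRO_CACHE_DIR="/cache" \
  --env RTFMBRO_BASE_URL="https://rtfmbro.smolosoft.dev/mcp/" \
  rtfmbro/mcp:latest
```

Mounting the cache this way lets fetched documentation survive container restarts, as noted in the tips below.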
Additional notes
Tips and caveats:
- rtfmbro supports Python (PyPI) and Node.js (npm) ecosystems with full documentation fetch capabilities; other ecosystems may be in varying states of support.
- The tools return data in plain text or structured strings; integrate parsing as needed in your agent.
- If documentation files are large, consider using read_files with scoped paths and line ranges to minimize payloads.
- Ensure your MCP client handles rate limits when using search_github_repositories to avoid hitting GitHub API limits.
- When running in Docker, you may want to mount a local cache directory and set RTFMBRO_CACHE_DIR to persist results across restarts.
- If you encounter CORS or connectivity issues, verify container networking and any reverse proxy configuration in your deployment.
Related MCP Servers
PPTAgent
An Agentic Framework for Reflective PowerPoint Generation
building-an-agentic-system
An in-depth book and reference on building agentic systems like Claude Code
agentql
Model Context Protocol server that integrates AgentQL's data extraction capabilities.
zerodha
Zerodha MCP Server & Client - AI Agent (w/Agno & w/Google ADK)
fast-telegram
Telegram MCP Server and HTTP-MTProto bridge | Multi-user auth, intelligent search, file sending, web setup | Docker & PyPI ready
mcp-docy
A Model Context Protocol server that provides documentation access capabilities. This server enables LLMs to search and retrieve content from documentation websites by scraping them with crawl4ai. Built with FastMCP v2.