AutoDocs
We handle what engineers and IDEs won't: generating and maintaining technical documentation for your codebase, while also providing search with dependency-aware context to help your AI tools understand your codebase and its conventions.
claude mcp add --transport stdio trysita-autodocs -- docker run -i \
  --env GITHUB_TOKEN="<your-token>" trysita/autodocs

GITHUB_TOKEN is optional: set a GitHub Personal Access Token if you hit API rate limits or need access to private repos. Note that docker run flags such as --env must come before the image name.
How to use
AutoDocs provides an MCP server that lets coding agents query a repository's automatically generated documentation and code insights. The server exposes a codebase-qna tool through which clients ask repository-scoped questions and receive answers grounded in the AST parsing, dependency graph, and analysis databases that AutoDocs produces. To begin, run the MCP server (typically via Docker, as described in the installation steps) and point your MCP client at the server's /api/mcp endpoint. Include an x-repo-id header carrying the repository's identifier from the UI to scope queries to a specific project. The server is designed to work with common MCP tooling and supports queries that leverage the underlying FastAPI ingestion/search backend and the Next.js UI that AutoDocs provides for exploration and chat.
The MCP endpoint exposes a codebase-qna tool that can answer questions like:
- What functions define a particular feature or module in this repo?
- Where is a given symbol defined and where is it used across the codebase?
- What are the dependencies and call graphs for a specific file or module?
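Questions like these can be sent to the tool over MCP's streamable HTTP transport as a JSON-RPC tools/call request. The sketch below is illustrative: the argument name "question" and the example question are assumptions, not the tool's documented schema; list the real input schema with a tools/list request first.

```shell
# Hypothetical sketch of a tools/call request to the codebase-qna tool.
# "question" is an assumed argument name; verify it via tools/list.
PAYLOAD='{"jsonrpc":"2.0","id":1,"method":"tools/call",
  "params":{"name":"codebase-qna",
            "arguments":{"question":"Where is the ingestion entry point defined?"}}}'
curl -s http://localhost:3000/api/mcp \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -H "x-repo-id: <your-repo-id>" \
  -d "$PAYLOAD" || true   # fails harmlessly when no server is running locally
```

Replace `<your-repo-id>` with the identifier shown in the AutoDocs UI; without that header, queries are not scoped to a project.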
For setup guides and integrations with popular clients (Claude, Cursor, Continue), refer to the official docs linked in the project notes. When using the MCP API, ensure you point to the MCP path at http://localhost:3000/api/mcp (or your deployment URL) and pass the repository context via the x-repo-id header to receive precise, repository-scoped results.
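For clients that speak HTTP-transport MCP directly, registration might look like the following sketch. The --header flag and exact syntax are assumptions about the Claude Code CLI; consult your client's own documentation (Claude, Cursor, Continue) for the authoritative form.

```shell
# Hypothetical sketch: register the hosted MCP endpoint with an HTTP-transport client.
MCP_URL="http://localhost:3000/api/mcp"
REPO_HEADER="x-repo-id: <your-repo-id>"   # repo id comes from the AutoDocs UI
claude mcp add --transport http trysita-autodocs "$MCP_URL" \
  --header "$REPO_HEADER" || true   # no-op if the claude CLI is not installed
```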
How to install
Prerequisites
- Docker and Docker Compose installed on your machine
- Optional: pnpm (Node.js 20+ recommended) and uv (Python package manager) if you plan to run non-Docker flows locally
Installation steps
- Install prerequisites
- Install Docker: follow https://docs.docker.com/engine/install/
- Install Docker Compose (if not included with your Docker installation)
- Optional development tools: pnpm and uv per the project docs
- Clone the repository (if you haven’t already)
- git clone https://github.com/TrySita/AutoDocs.git
- cd AutoDocs
- Run the MCP server (Docker-based)
- Ensure Docker is running
- Start the services via Docker (as recommended by the project):
docker compose up -d
This will bring up the API, web UI, and database components as described in the project notes. If you want to see logs live, run:
docker compose up
- Verify the MCP endpoint
- Open http://localhost:3000/api/mcp to confirm the MCP server is reachable and note the required x-repo-id header for repository-scoped queries.
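The reachability check above can be scripted; this sketch assumes the default localhost:3000 deployment and only prints the HTTP status code.

```shell
# Quick reachability check for the MCP endpoint (adjust host/port to your deployment).
MCP_URL="http://localhost:3000/api/mcp"
# -o /dev/null discards the response body; -w prints only the HTTP status code.
curl -s -o /dev/null -w "%{http_code}\n" "$MCP_URL" || true   # ignore failure if the stack is down
```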
- Optional: Development workflow without Docker Compose
- If you prefer running API + Web + DB locally without Docker Compose, follow the project’s dev.sh script guidance:
./tools/dev.sh --sync
Prerequisites for local development (non-Docker):
- Node.js and pnpm for the web UI
- Python and uv for ingestion and API services
- A local PostgreSQL database if not using the provided containers
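Before running ./tools/dev.sh --sync, it can help to confirm the tools above are on your PATH. The tool list below is an assumption derived from the prerequisites, not an official checklist.

```shell
# Sanity-check non-Docker prerequisites; the list mirrors the bullets above.
for tool in node pnpm python3 uv psql; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "found: $tool"
  else
    echo "missing: $tool"
  fi
done
```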
Additional notes
Tips and known considerations:
- The MCP server expects repository content to be accessible and indexed by AutoDocs; ensure the repo is ingested and analyzed before querying the MCP endpoint.
- If you encounter networking issues, verify that the MCP endpoint URL matches your deployment (e.g., http://localhost:3000/api/mcp) and that the x-repo-id header corresponds to a valid repository in your UI.
- Environment variables like GITHUB_TOKEN are optional but helpful for avoiding API rate limits or accessing private repos; do not commit tokens to source control.
- The project supports multiple languages; results are most accurate when the target repository's primary language is one AutoDocs can parse, so that its code is fully represented in the analysis databases.
- If using Docker, ensure you’re using the correct image tag or latest image as maintained by TrySita; consult the repo for the recommended image name if you switch versions.
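One way to follow the token advice above is to read GITHUB_TOKEN from the shell environment rather than writing it into any tracked file; this sketch shows that pattern.

```shell
# Read GITHUB_TOKEN from the environment instead of hardcoding it.
# Keep real tokens in an untracked .env file or a secret manager.
export GITHUB_TOKEN="${GITHUB_TOKEN:-}"   # empty is acceptable for public repos
if [ -n "$GITHUB_TOKEN" ]; then
  echo "token set: authenticated GitHub API rate limits apply"
else
  echo "no token: unauthenticated rate limits apply"
fi
```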