amanmcp
AmanMCP is a local-first RAG MCP server for developers, providing hybrid search (BM25 + semantic) over codebases for AI assistants like Claude Code and Cursor.
Add it to Claude Code with:
claude mcp add --transport stdio aman-cerp-amanmcp amanmcp
How to use
AmanMCP runs locally on your machine to provide fast, private codebase search and retrieval for Claude. It indexes your project code and exposes it via the MCP protocol, enabling hybrid search (BM25 plus semantic embeddings), AST-aware chunking, and multi-language support. The Quick Start shows how to install and initialize AmanMCP so that it automatically pulls the necessary Ollama model and builds an index for your repository. Once it is running, you can ask Claude to search your codebase or run code and documentation queries through the tools AmanMCP exposes: search, search_code, and search_docs. These tools are designed for fast, context-rich results while keeping all data on your machine.
When connected to Claude, you can issue queries like “Find the function that handles database connections” or “Show me all functions related to authentication”, and Claude will route them through the AmanMCP index. AmanMCP combines vector-based semantic search with traditional BM25, and its AST-aware chunking preserves meaningful code boundaries, so results stay accurate and actionable across multiple languages. The documentation includes a full command reference and configuration options if you need deeper control over indexing, backends, or performance tuning.
How to install
Prerequisites:
- A macOS or Linux environment with a working Go toolchain (Go 1.16+ recommended) and curl/git installed.
- Ollama installed for local model serving (brew install ollama).
- Homebrew (on macOS) for easy installation of AmanMCP (brew tap Aman-CERP/tap && brew install amanmcp).
Install and setup:
- Install Ollama if you haven't already:
  - brew install ollama
- Install AmanMCP via Homebrew:
  - brew tap Aman-CERP/tap
  - brew install amanmcp
- Initialize the project in your codebase (this will start Ollama, pull the necessary model, and index your files):
  - cd /path/to/your-project
  - amanmcp init
- Ensure the AmanMCP service is running and accessible locally. You should be able to issue searches to Claude through the MCP protocol after initialization.
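Taken together, the steps above can be run as one shell session. This is a sketch of the Quick Start flow, assuming Homebrew is installed and /path/to/your-project is a placeholder for your repository:

```shell
# Install Ollama for local model serving
brew install ollama

# Install AmanMCP from the Aman-CERP Homebrew tap
brew tap Aman-CERP/tap
brew install amanmcp

# Initialize inside the repository you want indexed:
# starts Ollama, pulls the required model, and builds the index
cd /path/to/your-project
amanmcp init
```

After init completes, register the server with Claude Code (see the command at the top of this page) and searches become available through MCP.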
Notes:
- The Quick Start expects Ollama to be available and will automatically handle model loading and indexing during init.
- If you need to stop or restart, use standard process controls for the binary installed by Homebrew (amanmcp).
Additional notes
Environment variables and configuration hints:
- If you run into indexing or model loading issues, ensure Ollama is up and reachable and that the required model is present in Ollama's store.
- AmanMCP is designed for local, privacy-preserving search; ensure your codebase is not publicly exposed and that you start the MCP service in a trusted environment.
- The mcp_config currently assumes the AmanMCP binary name is available on PATH; adjust the command/args if you install it in a non-standard location.
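If the binary is not on PATH, register it with Claude Code using its absolute path instead. A minimal sketch, reusing the server name from the Quick Start; /opt/amanmcp/bin/amanmcp is a hypothetical install location, so substitute your own:

```shell
# Register AmanMCP with an explicit binary path
# (/opt/amanmcp/bin/amanmcp is a placeholder for a non-standard install)
claude mcp add --transport stdio aman-cerp-amanmcp /opt/amanmcp/bin/amanmcp
```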
- For larger codebases, consider enabling incremental re-indexing (as documented in guides) to keep the search index up-to-date without manual reindexing.
- If you require a non-default runtime, consult the docs for backend or environment options (e.g., ML backends, embedding setups).
Related MCP Servers
claude-talk-to-figma
A Model Context Protocol (MCP) that allows Claude Desktop and other AI tools (Claude Code, Cursor, Antigravity, etc.) to read, analyze, and modify Figma designs
mcp-local-rag
Local-first RAG server for developers using MCP. Semantic + keyword search for code and technical docs. Fully private, zero setup.
ollama
An MCP Server for Ollama
mem0-selfhosted
Self-hosted mem0 MCP server for Claude Code. Run a complete memory server against self-hosted Qdrant + Neo4j + Ollama while using Claude as the main LLM.
statelessagent
Your AI forgets everything between sessions. SAME fixes that. Local-first, no API keys, single binary.
mcproc
A Model Context Protocol (MCP) server for comfortable background process management on AI agents.