kodit
👩‍💻 MCP server to index external repositories
Register Kodit with Claude Code via the Claude CLI. Each quoted string below is a description of the value to supply; replace it with your actual setting before running:

claude mcp add --transport stdio helixml-kodit node server.js \
  --env PORT="Port to run the MCP server on (default 3000 or as configured by environment)" \
  --env LOG_LEVEL="Logging verbosity (e.g., info, debug, warn)" \
  --env INDEX_PATH="Path to the local code index or repository store" \
  --env DATABASE_URL="Connection string for the index database (if using an external DB)" \
  --env OPENAI_API_KEY="OpenAI-compatible API key if using OpenAI-compatible providers"
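For clients that are configured through a JSON file rather than the Claude CLI, an equivalent `mcpServers` entry might look like the following sketch. The command and environment variable names mirror the command above; every value shown is a placeholder to replace with your own:

```json
{
  "mcpServers": {
    "helixml-kodit": {
      "command": "node",
      "args": ["server.js"],
      "env": {
        "PORT": "3000",
        "LOG_LEVEL": "info",
        "INDEX_PATH": "/path/to/index",
        "DATABASE_URL": "postgres://user:pass@localhost:5432/kodit",
        "OPENAI_API_KEY": "sk-your-key-here"
      }
    }
  }
}
```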
How to use
Kodit is an MCP server that indexes your codebases and exposes code snippets to your AI coding assistant. Once running, it builds a searchable index of local and remote repositories, supports keyword and semantic search, and attaches rich context parameters and metadata to improve the accuracy of the code examples your assistant returns. It integrates with popular AI coding assistants and can be deployed alongside other enterprise tooling.

To use Kodit, start the MCP server and connect your coding assistant to the Kodit MCP endpoint. You can then issue queries like "show me Python examples of REST clients" or filter results by language, repository, author, or date range. The server also supports hybrid search (BM25 combined with semantic search) and context-aware snippet extraction that includes dependencies, reducing hallucinations in generated code.
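The hybrid search mentioned above typically fuses a lexical (BM25) score with a semantic similarity score per candidate snippet. The sketch below illustrates the general score-fusion technique in Node.js; it is not Kodit's actual implementation, and the weighting and min-max normalization are assumptions:

```javascript
// Fuse a lexical (BM25) score with a semantic (cosine) score per document.
// Both score lists are min-max normalized so they are comparable, then
// combined with weight `alpha` on the semantic side. Illustrative only.
function normalize(scores) {
  const min = Math.min(...scores);
  const max = Math.max(...scores);
  if (max === min) return scores.map(() => 0);
  return scores.map((s) => (s - min) / (max - min));
}

function hybridRank(docs, alpha = 0.5) {
  const bm25 = normalize(docs.map((d) => d.bm25));
  const sem = normalize(docs.map((d) => d.semantic));
  return docs
    .map((d, i) => ({ id: d.id, score: alpha * sem[i] + (1 - alpha) * bm25[i] }))
    .sort((a, b) => b.score - a.score);
}

// Example: three candidate snippets with raw scores from each retriever.
const ranked = hybridRank([
  { id: "rest_client.py", bm25: 12.0, semantic: 0.91 },
  { id: "http_utils.py",  bm25: 15.0, semantic: 0.40 },
  { id: "cli_parser.py",  bm25: 3.0,  semantic: 0.10 },
]);
console.log(ranked[0].id); // → "rest_client.py"
```

A snippet that scores well on both retrievers outranks one that wins on keywords alone, which is the intuition behind combining BM25 with embeddings.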
How to install
Prerequisites:
- Node.js (14.x or newer) and npm
- Git
- Optional: Docker if you prefer containerized deployment
- Clone the repository:
  - git clone https://github.com/helixml/kodit.git
  - cd kodit
- Install dependencies:
  - npm install
- Configure the environment:
  - cp .env.example .env
  - Edit .env to set OPENAI_API_KEY, DATABASE_URL (if using an external DB), PORT, INDEX_PATH, etc.
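As an illustration, a filled-in .env might look like the following. The variable names come from the steps above; every value is a placeholder to replace with your own configuration:

```
# .env — example values only; replace with your own settings
PORT=3000
LOG_LEVEL=info
INDEX_PATH=/var/lib/kodit/index
DATABASE_URL=postgres://kodit:secret@localhost:5432/kodit
OPENAI_API_KEY=sk-your-key-here
```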
- Run the server:
  - npm run build (if the project includes a build step)
  - npm start
- (Optional) Run with Docker:
  - Create a docker-compose.yaml, or use a container image if the project maintainers provide one.
  - Example (adjust the image name and environment as needed): docker run -p 3000:3000 -e OPENAI_API_KEY=your_key -e INDEX_PATH=/code/index kodit:latest
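If you prefer Compose over a bare docker run, a minimal docker-compose.yaml along the lines of the example above might look like this. The image name, volume path, and environment values are assumptions to adapt to your deployment:

```yaml
services:
  kodit:
    image: kodit:latest
    ports:
      - "3000:3000"
    environment:
      OPENAI_API_KEY: your_key
      INDEX_PATH: /code/index
    volumes:
      - kodit-index:/code/index   # persist the index across restarts
volumes:
  kodit-index:
```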
- Verify:
  - Access the MCP endpoint at http://localhost:3000/mcp (or the port you configured) and try a sample query via your coding assistant integration.
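A quick smoke test from the command line, assuming the server exposes the MCP endpoint over HTTP at the path above (adjust the port if you changed it):

```
# Expect an HTTP response (not a connection error) if the server is up.
curl -i http://localhost:3000/mcp
```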
Additional notes
Tips and common considerations:
- Environment variables: Ensure OPENAI_API_KEY (or other model provider keys) are set if you rely on external models.
- INDEX_PATH and DATABASE_URL: Use a persistent store for the index to avoid data loss between restarts. For heavy usage, consider a dedicated database or vector store as recommended by the docs.
- Privacy and security: Respect repository access controls when indexing private repos (use PATs or appropriate credentials).
- Integration: Kodit is designed to work with multiple MCP clients; verify the client-specific integration steps in the hosted docs for Cursor, Cline, Claude Code, etc.
- Updates: Regularly reindex to keep snippets up-to-date with source changes.
- Performance: For large codebases, consider incremental or scheduled reindexing to manage resource usage.
- Logging: Increase log level to debug while troubleshooting; reduce after issues are resolved to improve performance.
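The scheduled-reindexing tip above can be sketched as a cron entry. The `kodit index` subcommand shown here is hypothetical; substitute whatever reindex command or API call your deployment actually provides:

```
# Reindex nightly at 02:00 — hypothetical CLI shown for illustration only.
0 2 * * * cd /opt/kodit && kodit index --incremental >> /var/log/kodit-reindex.log 2>&1
```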
Related MCP Servers
mcp-language
mcp-language-server gives MCP-enabled clients access to semantic tools such as get definition, references, rename, and diagnostics.
mcp-victoriametrics
An implementation of a Model Context Protocol (MCP) server for VictoriaMetrics.
mcp-difyworkflow
mcp-difyworkflow-server is an MCP server that supports querying and invoking Dify workflows, including on-demand operation of multiple custom Dify workflows.
mcp
✨ MCP Server for Effortless Dependency Updates
mcp
Teamwork.com MCP server
muster
MCP tool management and workflow proxy