directory-indexer
Directory Indexer MCP Server - a local MCP server that indexes your local directories into a knowledge base for your AI assistants.
    claude mcp add --transport stdio peteretelej-directory-indexer npx directory-indexer@latest serve \
      --env OLLAMA_ENDPOINT="http://localhost:11434" \
      --env QDRANT_ENDPOINT="http://localhost:6333" \
      --env DIRECTORY_INDEXER_DATA_DIR="/path/to/your/data"
How to use
Directory Indexer turns your local directories into an AI-powered knowledge base. It uses an embedding provider (such as Ollama) to create vector representations of your files and stores them in a Qdrant vector database for fast semantic search. Once indexed, your AI assistant can query and retrieve relevant content from your files using natural language, so you can ask questions like "Find API authentication examples" or "Show me error handling patterns" and get results drawn from your actual documents. The MCP server can be launched alongside your AI assistant, and indexing runs independently of MCP startup, so you don't have to wait for indexing to finish before using the search capability.
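The ranking behind semantic search is plain vector similarity: the embedding provider maps each chunk of text to a vector, and the query's vector is compared against the stored vectors. A minimal Python sketch of that idea, using made-up toy vectors rather than real embeddings (file names and numbers below are purely illustrative):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: closer to 1.0 = more similar."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy 3-dimensional "embeddings" (real models like nomic-embed-text
# produce vectors with hundreds of dimensions).
documents = {
    "auth_guide.md":  [0.9, 0.1, 0.0],
    "error_notes.md": [0.1, 0.8, 0.3],
    "meeting.md":     [0.0, 0.2, 0.9],
}
query = [0.8, 0.2, 0.1]  # pretend embedding of "API authentication examples"

# Rank documents by similarity to the query, best match first.
ranked = sorted(documents, key=lambda d: cosine_similarity(query, documents[d]), reverse=True)
print(ranked[0])  # → auth_guide.md
```

Qdrant performs this kind of comparison at scale, returning the nearest stored vectors for each query.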
To use it with an AI assistant, configure the MCP integration to run the directory-indexer serve command, index your directories, and then ask your assistant to perform semantic searches over the indexed content. You can organize workspaces for focused searches, and customize endpoints and data directories via environment variables in the MCP configuration.
How to install
Prerequisites:
- Docker (recommended for running Qdrant and Ollama), or native installations if preferred
- Node.js 18+ installed on your system
Install dependencies and set up prerequisites:
- Install Docker from the official site: https://docs.docker.com/get-docker/
- Install Node.js 18+ from https://nodejs.org/
- Run the Qdrant vector database (Docker option):

      docker run -d --name qdrant -p 127.0.0.1:6333:6333 -v qdrant_storage:/qdrant/storage qdrant/qdrant
- Run the Ollama embedding service (Docker option or native installation):

      docker run -d --name ollama -p 127.0.0.1:11434:11434 -v ollama:/root/.ollama ollama/ollama

  Pull the embedding model:

      docker exec ollama ollama pull nomic-embed-text
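To verify the model works outside the indexer, you can call Ollama's embeddings HTTP API directly. A minimal Python sketch, assuming the documented POST /api/embeddings route and a locally running Ollama (the service must be up for embed() to succeed):

```python
import json
import urllib.request

OLLAMA_ENDPOINT = "http://localhost:11434"  # default Ollama endpoint

def build_embed_request(text: str, model: str = "nomic-embed-text") -> bytes:
    """Build the JSON payload for Ollama's /api/embeddings route."""
    return json.dumps({"model": model, "prompt": text}).encode()

def embed(text: str) -> list:
    """Request an embedding vector from a locally running Ollama."""
    req = urllib.request.Request(
        OLLAMA_ENDPOINT + "/api/embeddings",
        data=build_embed_request(text),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # The response body is JSON with an "embedding" array of floats.
        return json.loads(resp.read())["embedding"]
```

A non-empty vector back from embed("hello world") confirms the model was pulled correctly.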
- Install the Directory Indexer package (via npm):

      npm install -g directory-indexer@latest
- Index your directories (example):

      npx directory-indexer@latest index ./WorkNotes ./Projects
- Start the MCP server to serve the index (example):

      npx directory-indexer@latest serve
- Optional: configure MCP integration with your AI assistant using the provided MCP config example:

      {
        "mcpServers": {
          "directory-indexer": {
            "command": "npx",
            "args": ["directory-indexer@latest", "serve"]
          }
        }
      }
Additional notes
Environment variables and configuration tips:
- DIRECTORY_INDEXER_DATA_DIR: path where the index data will be stored (default is ~/.directory-indexer)
- QDRANT_ENDPOINT: endpoint URL for the Qdrant vector database (default http://localhost:6333)
- OLLAMA_ENDPOINT: endpoint URL for the Ollama embedding service (default http://localhost:11434)
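These variables can also be set directly in the MCP configuration rather than in your shell. Assuming your MCP client supports an env block in its server entries (Claude Desktop's config format does), the serve entry extends to something like this (endpoints shown are the defaults; the data directory path is a placeholder):

```json
{
  "mcpServers": {
    "directory-indexer": {
      "command": "npx",
      "args": ["directory-indexer@latest", "serve"],
      "env": {
        "DIRECTORY_INDEXER_DATA_DIR": "/path/to/your/data",
        "QDRANT_ENDPOINT": "http://localhost:6333",
        "OLLAMA_ENDPOINT": "http://localhost:11434"
      }
    }
  }
}
```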
Common issues:
- If indexing runs slowly, ensure Ollama and Qdrant are reachable at their endpoints and that the embedding model is available.
- On Windows, you may need to adapt file paths for workspaces and ensure MCP config uses proper escaping.
- If the MCP server fails to start, check that Qdrant (port 6333) and Ollama (port 11434) are running and that those ports are not claimed by other processes.
Workspaces and advanced usage:
- You can define multiple workspaces to constrain searches to specific directories.
- Use the provided examples to customize commands and environment variables for your setup.