qdrant
MCP server for semantic search over a local Qdrant vector database with multiple embedding providers (Ollama by default, plus OpenAI, Cohere, and Voyage AI)
claude mcp add --transport stdio mhalder-qdrant-mcp-server node build/index.js \
  --env QDRANT_URL="http://localhost:6333" \
  --env COHERE_API_KEY="your-cohere-api-key" \
  --env OPENAI_API_KEY="your-openai-api-key" \
  --env VOYAGE_API_KEY="your-voyage-api-key" \
  --env EMBEDDING_PROVIDER="ollama"
How to use
This MCP server provides semantic search over a local Qdrant vector store with multiple embedding providers. By default it uses Ollama for embeddings, enabling privacy-first, local embedding generation without external API keys.

It exposes a suite of tools for managing collections and documents, indexing code, and querying git history, including semantic and hybrid search, incremental indexing, and structured logging. The built-in tooling supports creating and listing collections, adding and searching documents, indexing codebases with AST-aware chunking, and querying git history for semantic insights. You can run the server locally via stdio or expose it over HTTP for remote access, and configure prompts to guide workflows.

To switch embedding providers (OpenAI, Cohere, Voyage AI), set environment variables such as EMBEDDING_PROVIDER and the corresponding provider-specific API key; Qdrant stores the embeddings and documents locally, or remotely if configured. For best results, ensure your environment has Node.js 22.x or 24.x and a container runtime (Podman or Docker) available for embedding model pulls and any required services.
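As a sketch of what a client interaction looks like: MCP tools are invoked over JSON-RPC. The tool name `semantic_search` and its arguments below are illustrative placeholders, not names confirmed by this page; the server's actual tool names are returned by its `tools/list` response.

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "semantic_search",
    "arguments": {
      "collection": "docs",
      "query": "how do I rotate API keys?",
      "limit": 5
    }
  }
}
```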
How to install
Prerequisites:
- Node.js 22.x or 24.x
- npm (bundled with Node.js)
- Git
- Podman or Docker (for local embedding model and containerized services)
Install and run locally:
- Clone the repository
git clone https://github.com/mhalder/qdrant-mcp-server.git
cd qdrant-mcp-server
- Install dependencies
npm install
- Build the server
npm run build
- Start the server using the stdio transport (local, in-process), or configure the HTTP transport as needed
# Example: run the built server via Node.js (stdio transport)
node build/index.js
Optional: Start with remote HTTP transport (example)
TRANSPORT_MODE=http HTTP_PORT=3000 node build/index.js
- Register the MCP server with Claude (example)
claude mcp add --transport stdio qdrant -- node /path/to/qdrant-mcp-server/build/index.js
Configuration notes: You can also add the server to your Claude config file (~/.claude.json) under mcpServers with type, command, and args as shown in the README. For secured Qdrant or cloud deployments, supply API keys via environment variables as described in the README.
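For reference, a minimal mcpServers entry might look like the following sketch; the exact schema is described in the README, and the path and environment values here are placeholders:

```json
{
  "mcpServers": {
    "qdrant": {
      "type": "stdio",
      "command": "node",
      "args": ["/path/to/qdrant-mcp-server/build/index.js"],
      "env": {
        "QDRANT_URL": "http://localhost:6333",
        "EMBEDDING_PROVIDER": "ollama"
      }
    }
  }
}
```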
Additional notes
Tips and caveats:
- If you plan to use HTTP transport in production, place the server behind a reverse proxy (nginx, Caddy) with HTTPS and implement authentication/authorization at the proxy level.
- When using OpenAI/Cohere/Voyage providers, set EMBEDDING_PROVIDER and the corresponding API keys in environment variables.
- For local embeddings and privacy, Ollama is the default provider; ensure Ollama is running and the nomic-embed-text model is pulled (podman/docker pull commands are available in the quick start).
- The server supports incremental indexing and structured logging (JSON via Pino). Enable verbose logs for debugging if you encounter issues.
- If you switch transport modes or providers, restart the server to pick up the new configuration.
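Putting the provider tips above together, switching from the default Ollama provider to OpenAI embeddings is a matter of setting the environment variables named on this page and restarting the server. The key value below is a placeholder:

```shell
# Switch from the default Ollama provider to OpenAI embeddings.
# Variable names are those used on this page; the key value is a placeholder.
export EMBEDDING_PROVIDER=openai
export OPENAI_API_KEY="sk-your-openai-key"
export QDRANT_URL="http://localhost:6333"
# Restart the server so the new configuration takes effect:
# node build/index.js
```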
Related MCP Servers
claude-self-reflect
Claude forgets everything. This fixes that. 🔗 www.npmjs.com/package/claude-self-reflect
mcp-memory-libsql
🧠 High-performance persistent memory system for Model Context Protocol (MCP) powered by libSQL. Features vector search, semantic knowledge storage, and efficient relationship management - perfect for AI agents and knowledge graph applications.
obsidian
MCP server for Obsidian vault management - enables Claude and other AI assistants to read, write, search, and organize your notes
opencode-personal-knowledge
🧠 Personal knowledge MCP server with vector database for Opencode. Store and retrieve knowledge using semantic search, powered by local embeddings.
elasticsearch-memory
🧠 Elasticsearch-powered MCP server with hierarchical memory categorization, intelligent auto-detection, and batch review capabilities
mcp-discovery
MCP Discovery - the world's largest MCP server index: 14,000+ servers | semantic search API | real-time discovery. Live API: https://mcp-discovery-two.vercel.app