# mcp-server-qdrant
An official Qdrant Model Context Protocol (MCP) server implementation
```shell
claude mcp add qdrant --transport stdio \
  --env QDRANT_URL="URL of the Qdrant server (e.g., http://localhost:6333); leave unset if using QDRANT_LOCAL_PATH" \
  --env QDRANT_API_KEY="API key for the Qdrant server (if required)" \
  --env COLLECTION_NAME="Name of the default collection to use" \
  --env EMBEDDING_MODEL="Embedding model to use (default: sentence-transformers/all-MiniLM-L6-v2)" \
  --env EMBEDDING_PROVIDER="Embedding provider to use (default: fastembed)" \
  --env QDRANT_LOCAL_PATH="Path to a local Qdrant database (alternative to QDRANT_URL)" \
  --env TOOL_STORE_DESCRIPTION="Custom description for the store tool" \
  --env TOOL_FIND_DESCRIPTION="Custom description for the find tool" \
  -- uvx mcp-server-qdrant
```
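For clients configured through a JSON file (such as Claude Desktop), an equivalent entry might look like the following; the values shown are illustrative placeholders, not defaults:

```json
{
  "mcpServers": {
    "qdrant": {
      "command": "uvx",
      "args": ["mcp-server-qdrant"],
      "env": {
        "QDRANT_URL": "http://localhost:6333",
        "COLLECTION_NAME": "my-collection",
        "EMBEDDING_MODEL": "sentence-transformers/all-MiniLM-L6-v2"
      }
    }
  }
}
```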
## How to use
This MCP server exposes two tools built on top of Qdrant: `qdrant-store` and `qdrant-find`. The `qdrant-store` tool saves a piece of information, with optional metadata, to the configured Qdrant collection and returns a confirmation message. The `qdrant-find` tool searches that collection with a query string and returns the matching entries as separate messages.

The server is designed to act as a semantic memory layer over Qdrant, letting LLMs store and retrieve contextual data efficiently. It is configured through environment variables, launched via the uvx runtime, and supports multiple transports (stdio, sse, streamable-http). To use it with Claude Desktop or another client, point the client configuration at the qdrant MCP server and supply the environment variables required for your Qdrant setup.
## How to install
Prerequisites:
- A Python environment with uvx available (Node/NPM is only needed if you install via Smithery; this guide uses uvx).
- Access to a running Qdrant instance, or a filesystem path for a local Qdrant database.
Installation steps:
1. Install uvx (no global install of the server is needed when using the uvx runner directly):
   - Follow the uv installation guide: https://docs.astral.sh/uv/
2. Set the required environment variables, for example:
   - QDRANT_URL="http://localhost:6333"
   - COLLECTION_NAME="my-collection"
   - EMBEDDING_MODEL="sentence-transformers/all-MiniLM-L6-v2"
   - EMBEDDING_PROVIDER="fastembed"
3. Run the MCP server: `uvx mcp-server-qdrant`
Transport options (optional):
- SSE: `uvx mcp-server-qdrant --transport sse`
- Streamable HTTP: `uvx mcp-server-qdrant --transport streamable-http`
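Putting the steps above together, a minimal launch from a shell might look like this. It is a deployment fragment, not a tested script, and every value is a placeholder for your own setup:

```shell
# Example values; substitute your own Qdrant endpoint and collection.
export QDRANT_URL="http://localhost:6333"
export COLLECTION_NAME="my-collection"
export EMBEDDING_MODEL="sentence-transformers/all-MiniLM-L6-v2"
export EMBEDDING_PROVIDER="fastembed"

# stdio is the default transport; pass --transport to switch.
uvx mcp-server-qdrant --transport streamable-http
```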
Additional deployment options:
- Docker: build and run the container as documented in the project README.
- Smithery: `npx @smithery/cli install mcp-server-qdrant --client claude`
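For the Docker route, a build-and-run sequence along these lines is typical. The image tag is a local placeholder and the environment values are illustrative; consult the project README for the authoritative Dockerfile workflow:

```shell
# Build a local image from the repository checkout (tag is a placeholder).
docker build -t mcp-server-qdrant .

# Expose port 8000 and bind to all interfaces so the host can reach it.
docker run -p 8000:8000 \
  -e FASTMCP_HOST=0.0.0.0 \
  -e QDRANT_URL="http://your-qdrant-host:6333" \
  -e COLLECTION_NAME="my-collection" \
  mcp-server-qdrant
```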
Note: Ensure you do not specify both QDRANT_URL and QDRANT_LOCAL_PATH at the same time.
## Additional notes
Tips and common issues:
- If you plan to expose the server publicly, ensure FASTMCP_HOST is set appropriately and that network access to your Qdrant instance is secured.
- The QDRANT_URL and QDRANT_LOCAL_PATH are mutually exclusive; choose one based on your deployment.
- Default environment values can be overridden for quick experiments (e.g., FASTMCP_DEBUG, FASTMCP_PORT).
- The embedding model is configurable via EMBEDDING_MODEL; ensure the model is compatible with your embedding provider.
- When running in Docker, map port 8000 (or the port configured by FASTMCP_PORT) to the host and set FASTMCP_HOST to 0.0.0.0 for container access.
- To customize tool descriptions for Claude Desktop, modify TOOL_STORE_DESCRIPTION and TOOL_FIND_DESCRIPTION in the environment or settings.py as needed.
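As an example of the last point, the tool descriptions could be tailored to a specific workflow via the environment. The wording below is illustrative, not the shipped defaults:

```shell
# Hypothetical descriptions steering the tools toward a code-snippet memory.
export TOOL_STORE_DESCRIPTION="Store reusable code snippets along with a natural-language description of what they do."
export TOOL_FIND_DESCRIPTION="Search stored code snippets by describing the functionality you need."
```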
## Related MCP Servers
fastapi_mcp
Expose your FastAPI endpoints as Model Context Protocol (MCP) tools, with Auth!
lc2mcp
Convert LangChain tools to FastMCP tools
asterisk
Asterisk Model Context Protocol (MCP) server.
janee
Secrets management for AI agents via MCP • @janeesecure
mcp-mattermost
MCP server for Mattermost — let Claude, Cursor, and other AI assistants work with channels, messages, and files
mcp-templates
A flexible platform that provides Docker & Kubernetes backends, a lightweight CLI (mcpt), and client utilities for seamless MCP integration. Spin up servers from templates, route requests through a single endpoint with load balancing, and support both deployed (HTTP) and local (stdio) transports — all with sensible defaults and YAML-based configs.