
qdrant-loader

Enterprise-ready vector database toolkit for building searchable knowledge bases from multiple data sources. Supports multi-project management, automatic ingestion from Confluence/JIRA/Git, intelligent file conversion (PDF/Office/images), and semantic search. Includes MCP server for seamless AI assistant integration.

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio martin-papy-qdrant-loader mcp-qdrant-loader \
  --env QDRANT_URL="http://localhost:6333" \
  --env LLM_BASE_URL="https://api.openai.com/v1" \
  --env LLM_PROVIDER="openai" \
  --env LLM_CHAT_MODEL="gpt-4o-mini" \
  --env OPENAI_API_KEY="your_openai_key" \
  --env LLM_EMBEDDING_MODEL="text-embedding-3-small" \
  --env QDRANT_COLLECTION_NAME="my_docs"

How to use

The qdrant-loader MCP server provides an integration point for AI development tools to perform semantic search, advanced document reasoning, and knowledge-graph-style interactions over your Qdrant-backed data. It exposes an MCP 2025-06-18 compliant server with HTTP transport and streaming via Server-Sent Events (SSE), enabling real-time results in supported clients.

Use it with tools like Cursor, Windsurf, or other MCP-enabled environments to search across loaded documents, leverage hierarchy-aware search, and discover cross-document relationships. The server reads its Qdrant and LLM provider settings from environment variables, and clients such as Cursor connect by pointing at the MCP launch command or endpoint, supplying authentication/session details as needed.

A typical workflow: configure your workspace, load data into Qdrant, then start the MCP server to expose search capabilities to your AI tools. For Cursor, specify the command that launches the MCP server along with the necessary environment variables, either inline or via a configuration file with explicit environment settings and optional arguments for custom config paths.
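The Cursor integration described above can be sketched as a project-level `.cursor/mcp.json`. This follows Cursor's `mcpServers` config shape; the server name key (`qdrant-loader`) and all values below are illustrative placeholders to adapt to your setup:

```shell
# Sketch: create a project-level Cursor MCP config that launches the server
# with the environment variables it needs. Values are placeholders.
mkdir -p .cursor
cat > .cursor/mcp.json <<'EOF'
{
  "mcpServers": {
    "qdrant-loader": {
      "command": "mcp-qdrant-loader",
      "env": {
        "QDRANT_URL": "http://localhost:6333",
        "QDRANT_COLLECTION_NAME": "my_docs",
        "LLM_PROVIDER": "openai",
        "LLM_BASE_URL": "https://api.openai.com/v1",
        "LLM_CHAT_MODEL": "gpt-4o-mini",
        "LLM_EMBEDDING_MODEL": "text-embedding-3-small",
        "OPENAI_API_KEY": "your_openai_key"
      }
    }
  }
}
EOF
```

After restarting Cursor, the server should appear under its MCP settings; the same object can live in a user-level config instead if you want it available across projects.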

How to install

Prerequisites:

  • Python 3.8+ and pip
  • Access to a running Qdrant instance (or a reachable Qdrant endpoint)
  • Optional: a configured OpenAI API key or another LLM provider

Install both packages for full functionality:

pip install qdrant-loader qdrant-loader-mcp-server

Or install individually (for data ingestion or MCP server only):

pip install qdrant-loader          # Data ingestion only
pip install qdrant-loader-mcp-server  # MCP server only

Quick Start (example):

# 1) Create a workspace and ingest data (examples may vary by project)
mkdir my-workspace && cd my-workspace
qdrant-loader init --workspace .
# 2) Load data into Qdrant (adjust as needed)
qdrant-loader ingest --workspace .
# 3) Start the MCP server (requires .env or environment variables set)
mcp-qdrant-loader --env /path/to/your/.env

Environment setup tips:

  • Create a .env file or export variables in your shell for QDRANT_URL, QDRANT_COLLECTION_NAME, OPENAI_API_KEY, and LLM provider settings before starting the MCP server.
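As a starting point, a minimal `.env` might look like the following. The variable names match those used in the install command above; every value is a placeholder to replace with your own settings:

```shell
# Example .env (sketch; all values are placeholders)
QDRANT_URL=http://localhost:6333
QDRANT_COLLECTION_NAME=my_docs
LLM_PROVIDER=openai
LLM_BASE_URL=https://api.openai.com/v1
LLM_CHAT_MODEL=gpt-4o-mini
LLM_EMBEDDING_MODEL=text-embedding-3-small
OPENAI_API_KEY=your_openai_key
```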

Additional notes

Tips and common points:

  • Ensure Qdrant is reachable at the specified QDRANT_URL and the target collection exists.
  • When migrating configurations, use the new unified LLM configuration (global.llm.*) as recommended.
  • For Cursor integration, point to the MCP server command and environment, or use a configuration object with the appropriate env and optional args as shown in the examples.
  • If you see authentication or transport-related errors, verify that the environment variables (OPENAI_API_KEY, endpoint URLs, and access permissions) are correctly set.
  • The MCP server supports HTTP transport with security and health checks; consider enabling TLS/HTTPS in production and configuring health endpoints as needed.
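To catch the missing-variable errors mentioned above before launching the server, a small pre-flight check can help. The `require_env` helper here is a hypothetical convenience, not part of the package; the variable names are the ones the install command sets:

```shell
# Sketch: fail fast if a required environment variable is unset or empty.
require_env() {
  for var in "$@"; do
    if [ -z "$(printenv "$var")" ]; then
      echo "Missing required environment variable: $var" >&2
      return 1
    fi
  done
  echo "environment OK"
}

# Placeholder values for illustration; use your real settings.
export QDRANT_URL="http://localhost:6333"
export QDRANT_COLLECTION_NAME="my_docs"
export OPENAI_API_KEY="your_openai_key"

require_env QDRANT_URL QDRANT_COLLECTION_NAME OPENAI_API_KEY
```

Run the check in the same shell session you start the MCP server from, so the server inherits exactly the variables that were verified.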
