qdrant-pi5
Persistent semantic memory for AI agents on Raspberry Pi 5 — local Qdrant + MCP, no cloud, ~3s per query
claude mcp add --transport stdio rockywuest-qdrant-mcp-pi5 mcp-server-qdrant \
  --env COLLECTION_NAME="agent-memory" \
  --env EMBEDDING_MODEL="sentence-transformers/all-MiniLM-L6-v2" \
  --env QDRANT_LOCAL_PATH="/home/pi/.qdrant-data"
How to use
This MCP server provides a local, private memory store backed by Qdrant as a vector database. It runs as a stateless, per-call MCP service that embeds memory items with a CPU-based embedding model and writes them to a local on-disk collection. You can store memories and later retrieve them semantically (by meaning, not exact keywords) through a simple MCP bridge (mcporter) that invokes the store and find operations. To use it, integrate the server with mcporter and call the qdrant-memory endpoints (qdrant-store and qdrant-find). The OpenClaw Hard Enforcement Plugin is available as an optional enhancement: it injects memories into prompts before every response, so the LLM never has to decide on its own to query memory.
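To make "retrieve by meaning, not exact keywords" concrete, here is a minimal sketch of the vector-search mechanics behind qdrant-store and qdrant-find: each stored memory is embedded as a vector, and a query is matched by cosine similarity against those vectors. The tiny 3-dimensional vectors below are hypothetical stand-ins for the real 384-dimensional all-MiniLM-L6-v2 embeddings.

```python
# Sketch of semantic retrieval: rank stored memories by cosine
# similarity to the query embedding instead of keyword overlap.
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical embeddings for two stored memories.
memories = {
    "The project runs on a Raspberry Pi 5": [0.9, 0.1, 0.2],
    "Lunch is at noon": [0.1, 0.8, 0.3],
}

# Hypothetical embedding of the query "Where does the project run?"
query_vec = [0.85, 0.15, 0.25]

best = max(memories, key=lambda text: cosine(memories[text], query_vec))
print(best)  # The project runs on a Raspberry Pi 5
```

Note that the query shares no keywords with the matching memory; the match comes entirely from vector proximity, which is what Qdrant computes at scale.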
How to install
Prerequisites:
- Raspberry Pi 5 with Raspberry Pi OS (64-bit) and internet access
- Python 3.8+ and pip
- Node.js and npm (required for mcporter)
Step 1: Install the MCP server package (Python)
# Install the Qdrant MCP server package for memory storage
pip3 install mcp-server-qdrant
Step 2: Install mcporter (MCP client/bridge)
# Install mcporter globally (Node.js/npm required)
npm install -g mcporter
Step 3: Configure mcporter with a memory store (example)
mkdir -p ~/.mcporter
Create ~/.mcporter/mcporter.json with content like:
{
  "mcpServers": {
    "qdrant-memory": {
      "command": "mcp-server-qdrant",
      "description": "Persistent vector memory using local Qdrant storage",
      "env": {
        "QDRANT_LOCAL_PATH": "/home/pi/.qdrant-data",
        "COLLECTION_NAME": "agent-memory"
      }
    }
  }
}
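Since mcporter does not expand "~" in env values, QDRANT_LOCAL_PATH must be an absolute path. A quick sanity-check sketch (not part of the server; the config dict below mirrors the mcporter.json above):

```python
# Verify that every configured QDRANT_LOCAL_PATH is absolute,
# since mcporter passes env values through without tilde expansion.
import os

config = {
    "mcpServers": {
        "qdrant-memory": {
            "command": "mcp-server-qdrant",
            "env": {
                "QDRANT_LOCAL_PATH": "/home/pi/.qdrant-data",
                "COLLECTION_NAME": "agent-memory",
            },
        }
    }
}

results = {
    name: os.path.isabs(server["env"]["QDRANT_LOCAL_PATH"])
    for name, server in config["mcpServers"].items()
}
print(results)  # {'qdrant-memory': True}
```

A path like "~/.qdrant-data" would fail this check and should be rewritten as "/home/pi/.qdrant-data".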
Step 4: Run and verify
# List registered MCP servers via mcporter
mcporter list
# Store a memory
mcporter call qdrant-memory.qdrant-store \
information="The project runs on a Raspberry Pi 5 in my office"
# Search by meaning
mcporter call qdrant-memory.qdrant-find \
query="Where does the project run?"
Notes:
- The qdrant-memory server is designed to be stateless: it runs per request, loads the embedding model on the first call, and exits after each operation.
- If mcporter can't locate the server, use the full path to the command, or ensure the PATH includes the mcp-server-qdrant executable.
Additional notes
- The first call may be slower (roughly 3-5 seconds) while the embedding model loads. Subsequent calls can be faster if the process is reused rather than fully terminated.
- Use absolute paths for QDRANT_LOCAL_PATH; mcporter does not perform tilde (~) expansion in env values.
- Data is stored locally (no cloud keys required). The embedding model runs on CPU (no GPU required).
- You can run multiple memory collections by adding additional entries under mcpServers, each with its own QDRANT_LOCAL_PATH and COLLECTION_NAME.
- The OpenClaw Hard Enforcement Plugin can automatically inject memories into prompts before responses; enable and configure it if you want enforced memory access prior to generation.
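As an illustration of the multi-collection note above, a two-server setup could look like the following (the qdrant-notes name, its path, and its collection name are hypothetical):

```json
{
  "mcpServers": {
    "qdrant-memory": {
      "command": "mcp-server-qdrant",
      "env": {
        "QDRANT_LOCAL_PATH": "/home/pi/.qdrant-data",
        "COLLECTION_NAME": "agent-memory"
      }
    },
    "qdrant-notes": {
      "command": "mcp-server-qdrant",
      "env": {
        "QDRANT_LOCAL_PATH": "/home/pi/.qdrant-notes",
        "COLLECTION_NAME": "project-notes"
      }
    }
  }
}
```

Each entry gets its own on-disk store and collection, so memories stored via qdrant-notes.qdrant-store never mix with the agent-memory collection.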
Related MCP Servers
context-sync
Local persistent memory store for LLM applications including continue.dev, cursor, claude desktop, github copilot, codex, antigravity, etc.
awesome-openclaw
Curated awesome list for OpenClaw (formerly Moltbot/Clawdbot): skills, plugins, memory systems, MCP tools, deployment stacks, ecosystem platforms, and developer tooling.
roampal-core
Outcome-based memory for Claude Code and OpenCode
mie
Persistent memory graph for AI agents. Facts, decisions, entities, and relationships that survive across sessions, tools, and providers. MCP server — works with Claude, Cursor, ChatGPT, and any MCP client.
context-harness
Local-first context ingestion and retrieval for AI tools. SQLite + embeddings + MCP server for Cursor & Claude.
Cognio
Persistent semantic memory server for MCP - Give your AI long-term memory that survives across conversations. Lightweight Python server with SQLite storage and semantic search.