Cursor-history

API service to search vectorized Cursor IDE chat history using LanceDB and Ollama

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio markelaugust74-cursor-history-mcp docker run -i -p 8001:8001 -v /path/to/your/cursor_chat_history.lancedb:/data/cursor_chat_history.lancedb -e OLLAMA_HOST=http://host.docker.internal:11434 cursor-chat-search-api

How to use

Cursor-history MCP provides a Dockerized FastAPI server that exposes a simple API to search your vectorized Cursor chat history. After running the extraction script to generate a LanceDB database of prompts and embeddings, you start the API in a container. The server exposes endpoints such as POST /search_chat_history for vector similarity search and GET /health to verify connectivity to Ollama and LanceDB. You can use these endpoints to perform Retrieval Augmented Generation (RAG) tasks by passing a query and receiving matching historical prompts with associated metadata. The workflow is designed to keep computation local: embeddings are generated using a local Ollama model (nomic-embed-text) and stored in LanceDB, with the API serving search results from that database.
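As a concrete sketch, a search request against the running container could look like the following. The payload field names (query, top_k) and the shape of the response are assumptions, since the endpoint's request schema isn't documented here; adapt them to the FastAPI server's actual models.

```python
import json
import urllib.request

API_URL = "http://localhost:8001"  # port published by the docker run command

def build_search_payload(query: str, top_k: int = 5) -> dict:
    """Build the JSON body for POST /search_chat_history.
    Field names here are assumptions; check the API's schema."""
    return {"query": query, "top_k": top_k}

def search_chat_history(query: str, top_k: int = 5) -> dict:
    """POST a query to the running container and return the parsed response."""
    body = json.dumps(build_search_payload(query, top_k)).encode("utf-8")
    req = urllib.request.Request(
        f"{API_URL}/search_chat_history",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Example (requires the container to be running):
# results = search_chat_history("how did I configure eslint?")
```

The returned matches carry the historical prompts plus metadata, which you can feed straight into a RAG prompt.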

How to install

Prerequisites:
- Docker Desktop (Windows/Mac) or Docker Engine (Linux) installed and running.
- Python 3.7+ (if you plan to run the extraction script locally).
- Ollama installed and running, with the nomic-embed-text:latest model available.

Install and set up steps:
1) Clone the repository and navigate to the project root.
2) If you plan to run the extraction locally:
   - Install the extractor's Python dependencies: pip install -r requirements.txt
   - Ensure WORKSPACE_STORAGE_PATH in cursor_history_extractor.py points to your Cursor data.
   - Start Ollama and make sure the model is available: ollama pull nomic-embed-text:latest
3) Run the extraction script on the host machine to create the LanceDB database:
   python cursor_history_extractor.py
   This creates ./cursor_chat_history.lancedb on the host.
4) Build and run the API Docker container (as shown in the MCP config):
   docker build -t cursor-chat-search-api .
   docker run -i -p 8001:8001 -v /path/to/your/cursor_chat_history.lancedb:/data/cursor_chat_history.lancedb -e OLLAMA_HOST="http://host.docker.internal:11434" cursor-chat-search-api

Notes:
- The volume mount must point to the absolute path of the LanceDB directory on the host so the container can access the database.
- On Linux, adjust OLLAMA_HOST and networking as needed so the container can reach the Ollama instance.
- Ensure the LanceDB table used is named chat_history and that the database path inside the container matches /data/cursor_chat_history.lancedb.
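To make the moving parts concrete: the extractor stores one nomic-embed-text vector (768 dimensions) per prompt, and the API ranks stored vectors by similarity to the query embedding. Below is a minimal, self-contained sketch of that ranking step in plain Python. It is illustrative only: LanceDB's actual search uses an indexed, far more efficient path, and cosine similarity as the metric is an assumption. The toy 3-dimensional vectors stand in for real 768-dimensional embeddings.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def rank_prompts(query_vec, stored):
    """Rank (prompt, vector) pairs by similarity to the query vector,
    highest first -- a brute-force version of what the search endpoint
    asks LanceDB to do."""
    scored = [(cosine_similarity(query_vec, vec), prompt) for prompt, vec in stored]
    return sorted(scored, reverse=True)

stored = [
    ("fix eslint config", [0.9, 0.1, 0.0]),
    ("write a dockerfile", [0.0, 0.2, 0.9]),
]
best_score, best_prompt = rank_prompts([1.0, 0.0, 0.0], stored)[0]
print(best_prompt)  # the prompt whose vector points closest to the query's
```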

Additional notes

Tips and common issues:
- Ensure Ollama is reachable from inside the Docker container. If Ollama runs on the host, use host networking or the host.docker.internal alias (Windows/Mac).
- The extraction script writes to ./cursor_chat_history.lancedb by default; make sure you have write permissions in the target directory.
- The API exposes /health to check connections to Ollama and LanceDB; use it to confirm that the database is open and embeddings can be retrieved.
- When mounting volumes on Windows, use forward slashes and absolute paths, e.g. C:/path/to/cursor_chat_history.lancedb:/data/cursor_chat_history.lancedb.
- If you change the LanceDB directory, update the volume mapping in the docker run command accordingly.
- The embedding model nomic-embed-text:latest defaults to 768 dimensions; ensure the model version in Ollama matches what the extractor expects.
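The /health check mentioned above can be scripted. The field names below (status, ollama, lancedb) are assumptions about the response payload, so adapt them to what the endpoint actually returns:

```python
import json
import urllib.request

def is_healthy(payload: dict) -> bool:
    """Interpret a parsed /health response.
    The field names checked here are assumptions."""
    return payload.get("status") == "ok" or all(
        payload.get(k) in ("ok", "connected", True) for k in ("ollama", "lancedb")
    )

def check_health(base_url: str = "http://localhost:8001") -> bool:
    """GET /health from the running container and report whether both
    Ollama and LanceDB appear reachable."""
    with urllib.request.urlopen(f"{base_url}/health") as resp:
        return is_healthy(json.load(resp))

# Example (requires the container to be running):
# print(check_health())
```

Running this after docker run is a quick way to catch the two most common failures: the container not reaching Ollama, and a wrong volume mount leaving the LanceDB path empty.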
