Cursor-history
🚀 Extract and vectorize your Cursor chat history, enabling efficient search through a Dockerized FastAPI API with LanceDB integration.
```shell
claude mcp add --transport stdio pedrohenrique316-cursor-history-mcp -- docker run -i pedrohenrique316/cursor-history-mcp
```
How to use
Cursor-history-MCP is a Python-based FastAPI application that lets you efficiently search your Cursor IDE chat history using vectorized data. It uses LanceDB for local, lightweight vector storage and provides a simple web interface to query and view results. The MCP server exposes the containerized application, so you can run it locally in Docker and keep your data private. Once it is running, you can open the local web UI and perform fast, keyword-driven searches across your chat messages, clicking any result to view its full context.
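Conceptually, the search works by storing each chat message as an embedding vector and ranking stored vectors by similarity to the query vector. The snippet below is a minimal, self-contained sketch of that idea using plain cosine similarity; the real project uses LanceDB and an embedding model, so the `search` function and the toy vectors here are illustrative assumptions only:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def search(query_vec, messages, top_k=2):
    """Rank stored (vector, text) pairs by similarity to the query vector."""
    scored = [(cosine_similarity(query_vec, vec), text) for vec, text in messages]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [text for _, text in scored[:top_k]]

# Toy "embedded" chat history (real vectors would come from an embedding model).
history = [
    ([1.0, 0.0, 0.0], "How do I configure Docker ports?"),
    ([0.0, 1.0, 0.0], "What is LanceDB?"),
    ([0.9, 0.1, 0.0], "Docker container won't start"),
]

print(search([1.0, 0.0, 0.0], history, top_k=2))
```

A vector database like LanceDB does the same ranking, but with an approximate index so it stays fast as the history grows.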
How to install
Prerequisites:
- Docker: Install Docker Desktop for your operating system.
- Git (optional): if you are cloning the repository locally for customization.
- Basic command-line familiarity.

Installation steps:
1) Ensure Docker is installed and running on your system.
2) Download the Cursor-history-MCP container image (or build it locally if the repo provides a Dockerfile). The recommended method is to pull and run the Docker image:

```shell
# Pull the image (if needed)
docker pull pedrohenrique316/cursor-history-mcp:latest

# Run the container in interactive mode, mapping ports as needed (example)
docker run -it -p 8000:80 pedrohenrique316/cursor-history-mcp:latest
```

3) Verify that the container starts correctly and the local server is accessible at http://localhost:8000 (or the port you mapped).
4) If you have a preconfigured data path or environment variables, supply them via `docker run -e` flags or a docker-compose file as documented by the project.

If you prefer not to use Docker, consult the project docs for the alternative native setup (a uvicorn-based FastAPI server) and follow the corresponding steps.
Additional notes
Notes and tips:
- Docker is required for running the application container. Ensure Docker has enough memory (at least 4 GB RAM) and disk space for the database file.
- If you run into connectivity issues, verify the port mappings and that the container is healthy.
- For local data privacy, point the application at a local data directory and avoid mounting sensitive paths publicly.
- If the UI cannot connect to the backend, check that the server inside the container is listening on the expected port and that the API routes are accessible.
- Review the repository documentation inside the extracted folder for any data-path or configuration-file updates needed to match your environment.
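The connectivity checks above can be scripted. The helper below probes a URL and reports whether anything is answering HTTP there; the URL reflects the example `-p 8000:80` mapping and is an assumption, so adjust it to your own port mapping:

```python
import urllib.request
import urllib.error

def server_is_up(url, timeout=2.0):
    """Return True if any HTTP response comes back from `url`."""
    try:
        with urllib.request.urlopen(url, timeout=timeout):
            return True
    except urllib.error.HTTPError:
        # The server responded (even with a 4xx/5xx), so it is listening.
        return True
    except (urllib.error.URLError, OSError):
        # Connection refused, DNS failure, or timeout: nothing listening.
        return False

if __name__ == "__main__":
    # Assumed port from the example mapping in the install steps.
    print("backend reachable:", server_is_up("http://localhost:8000"))
```

If this prints `False`, check `docker ps` for the container's status and port mappings before digging into the application itself.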
Related MCP Servers
klavis
Klavis AI (YC X25): MCP integration platforms that let AI agents use tools reliably at any scale
aci
ACI.dev is the open source tool-calling platform that hooks up 600+ tools into any agentic IDE or custom AI agent through direct function calling or a unified MCP server. The birthplace of VibeOps.
headroom
The Context Optimization Layer for LLM Applications
lihil
2X faster ASGI web framework for python, offering high-level development, low-level performance.
ollama-bridge
Extend the Ollama API with dynamic AI tool integration from multiple MCP (Model Context Protocol) servers. Fully compatible, transparent, and developer-friendly; ideal for building powerful local LLM applications, AI agents, and custom chatbots.
prospectio-api
MCP/API server that helps you connect to different lead-generation apps.