cursor-cortex
Structured memory system for AI assistants. Eliminates context loss with branch notes, tacit knowledge, and project context. Local MCP integration for Cursor IDE.
claude mcp add --transport stdio flores-ac-cursor-cortex --env DEBUG="true" -- node /PATH/TO/cursor-cortex/index.js
How to use
Cursor-Cortex provides structured external memory to help AI assistants maintain context across tasks. It organizes information into four artifact types:
- Branch Notes: the ongoing development log
- Context Files: project goals and architecture
- Tacit Knowledge: company-specific solutions and patterns
- Checklists: structured thinking and process guidance
When enabled as an MCP server, the cursor-cortex toolset becomes available to the AI to read and update these artifacts, search across them, generate commit messages, and create comprehensive knowledge documents. You can invoke commands such as updating branch notes, reading context, generating embeddings, and creating completion checklists, among many other utilities that keep the AI oriented and methodical during software work. Optional vector search can dramatically speed up semantic queries once embeddings are generated and stored locally. To use it, configure the MCP path, start Cursor-Cortex, enable MCP in Cursor Settings, and then use the available prompts to update notes, fetch context, or synthesize project narratives.
How to install
Prerequisites:
- Node.js 18+ (per the project’s vector search requirements)
- npm (comes with Node.js)
- Optional: Git for cloning the repository if you’re starting from source
Step-by-step installation:
- Clone or download the repository to your machine.
- Navigate to the project directory:
cd /path/to/cursor-cortex
- Install dependencies:
npm install --legacy-peer-deps
Note: The --legacy-peer-deps flag may be required due to TensorFlow.js peer dependency conflicts.
- Generate embeddings (if you plan to use vector search):
node generate-all-embeddings-cpu.js
If content changes, force regeneration with:
node generate-all-embeddings-cpu.js --force
- Ensure the MCP configuration path exists and is accessible, e.g. create the Cursor config file at ~/.cursor/mcp.json and add the cursor-cortex section as described in the README.
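For reference, a minimal cursor-cortex entry in ~/.cursor/mcp.json might look like the following sketch, which assumes the standard mcpServers schema; the path is a placeholder you should adapt to your checkout:

```json
{
  "mcpServers": {
    "cursor-cortex": {
      "command": "node",
      "args": ["/path/to/cursor-cortex/index.js"],
      "env": { "DEBUG": "true" }
    }
  }
}
```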
- Start the server by running the MCP command configuration (as described in the mcp_config section), or execute the Node entry point directly if you're testing locally:
node /path/to/cursor-cortex/index.js
- Enable the MCP server in Cursor (Cursor Settings → Features → Model Context Protocol → cursor-cortex → ON).
- Verify that storage and endpoints initialize correctly (look for log messages indicating the storage directory and the active MCP server).
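As a quick sanity check that the server answers over stdio, you can hand-roll an MCP handshake and ask for the tool list. This is a sketch, not an official test harness: the path is a placeholder, and the protocol version shown is one defined by the MCP specification.

```shell
# Pipe an MCP initialize handshake plus a tools/list request into the server.
# Messages are newline-delimited JSON-RPC, per the MCP stdio transport.
(
  printf '%s\n' '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"smoke-test","version":"0.0.0"}}}'
  printf '%s\n' '{"jsonrpc":"2.0","method":"notifications/initialized"}'
  printf '%s\n' '{"jsonrpc":"2.0","id":2,"method":"tools/list"}'
) | node /path/to/cursor-cortex/index.js
```

If the server is healthy, the final response should contain a JSON array of the cursor-cortex tools.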
Additional notes
Tips and common issues:
- Ensure the path in mcp.json is correct and accessible by the user running Cursor.
- If you see a "Tool not found" error, restart Cursor after enabling the MCP server and re-check the tool list.
- For vector search, ensure you have at least 2GB RAM available and that embeddings have been generated to ~/.cursor-cortex/embeddings/.
- If the JSON configuration fails to parse, validate the syntax with a JSON validator and confirm proper quoting in mcp.json.
- The DEBUG environment variable can be used to surface extra logs during development.
- The provided example uses the Node-based entry point index.js; adapt the path to your actual build/output if different.
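To catch quoting mistakes in mcp.json quickly, one option is a Node one-liner (this assumes the config lives at the default ~/.cursor/mcp.json location):

```shell
# Exits non-zero and prints the parse error if ~/.cursor/mcp.json is malformed.
node -e 'JSON.parse(require("fs").readFileSync(require("os").homedir() + "/.cursor/mcp.json", "utf8")); console.log("mcp.json: valid JSON")'
```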