work-memory
Never lose context again - persistent memory management system for AI-powered workflows across multiple tools
claude mcp add --transport stdio moontmsai-work-memory-mcp node /PATH/work-memory/dist/index.js \
  --env NODE_ENV="production" \
  --env LOG_LEVEL="WARN" \
  --env WORK_MEMORY_DIR="/PATH/work-memory/data/"
How to use
Work Memory MCP Server manages and shares work context across multiple AI tools by persisting memories, todos, and session data in a SQLite database. It provides five integrated tools: Memory for creating and organizing memories and tasks; Search for full-text and filtered retrieval with keyword suggestions; Session for project-specific work sessions with automatic detection; History for change tracking and versioning; and System for monitoring and optimization. With the server running, you can query, update, and link memories to sessions, search across memories and todos, and keep a consistent context as you switch between Claude Desktop, Cursor AI, or other tools that integrate via MCP. The system supports persistence, multi-dimensional tagging and prioritization, session-based context, and automatic history and version management, helping you maintain an organized, searchable knowledge base.
To use the server, ensure it is running with the configured Node.js entry point. The Memory tool enables CRUD operations on memories and todos, the Search tool helps you locate relevant items by keywords and filters, the Session tool manages project-based workspaces, the History tool lets you view and restore prior versions, and the System tool provides health checks and maintenance capabilities. You can connect Claude Desktop or Cursor AI through the supplied MCP configuration, enabling seamless continuity of work context across tools and sessions.
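As a rough illustration of how a client drives these tools, MCP clients invoke a server tool with a JSON-RPC `tools/call` request. The tool name (`memory`) and argument shape below are assumptions for illustration only; check the server's actual tool listing for the real schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "memory",
    "arguments": {
      "operation": "create",
      "content": "Refactor auth middleware before release",
      "tags": ["todo", "backend"],
      "priority": "high"
    }
  }
}
```

Clients such as Claude Desktop and Cursor AI issue these requests for you; you normally only see the tool's natural-language result.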
How to install
Prerequisites:
- Node.js 18.0.0 or higher
- npm 8.0.0 or higher
- git
Step-by-step installation:
- Clone the repository: git clone https://github.com/your-repo/work-memory-mcp.git
- Navigate to the project directory: cd work-memory-mcp
- Install dependencies: npm install
- Build the project (if required by the project setup): npm run build
- Start the MCP server: npm start
Configuration tip:
- Adjust the mcp_config to point to your actual built entry at dist/index.js and set the environment directories as needed for WORK_MEMORY_DIR and logs.
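For Claude Desktop, the MCP configuration typically lives in claude_desktop_config.json. A minimal sketch, mirroring the environment variables from the command above and assuming the build output is at dist/index.js (replace /PATH with your actual install location):

```json
{
  "mcpServers": {
    "work-memory": {
      "command": "node",
      "args": ["/PATH/work-memory/dist/index.js"],
      "env": {
        "NODE_ENV": "production",
        "LOG_LEVEL": "WARN",
        "WORK_MEMORY_DIR": "/PATH/work-memory/data/"
      }
    }
  }
}
```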
Additional notes
Environment and configuration tips:
- The provided mcp_config uses a Node.js entry at /PATH/work-memory/dist/index.js. Update this path to the actual build output path in your environment.
- WORK_MEMORY_DIR sets the SQLite storage location. If this is not set, the default storage behavior applies; ensure the directory exists and is writable by the running process.
- LOG_LEVEL controls verbosity; common values include INFO, WARN, and DEBUG. NODE_ENV is typically production for deployments.
- The README notes that the cache memory limit (50 MB) is hardcoded; there is currently no environment variable to change it.
- Claude Desktop and Cursor AI can connect via MCP: for Claude Desktop, use the provided configuration template and update paths to point to your server instance.
- If you modify the server address or port, you may need to update client configurations accordingly.
- Regular backups are recommended since this system persists memories to SQLite; consider adding a periodic backup routine.
- Ensure Node.js processes have sufficient file system permissions for the WORK_MEMORY_DIR directory.
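Since all memories live in the SQLite store under WORK_MEMORY_DIR, a simple archive of that directory makes a restorable backup. A minimal sketch, assuming a POSIX shell with tar available; the paths in the usage example are placeholders:

```shell
#!/bin/sh
# Sketch of a periodic backup for the Work Memory SQLite store.
backup_work_memory() {
    src="$1"    # data directory, e.g. the value of WORK_MEMORY_DIR
    dest="$2"   # backup destination directory
    stamp=$(date +%Y%m%d-%H%M%S)
    mkdir -p "$dest"
    # Archive the whole data directory so the SQLite file and any
    # sidecar files (-wal/-shm) are captured together.
    tar -czf "$dest/work-memory-$stamp.tar.gz" \
        -C "$(dirname "$src")" "$(basename "$src")"
}
```

You might call it as `backup_work_memory /PATH/work-memory/data /var/backups/work-memory` from a cron entry; stop the server first, or accept that a backup taken mid-write may need SQLite's recovery on restore.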
Related MCP Servers
cheatengine-bridge
Connect Cursor, Copilot & Claude directly to Cheat Engine via MCP. Automate reverse engineering, pointer scanning, and memory analysis using natural language.
fast-filesystem
A high-performance Model Context Protocol (MCP) server that provides secure filesystem access for Claude and other AI assistants.
PackageFlow
A visual DevOps hub for npm scripts, Git, workflows, and deploy — controllable by AI via MCP.
github-to
Convert GitHub repositories to MCP servers automatically. Extract tools from OpenAPI, GraphQL & REST APIs for Claude Desktop, Cursor, Windsurf, Cline & VS Code. AI-powered code generation creates type-safe TypeScript/Python MCP servers. Zero config setup - just paste a repo URL. Built for AI assistants & LLM tool integration.
cheatengine-bridge
🔗 Connect AI to Cheat Engine for fast memory analysis, enabling quick mods and audits without the tedious manual work.
openapi-swagger
Solve AI context window limits for API docs | Convert any Swagger/OpenAPI to searchable MCP server | AI-powered endpoint discovery & code generation | Works with Cursor, Claude, VS Code