mode-manager
MCP Memory Agent Server - A VS Code chatmode and instruction manager with library integration
claude mcp add --transport stdio niclasolofsson-mode-manager-mcp uvx mode-manager-mcp
How to use
Mode Manager MCP provides AI-powered memory and context management for developers and teams. It lets you store and retrieve personal, workspace (team), and language-specific memories so your AI assistant stays aligned with your preferences and conventions. The server supports three memory scopes: personal memory stored in your VS Code user prompts directory, workspace memory shared within the repository, and language-specific memory loaded when you work in a particular language. An interactive onboarding flow, with prompts available directly in VS Code, helps you persist memory from the start. The toolset makes memory persistent and context-aware, reducing repeated questions and helping Copilot follow your team's guidelines and personal preferences.
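The three memory scopes described above can be illustrated with a small sketch. The paths below are assumptions drawn from the notes later on this page (the VS Code prompts directory varies by platform, and the location of language-specific files is not confirmed here):

```python
from pathlib import Path

# Illustrative mapping of the three memory scopes to their file locations.
# The prompts_dir and the language-file directory are assumptions, not
# confirmed by this page.
def memory_paths(workspace: Path, prompts_dir: Path, language: str) -> dict:
    return {
        # Personal memory lives in the user's VS Code prompts directory
        "personal": prompts_dir / "memory.instructions.md",
        # Workspace memory is shared via the repository
        "workspace": workspace / ".github" / "instructions" / "memory.instructions.md",
        # Language-specific memory is loaded when editing that language
        "language": workspace / ".github" / "instructions" / f"memory-{language}.instructions.md",
    }
```

For example, `memory_paths(Path("/repo"), Path.home() / "prompts", "python")` would point the language scope at a `memory-python.instructions.md` file.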
How to install
Prerequisites:
- Python 3.10 or higher
- Access to the internet to install packages
Installation steps:
- Install Python from python.org if you don’t already have it.
- Install the uv tool with pip: pip install uv
- Verify the installation: uv --version
- Run the MCP server via uvx (the invocation used by this project): uvx mode-manager-mcp
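The Python 3.10+ prerequisite from the steps above can be checked programmatically before installing anything. This is a minimal sketch, not part of the project itself:

```python
import sys

# Minimal prerequisite check: Mode Manager MCP requires Python 3.10 or higher.
def python_ok(version_info=sys.version_info) -> bool:
    """Return True when the running interpreter meets the requirement."""
    return version_info >= (3, 10)

if __name__ == "__main__":
    print("Python OK" if python_ok() else "Python 3.10+ required")
```

If the check fails, install a newer Python from python.org before continuing with uv.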
Configuration examples (these can be placed in a global mcp.json or your VS Code workspace .vscode/mcp.json):
Option: Run directly with uvx
{
  "servers": {
    "mode-manager": {
      "command": "uvx",
      "args": ["mode-manager-mcp"]
    }
  }
}
Option: Development / latest from GitHub (using uvx)
{
  "servers": {
    "mode-manager": {
      "command": "uvx",
      "args": [
        "--from",
        "git+https://github.com/NiclasOlofsson/mode-manager-mcp.git",
        "mode-manager-mcp"
      ]
    }
  }
}
Option: Development / latest from GitHub (using pipx)
{
  "servers": {
    "mode-manager": {
      "command": "pipx",
      "args": [
        "run",
        "--no-cache",
        "--spec",
        "git+https://github.com/NiclasOlofsson/mode-manager-mcp.git",
        "mode-manager-mcp"
      ]
    }
  }
}
If you want to run directly from GitHub during development, you can also set environment variables (for example, _RESTART=1) when using uvx or pipx, as shown in the readme snippet.
Additional notes
- Memory is stored in Markdown files with a YAML frontmatter header for easy human and machine parsing.
- Personal memory: memory.instructions.md in your VS Code prompts directory.
- Workspace memory: memory.instructions.md in the workspace's .github/instructions directory.
- Language-specific memory: memory-<language>.instructions.md files loaded automatically when editing that language.
- When testing or running from GitHub, you may opt to pull directly from the repository to get the latest features.
- If you encounter issues starting the server in VS Code, ensure Python 3.10+ is active in your environment and that uv is installed correctly (uv --version).
- This MCP uses the uv tool to run the server; you can also use pipx to install and run the package if you prefer isolated environments.
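Based on the notes above, a personal memory.instructions.md might look like the sketch below. The frontmatter fields shown (applyTo, description) are assumptions for illustration and are not confirmed by this page:

```markdown
---
applyTo: "**"
description: Personal memory for Copilot
---

# Memory

- Prefer uv over pip for Python dependency management.
- The team uses conventional commits for all commit messages.
```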