In-Memoria
Persistent Intelligence Infrastructure for AI Agents
claude mcp add --transport stdio pi22by7-in-memoria npx in-memoria server
How to use
In-Memoria is an MCP server that learns from your codebase and provides persistent context to AI assistants. It analyzes your project to infer patterns, conventions, and architecture, so tools like Claude or Copilot can query a memory of your codebase instead of re-analyzing it every session. You can teach it about your project first, then run the server to share that context seamlessly across sessions. The server exposes a set of specialized tools for analyzing code, routing requests to relevant files, and returning concise project context (typically under 200 tokens) to your AI agent. Use cases include getting structural overviews, locating where particular patterns live, and routing vague feature requests (such as "add password reset") to specific files based on your repository's conventions.
How to install
Prerequisites:
- Node.js and npm installed on your machine
- Access to the npm registry (public internet)
Installation steps:
- Install In-Memoria globally (optional, but recommended): npm install -g in-memoria
- Verify the installation, or use it directly via npx: npx in-memoria --help
- Start the MCP server (from your project directory, or from anywhere if installed globally): npx in-memoria server
- Optionally, learn a codebase before serving context: npx in-memoria learn ./path/to/your/project
- Integrate with your AI tool:
  - Claude: add the MCP server to your configuration using an mcpServers entry
  - Copilot: use the standard MCP integration workflow to route requests to the in-memoria server
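The mcpServers entry mentioned above can be sketched roughly as follows for Claude Desktop. This is a minimal example, assuming your MCP client follows the common command/args configuration schema; the server name "in-memoria" and the config file location vary by client, so check your tool's docs:

```json
{
  "mcpServers": {
    "in-memoria": {
      "command": "npx",
      "args": ["in-memoria", "server"]
    }
  }
}
```

With this entry in place, the client launches the server over stdio on startup and the In-Memoria tools become available to the agent.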
Notes:
- The server is designed to run locally on your machine; data is stored locally (SQLite/SurrealDB, as described in the project docs).
- You can still run npx in-memoria server without a prior learn step; learning can be triggered automatically if needed.
Additional notes
Tips and common issues:
- Ensure your project path is accessible and readable by the MCP server when running learn.
- If you encounter network or permission issues with npm/npx, check your npm configuration (npm config list) and registry access.
- The memory persists across sessions for the same project, enabling cross-session context and task tracking.
- Use the learn step first to populate patterns and architecture before relying on instant project context.
- If the repository uses multiple languages, ensure the server's language parsers (e.g., Tree-sitter bindings) cover your codebase.
- When integrating with AI tools, reference the server as a memory source to improve routing and suggestion relevance.
Related MCP Servers
robloxstudio
Create agentic AI workflows in ROBLOX Studio
mie
Persistent memory graph for AI agents. Facts, decisions, entities, and relationships that survive across sessions, tools, and providers. MCP server — works with Claude, Cursor, ChatGPT, and any MCP client.
cadre-ai
Your AI agent squad for Claude Code. 17 specialized agents, persistent memory, desktop automation, and a common sense engine.
chronos-protocol
A robust MCP server that eliminates temporal blindness in AI coding agents through intelligent time tracking, persistent memory, and complete session traceability
mcpman
The package manager for MCP servers — install, manage & monitor across Claude Desktop, Cursor, VS Code, Windsurf
ade.boostrix
Provides an MCP server with a set of specialized tools to simplify AI-assisted development, along with the context and structure needed to generate high-quality code for the 1C-Bitrix ecosystem.