
In-Memoria

Persistent Intelligence Infrastructure for AI Agents

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio pi22by7-in-memoria npx in-memoria server

How to use

In Memoria is an MCP server that learns from your codebase and provides persistent context to AI assistants. It analyzes your project to infer patterns, conventions, and architecture so tools like Claude or Copilot can query a memory of your codebase instead of re-analyzing it every session. You can first teach it about your project, then run the server to enable seamless context sharing across sessions. The server exposes a set of specialized tools for analyzing code, routing requests to relevant files, and returning concise project context (typically under 200 tokens) to your AI agent. Use cases include getting structural overviews, locating where certain patterns live, and routing vague feature requests (like “add password reset”) to specific files based on your repository’s conventions.

How to install

Prerequisites:

  • Node.js and npm installed on your machine
  • Access to the npm registry (public internet)
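Before installing, it can help to confirm the prerequisites are in place. A quick sanity check (the project's exact minimum Node.js version isn't stated here, so any recent release should be a reasonable starting point):

```shell
# Verify Node.js and npm are installed and on your PATH
node --version
npm --version
```

Both commands should print a version number; if either fails, install Node.js (which bundles npm) first.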

Installation steps:

  1. Install In Memoria globally (optional, but recommended): npm install -g in-memoria

  2. Verify installation or use via npx: npx in-memoria --help

  3. Start the MCP server (from your project directory or globally installed package): npx in-memoria server

  4. Optionally learn a codebase before serving context: npx in-memoria learn ./path/to/your/project

  5. Integrate with your AI tool (example for Claude or Copilot):

    • Claude: add the MCP server to your configuration using the provided mcpServers entry
    • Copilot: use the standard MCP integration workflow to route requests to the in-memoria server
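For clients configured via a JSON file, the mcpServers entry might look like the following. This is a sketch following the common MCP client convention; the server name (`in-memoria` here) and the config file's location depend on your client, so check its documentation:

```json
{
  "mcpServers": {
    "in-memoria": {
      "command": "npx",
      "args": ["in-memoria", "server"]
    }
  }
}
```

With this entry, the client launches the server on demand via npx and communicates with it over stdio.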

Notes:

  • The server is designed to run locally on your machine; data is stored locally (SQLite/SurrealDB, as described in the project docs).
  • You can still run npx in-memoria server without a prior learn step; learning can be triggered automatically if needed.

Additional notes

Tips and common issues:

  • Ensure your project path is accessible and readable by the MCP server when running learn.
  • If you encounter network or permission issues with npm/npx, check your PATH and npm configuration (e.g., registry and proxy settings).
  • The memory persists across sessions for the same project, enabling cross-session context and task tracking.
  • Use the learn step first to populate patterns and architecture before relying on instant project context.
  • If the repository uses multiple languages, ensure the server's language parsers (e.g., Tree-sitter bindings) cover your codebase.
  • When integrating with AI tools, reference the server as a memory source to improve routing and suggestion relevance.
