
mcp-memory-cache

MCP server from tosin2013/mcp-memory-cache-server

Installation
Run this command in your terminal to add the MCP server to Claude Code. All --env flags are optional overrides; omit any of them to use the defaults from config.json. The angle-bracket values below are placeholders, not literal strings.
Command
claude mcp add --transport stdio tosin2013-mcp-memory-cache-server node /path/to/build/index.js \
  --env MAX_MEMORY=<maximum memory usage in bytes> \
  --env CONFIG_PATH=<optional path to a custom config.json> \
  --env DEFAULT_TTL=<default time-to-live in seconds for cached items> \
  --env MAX_ENTRIES=<maximum number of items in the cache> \
  --env CHECK_INTERVAL=<how often to clean expired items, in milliseconds> \
  --env STATS_INTERVAL=<how often to update cache statistics, in milliseconds>

How to use

Memory Cache Server reduces token usage by transparently caching data between interactions with language models via any MCP client. Once running, it stores frequently accessed data such as file contents and computation results, and serves subsequent requests from memory instead of re-reading or recomputing them. The result is faster responses and lower token consumption when your MCP client talks to language models. To use it, run the server through your chosen MCP client configuration; the server starts automatically when the client connects. Caching happens transparently, so you don't need to modify client code to benefit from token savings. You can fine-tune behavior via config.json or environment variables to adjust cache size, memory usage, TTLs, and cleanup intervals. The README's examples cover typical scenarios such as file content caching, repeated computations, and frequently accessed data retrieval, all using the cache without explicit client-side changes.
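The caching behavior described above can be sketched as a TTL-bounded, entry-capped in-memory store. This is an illustrative model only, not the server's actual implementation or API; the class and method names are hypothetical:

```typescript
// Minimal sketch of a TTL cache with a max-entry cap, mirroring the
// MAX_ENTRIES / DEFAULT_TTL settings described above. Illustrative only.
interface Entry<V> {
  value: V;
  expiresAt: number; // epoch milliseconds after which the entry is stale
}

class TtlCache<V> {
  private store = new Map<string, Entry<V>>();
  constructor(private maxEntries: number, private defaultTtlMs: number) {}

  set(key: string, value: V, ttlMs = this.defaultTtlMs): void {
    // Evict the oldest entry when the cap is reached (simple FIFO policy;
    // the real server may use a different eviction strategy).
    if (!this.store.has(key) && this.store.size >= this.maxEntries) {
      const oldest = this.store.keys().next().value;
      if (oldest !== undefined) this.store.delete(oldest);
    }
    this.store.set(key, { value, expiresAt: Date.now() + ttlMs });
  }

  get(key: string): V | undefined {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    if (Date.now() > entry.expiresAt) {
      // Lazily expire stale entries on read; a background sweep
      // (cf. CHECK_INTERVAL) would remove them proactively.
      this.store.delete(key);
      return undefined;
    }
    return entry.value;
  }
}

const cache = new TtlCache<string>(100, 60_000);
cache.set("file:/tmp/report.txt", "cached file contents");
console.log(cache.get("file:/tmp/report.txt")); // "cached file contents"
```

A second read of the same key within the TTL is served from memory, which is what saves re-reading the file and re-sending its contents as tokens.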

How to install

Prerequisites:

  • Node.js and npm installed on your system
  • Access to the MCP client where you want to enable memory caching

Manual installation steps:

  1. Clone the repository:
git clone https://github.com/tosin2013/mcp-memory-cache-server.git
cd mcp-memory-cache-server
  2. Install dependencies:
npm install
  3. Build the project:
npm run build
  4. Add to your MCP client settings (example):
{
  "mcpServers": {
    "memory-cache": {
      "command": "node",
      "args": ["/path/to/build/index.js"]
    }
  }
}
  5. Start using your MCP client; the server starts automatically when the client connects.

Alternatively, install via Smithery:
npx -y @smithery/cli install @tosin2013/mcp-memory-cache-server --client claude
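The MCP client settings example above can also carry environment overrides directly, so you don't need a separate config.json. The values here are illustrative, not recommended defaults:

```json
{
  "mcpServers": {
    "memory-cache": {
      "command": "node",
      "args": ["/path/to/build/index.js"],
      "env": {
        "MAX_ENTRIES": "1000",
        "DEFAULT_TTL": "3600"
      }
    }
  }
}
```

Environment values in MCP client settings are passed as strings; the server parses them into numbers.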

Additional notes

Tips and notes:

  • You can override default settings with a config.json or environment variables as shown in the README. Common env vars: MAX_ENTRIES, MAX_MEMORY, DEFAULT_TTL, CHECK_INTERVAL, STATS_INTERVAL. Ensure values fit your workload and available system memory.
  • The server reports status via the console when running (look for messages like 'Memory Cache MCP server running on stdio').
  • If you encounter cache misses, consider adjusting TTLs or increasing maxEntries/maxMemory to fit your use case.
  • To verify working caching, perform repeated reads or analyses and observe faster responses and consistent results within TTLs.
  • You can optionally point the server to a custom config file by setting CONFIG_PATH or by placing a config.json in the server directory; environment variable overrides take precedence over config.json.
  • Ensure your MCP client configuration uses the memory-cache server entry with appropriate command/args and environment if needed.
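Putting the tips above together, a config.json placed in the server directory (or pointed to via CONFIG_PATH) might look like the sketch below. The camelCase key names follow the maxEntries/maxMemory spelling mentioned above, and the values are illustrative; check the repository's config.json for the authoritative schema and defaults:

```json
{
  "maxEntries": 1000,
  "maxMemory": 104857600,
  "defaultTTL": 3600,
  "checkInterval": 60000,
  "statsInterval": 30000
}
```

Here maxMemory is in bytes (100 MB), defaultTTL in seconds, and the two intervals in milliseconds, matching the units of the corresponding environment variables.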
