mcp-memory-cache
MCP server from tosin2013/mcp-memory-cache-server
claude mcp add --transport stdio tosin2013-mcp-memory-cache-server node /path/to/build/index.js \
  --env MAX_MEMORY="Maximum memory usage in bytes (default shown in config.json)" \
  --env CONFIG_PATH="Optional path to a custom config.json (if provided, server will use this file)" \
  --env DEFAULT_TTL="Default time-to-live in seconds for cached items (default shown in config.json)" \
  --env MAX_ENTRIES="Maximum number of items in cache (default shown in config.json)" \
  --env CHECK_INTERVAL="How often to clean expired items in milliseconds (default shown in config.json)" \
  --env STATS_INTERVAL="How often to update cache statistics in milliseconds (default shown in config.json)"
How to use
Memory Cache Server reduces token usage by transparently caching data between interactions with language models via any MCP client. Once running, it stores frequently accessed data such as file contents and computation results, and serves subsequent requests from memory instead of re-reading or recomputing them. This yields faster responses and lower token consumption when your MCP client interacts with language models.

To use it, configure the server in your chosen MCP client; the server starts automatically when the client connects. Caching is applied transparently, so you do not need to modify client code to benefit from the token savings. You can fine-tune behavior via config.json or environment variables to adjust cache size, memory usage, TTLs, and cleanup intervals. The examples in the README illustrate typical scenarios such as file content caching, repeated computations, and frequently accessed data retrieval, all leveraging the cache without explicit client-side changes.
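Conceptually, the server's behavior resembles a TTL-bounded, size-limited key-value cache. The sketch below is a hypothetical illustration of that idea, not the server's actual code; the class name, FIFO eviction policy, and example keys are all assumptions for illustration only.

```typescript
// Hypothetical sketch of TTL + max-entries caching (not the real server code).
type Entry<V> = { value: V; expiresAt: number };

class TtlCache<V> {
  private store = new Map<string, Entry<V>>();
  constructor(private maxEntries: number, private defaultTtlSec: number) {}

  set(key: string, value: V, ttlSec = this.defaultTtlSec): void {
    // When full, evict the oldest entry (simple FIFO; the real policy may differ).
    if (!this.store.has(key) && this.store.size >= this.maxEntries) {
      const oldest = this.store.keys().next().value;
      if (oldest !== undefined) this.store.delete(oldest);
    }
    this.store.set(key, { value, expiresAt: Date.now() + ttlSec * 1000 });
  }

  get(key: string): V | undefined {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    // Expired entries are dropped lazily on read; the server also sweeps
    // them periodically (CHECK_INTERVAL).
    if (Date.now() > entry.expiresAt) {
      this.store.delete(key);
      return undefined;
    }
    return entry.value;
  }
}

const cache = new TtlCache<string>(2, 60);
cache.set("file:/tmp/a.txt", "contents of a");
console.log(cache.get("file:/tmp/a.txt")); // cache hit: "contents of a"
console.log(cache.get("file:/tmp/b.txt")); // cache miss: undefined
```

A second read of the same file within the TTL is served from memory, which is why repeated reads through the MCP client cost fewer tokens and return faster.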
How to install
Prerequisites:
- Node.js and npm installed on your system
- Access to the MCP client where you want to enable memory caching
Manual installation steps:
- Clone the repository:
git clone https://github.com/tosin2013/mcp-memory-cache-server.git
cd mcp-memory-cache-server
- Install dependencies:
npm install
- Build the project:
npm run build
- Add to your MCP client settings (example):
{
  "mcpServers": {
    "memory-cache": {
      "command": "node",
      "args": ["/path/to/build/index.js"]
    }
  }
}
- Start using your MCP client; the server starts automatically when the client connects.
Alternatively, install via Smithery:
npx -y @smithery/cli install @tosin2013/mcp-memory-cache-server --client claude
Additional notes
Tips and notes:
- You can override default settings with a config.json or environment variables as shown in the README. Common env vars: MAX_ENTRIES, MAX_MEMORY, DEFAULT_TTL, CHECK_INTERVAL, STATS_INTERVAL. Ensure values fit your workload and available system memory.
- The server reports status via the console when running (look for messages like 'Memory Cache MCP server running on stdio').
- If you encounter frequent cache misses, consider lengthening TTLs or increasing MAX_ENTRIES/MAX_MEMORY (maxEntries/maxMemory in config.json) to fit your use case.
- To verify working caching, perform repeated reads or analyses and observe faster responses and consistent results within TTLs.
- You can optionally point the server to a custom config file by setting CONFIG_PATH or by placing a config.json in the server directory; environment variable overrides take precedence over config.json.
- Ensure your MCP client configuration uses the memory-cache server entry with appropriate command/args and environment if needed.
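The precedence described above (environment variables over config.json, config.json over built-in defaults) can be sketched as follows. This is a hypothetical illustration: the field names and the default values shown are assumptions, since the actual defaults live in the server's config.json.

```typescript
// Hypothetical sketch of config resolution with env-over-file precedence.
// Field names and default values are assumed for illustration.
interface CacheConfig {
  maxEntries: number;
  maxMemory: number;   // bytes
  defaultTTL: number;  // seconds
}

const defaults: CacheConfig = { maxEntries: 1000, maxMemory: 104857600, defaultTTL: 3600 };

function resolveConfig(
  fileConfig: Partial<CacheConfig>,
  env: Record<string, string | undefined>
): CacheConfig {
  // Parse a numeric env var, ignoring unset or malformed values.
  const fromEnv = (name: string): number | undefined => {
    const raw = env[name];
    if (raw === undefined) return undefined;
    const n = Number(raw);
    return Number.isFinite(n) ? n : undefined;
  };
  // Env var wins, then the config file, then the built-in default.
  return {
    maxEntries: fromEnv("MAX_ENTRIES") ?? fileConfig.maxEntries ?? defaults.maxEntries,
    maxMemory:  fromEnv("MAX_MEMORY")  ?? fileConfig.maxMemory  ?? defaults.maxMemory,
    defaultTTL: fromEnv("DEFAULT_TTL") ?? fileConfig.defaultTTL ?? defaults.defaultTTL,
  };
}

const resolved = resolveConfig({ maxEntries: 500 }, { DEFAULT_TTL: "120" });
console.log(resolved); // { maxEntries: 500, maxMemory: 104857600, defaultTTL: 120 }
```

Under this scheme, setting DEFAULT_TTL in the environment overrides the file's value, while unset options fall through to config.json and then to the defaults.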
Related MCP Servers
zen
Selfhosted notes app. Single golang binary, notes stored as markdown within SQLite, full-text search, very low resource usage
MCP-Deepseek_R1
A Model Context Protocol (MCP) server implementation connecting Claude Desktop with DeepSeek's language models (R1/V3)
mcp-fhir
A Model Context Protocol implementation for FHIR
mcp
Inkdrop Model Context Protocol Server
mcp-appium-gestures
This is a Model Context Protocol (MCP) server providing resources and tools for Appium mobile gestures using the Actions API.
dubco-npm
The (Unofficial) dubco-mcp-server enables AI assistants to manage Dub.co short links via the Model Context Protocol. It provides three MCP tools: create_link for generating new short URLs, update_link for modifying existing links, and delete_link for removing short links.