ib-cache
Memory Cache Server for use with supported MCP API Clients.
claude mcp add --transport stdio ibproduct-ib-mcp-cache-server node /path/to/ib-mcp-cache-server/build/index.js \
  --env MAX_MEMORY=104857600 \
  --env DEFAULT_TTL=3600 \
  --env MAX_ENTRIES=1000 \
  --env CHECK_INTERVAL=60000 \
  --env STATS_INTERVAL=30000

MAX_MEMORY is in bytes (104857600 = 100 MB), DEFAULT_TTL is in seconds (3600 = 1 hour), MAX_ENTRIES is the maximum number of items in the cache, and CHECK_INTERVAL and STATS_INTERVAL are in milliseconds (1 minute and 30 seconds, respectively).
How to use
This MCP server implements a memory cache that transparently stores frequently accessed data between interactions with language models. Once running, it automatically caches data such as file contents, computation results, and other frequently requested data, reducing token usage by avoiding re-sending identical information.

To use it, add the memory-cache MCP server to your MCP client configuration (as shown in the installation guide). The server operates behind the scenes; you don't need to modify your client calls beyond enabling the MCP server. You'll notice improved performance when making repeated requests for the same data or queries, since the cache serves them from memory while the entries are still valid.

The server's behavior is governed by a configurable cache policy (maxEntries, maxMemory, default TTL, and cleanup/stats intervals) and can be overridden via environment variables or a custom config file. In practice, this means reading the same file multiple times, repeating analyses, or navigating a project's structure will benefit from caching, leading to faster responses and reduced token consumption.
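The cache policy described above (an entry limit plus TTL-based expiry) can be illustrated with a minimal sketch. This is not the server's actual implementation; the class and field names below are invented for illustration, with the constructor parameters mirroring the MAX_ENTRIES and DEFAULT_TTL knobs:

```typescript
// Minimal in-memory TTL cache sketch. Illustrative only — not the
// server's real code. Eviction here is simple FIFO when the entry
// limit is hit, plus lazy expiry of stale entries on read.
type Entry<V> = { value: V; expiresAt: number };

class MemoryCache<V> {
  private store = new Map<string, Entry<V>>();

  constructor(
    private maxEntries = 1000,       // analogous to MAX_ENTRIES
    private defaultTtlMs = 3_600_000 // analogous to DEFAULT_TTL (here in ms)
  ) {}

  set(key: string, value: V, ttlMs = this.defaultTtlMs): void {
    // Evict the oldest entry when the entry limit is reached.
    if (this.store.size >= this.maxEntries && !this.store.has(key)) {
      const oldest = this.store.keys().next().value;
      if (oldest !== undefined) this.store.delete(oldest);
    }
    this.store.set(key, { value, expiresAt: Date.now() + ttlMs });
  }

  get(key: string): V | undefined {
    const entry = this.store.get(key);
    if (!entry) return undefined;        // miss
    if (Date.now() > entry.expiresAt) {  // expired: evict lazily
      this.store.delete(key);
      return undefined;
    }
    return entry.value;                  // hit
  }

  get size(): number {
    return this.store.size;
  }
}
```

A second read of the same key within the TTL is served from the Map rather than recomputed, which is the mechanism behind the token savings described above.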
How to install
Prerequisites:
- Node.js and npm installed
- Access to the repository (Git)
Steps:
- Clone the repository
git clone git@github.com:ibproduct/ib-mcp-cache-server.git
cd ib-mcp-cache-server
- Install dependencies
npm install
- Build the project
npm run build
- Add to your MCP client settings (example)
{
  "mcpServers": {
    "memory-cache": {
      "command": "node",
      "args": ["/path/to/ib-mcp-cache-server/build/index.js"]
    }
  }
}
- Start using the server – it will automatically start when you run your MCP client.
Additional notes
- You can override default cache settings with environment variables in your MCP client configuration. Supported vars include MAX_ENTRIES, MAX_MEMORY, DEFAULT_TTL, CHECK_INTERVAL, and STATS_INTERVAL.
- You can point the server to a custom config.json by setting CONFIG_PATH in the env; if not provided, the server uses built-in defaults.
- Common issues: ensure the build/index.js path is correct after building, and confirm the MCP client is configured to route relevant requests through memory-cache. If you don’t see a startup message like "Memory Cache MCP server running on stdio", check your client logs for connection errors.
- Monitor cache effectiveness via hit/miss statistics exposed by the server (via logs or metrics, depending on your setup).
- TTL-based eviction will remove older or unused items when memory pressure or entry limits are reached; tuning maxEntries and maxMemory helps prevent memory bloat.
- For large files or data, consider setting appropriate TTLs to balance freshness and cache longevity.
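Putting the tips above together, an MCP client entry that overrides the default cache settings might look like the following. The paths and values are illustrative; the environment variable names are those listed above:

```json
{
  "mcpServers": {
    "memory-cache": {
      "command": "node",
      "args": ["/path/to/ib-mcp-cache-server/build/index.js"],
      "env": {
        "MAX_ENTRIES": "5000",
        "MAX_MEMORY": "209715200",
        "DEFAULT_TTL": "7200",
        "CHECK_INTERVAL": "120000",
        "STATS_INTERVAL": "60000",
        "CONFIG_PATH": "/path/to/custom-config.json"
      }
    }
  }
}
```

Here MAX_MEMORY is set to 200 MB (in bytes) and DEFAULT_TTL to 2 hours (in seconds); CONFIG_PATH is only needed if you maintain a custom config.json, otherwise it can be omitted and the built-in defaults apply.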