Matryoshka
MCP server for token-efficient analysis of large documents using persistent REPL state
Add the server to Claude Code: claude mcp add --transport stdio yogthos-matryoshka lattice-mcp
How to use
Matryoshka includes an MCP server (lattice-mcp) that exposes its Lattice/Nucleus-based document analysis tooling to agents through a handle-based RPC interface. An agent can load a document, query it with Nucleus commands, expand handles to inspect results, and manage the session entirely on the server side. The server supports the following commands: lattice_load, lattice_query, lattice_expand, lattice_close, lattice_status, lattice_bindings, lattice_reset, and lattice_help. In typical usage, you start the MCP server and connect an agent, which issues Nucleus commands and receives compact handle references (e.g., $res1) instead of full payloads; this reduces bandwidth while still allowing powerful symbolic reasoning over large documents.
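The handle-based pattern described above can be sketched as follows. This is an illustrative mock, not Matryoshka's actual implementation: the class name HandleStore and its methods are invented for the example, and only the "$res1"-style handle format mirrors the server's behavior. Large results stay on the server; the client works with compact handles until it explicitly expands one.

```typescript
// Illustrative sketch of a handle-based result store (names are hypothetical).
class HandleStore {
  private store = new Map<string, unknown>();
  private counter = 0;

  // Store a large result server-side and return a compact handle like "$res1".
  put(value: unknown): string {
    const handle = `$res${++this.counter}`;
    this.store.set(handle, value);
    return handle;
  }

  // Expand a handle back into the full value (analogous to lattice_expand).
  expand(handle: string): unknown {
    if (!this.store.has(handle)) {
      throw new Error(`unknown handle: ${handle}`);
    }
    return this.store.get(handle);
  }
}

const store = new HandleStore();
const handle = store.put({ matches: ["chapter 3", "chapter 7"], count: 2 });
console.log(handle);               // "$res1" — what the agent receives
console.log(store.expand(handle)); // the full result, fetched only on demand
```

The agent's context only ever contains the short handle string until it decides the full data is worth the tokens.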
How to install
Prerequisites:
- Node.js and npm installed on your system
- Optional: Git if you want to clone the repository and build from source
From the npm package (recommended):
- Install the MCP-enabled Matryoshka tooling globally: npm install -g matryoshka-rlm
- Ensure the lattice-mcp binary is available in your PATH. The MCP server is provided as lattice-mcp and will be invoked by the MCP runtime.
- Start the MCP server via your MCP runner configuration (see mcp_config): lattice-mcp
From source (advanced):
- git clone https://github.com/yogthos/Matryoshka.git
- cd Matryoshka
- npm install
- npm run build
- Run the MCP server binary if provided in dist, or run lattice-mcp from the built package depending on your setup.
Configuration:
- Create a config.json for your MCP client that points to the lattice server (see Configuration section in README).
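As an illustration, an MCP client configuration pointing at the server might look like the sketch below (shown in the format used by Claude Desktop-style clients). The server name "matryoshka" is arbitrary, and the exact schema depends on your MCP client; consult the README's Configuration section for the authoritative example.

```json
{
  "mcpServers": {
    "matryoshka": {
      "command": "lattice-mcp",
      "args": []
    }
  }
}
```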
Additional notes
- The MCP server returns handle-based references (e.g., $res1) to large results to minimize data transfer. Expand handles server-side with lattice_expand when you need the full data.
- Ensure your document is loaded with lattice_load before querying, and close the session with lattice_close when done to free memory.
- If using providers, configure your LLM provider and options in config.json as shown in the README (ollama or deepseek examples).
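A provider section in config.json might resemble the following sketch. The field names here are purely illustrative assumptions, not the real schema; refer to the README's ollama and deepseek examples for the actual keys and values.

```json
{
  "provider": "ollama",
  "model": "llama3",
  "baseUrl": "http://localhost:11434"
}
```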
- The integration uses Nucleus commands; the LLM never executes arbitrary code on your system. Validation and type checking occur in the Lattice engine before execution.
- If you run into issues, verify that lattice-mcp is accessible in PATH and that any required environment variables for your provider are set (e.g., API keys for DeepSeek).
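For the troubleshooting steps above, a quick sanity check can be scripted. This is a generic shell snippet, not part of Matryoshka; the DEEPSEEK_API_KEY variable name is an assumption, so substitute whatever your provider requires.

```shell
#!/bin/sh
# Check that the lattice-mcp binary is reachable on PATH.
if command -v lattice-mcp >/dev/null 2>&1; then
  echo "lattice-mcp: found"
else
  echo "lattice-mcp: not on PATH"
fi

# Check that a provider API key is set (variable name may differ; see the README).
if [ -n "${DEEPSEEK_API_KEY:-}" ]; then
  echo "DEEPSEEK_API_KEY: set"
else
  echo "DEEPSEEK_API_KEY: missing"
fi
```

If the binary is missing, re-run npm install -g matryoshka-rlm and confirm npm's global bin directory is on your PATH.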