
slimcontext

MCP Server for SlimContext - AI chat history compression tools

Installation
Run this command in your terminal to add the MCP server to Claude Code:

claude mcp add --transport stdio agentailor-slimcontext-mcp-server npx -y slimcontext-mcp-server
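For MCP clients that are configured through a JSON file rather than a CLI, an equivalent entry might look like the sketch below. The `mcpServers` key and exact layout vary by client, and the server name `slimcontext` is just an illustrative label:

```json
{
  "mcpServers": {
    "slimcontext": {
      "command": "npx",
      "args": ["-y", "slimcontext-mcp-server"]
    }
  }
}
```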

How to use

SlimContext MCP Server provides two compression tools for MCP clients: trim_messages and summarize_messages. Both compress chat history while preserving system messages and the most recent user-assistant turns.

The trim_messages tool uses a token-based threshold: when the conversation exceeds the configured token limit, it removes the oldest non-system messages. This approach is fast and deterministic and makes no external API calls. The summarize_messages tool uses OpenAI to generate a concise summary of the middle portion of the conversation, preserving system messages and the most recent exchanges; for long dialogues this can significantly reduce context size while keeping the relevant context.

To use these tools, configure your MCP client to target the slimcontext MCP server and invoke the desired tool with the appropriate parameters (messages, thresholds, and an optional OpenAI model). Note that summarize_messages requires an OpenAI API key to generate AI-based summaries; you can supply the key via an environment variable or a tool parameter, depending on your client setup.
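As a rough illustration, a trim_messages call might take arguments shaped like the following. The message shape and parameter names (maxModelTokens, thresholdPercent, minRecentMessages) follow the defaults mentioned in the notes below, but treat the exact field names and values as assumptions to verify against the tool's published schema:

```json
{
  "messages": [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "An early question..."},
    {"role": "assistant", "content": "An early answer..."},
    {"role": "user", "content": "The latest question"}
  ],
  "maxModelTokens": 8192,
  "thresholdPercent": 0.7,
  "minRecentMessages": 2
}
```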

How to install

Prerequisites:

  • Node.js (a recent LTS release) installed on your machine
  • npm or pnpm installed
  • Optional: an OpenAI API key if you plan to use summarize_messages

Installation steps:

  1. Install the SlimContext MCP Server globally (or locally in your project): npm install -g slimcontext-mcp-server

     or, using pnpm: pnpm add -g slimcontext-mcp-server

  2. Verify the installation: slimcontext-mcp-server --version

  3. To run or develop locally, clone the repository, install dependencies, and start the server in development mode with your preferred package manager (see the Development section of the repository's README for the exact commands).


Additional notes

Environment variables and configuration tips:

  • OPENAI_API_KEY: If provided, enables summarize_messages to generate AI-based summaries. If omitted or invalid, summarize_messages may fail or fall back to a non-AI approach depending on client implementation.
  • When using the MCP client configuration, you can point to the slimcontext MCP server via npx as shown in the example. Ensure you have network access so npx can fetch the slimcontext-mcp-server package.
  • Tokens and model parameters (maxModelTokens, thresholdPercent, minRecentMessages) have sensible defaults in the tool definitions; adjust them based on your conversation length and desired compression aggressiveness.
  • The trim_messages tool preserves all system messages and the most recent messages, then trims the oldest non-system messages to stay under the token threshold.
  • The summarize_messages tool preserves system messages and the most recent messages, then generates a middle-section summary with OpenAI for longer contexts.
  • If you encounter OpenAI rate limits or API errors, check your API key, model availability, and quota. Consider lowering maxModelTokens or compression thresholds to stay within limits.
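The trimming behavior described above can be sketched in TypeScript. This is an illustrative reimplementation, not the server's actual code; in particular, the 4-characters-per-token estimate is a crude stand-in for real tokenization:

```typescript
type Message = { role: "system" | "user" | "assistant"; content: string };

// Crude token estimate: roughly 4 characters per token.
function estimateTokens(messages: Message[]): number {
  return messages.reduce((sum, m) => sum + Math.ceil(m.content.length / 4), 0);
}

// Drop the oldest non-system messages until the estimated token count
// falls under the budget, always keeping system messages and the
// `minRecent` most recent messages.
function trimMessages(
  messages: Message[],
  tokenBudget: number,
  minRecent: number
): Message[] {
  const result = [...messages];
  while (estimateTokens(result) > tokenBudget) {
    // Candidates: non-system messages outside the protected recent window.
    const idx = result.findIndex(
      (m, i) => m.role !== "system" && i < result.length - minRecent
    );
    if (idx === -1) break; // nothing left that is safe to remove
    result.splice(idx, 1);
  }
  return result;
}
```

For example, trimMessages(history, 50, 2) would keep the system prompt and the two most recent messages while discarding the oldest turns until the history fits the budget.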
