context-optimizer

A Model Context Protocol (MCP) server that provides context optimization tools for AI coding assistants, including GitHub Copilot, Cursor AI, Claude Desktop, and other MCP-compatible assistants. It enables them to extract targeted information rather than waste context processing large terminal outputs and files.

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio malaksedarous-context-optimizer-mcp-server npx -y context-optimizer-mcp-server \
  --env CONTEXT_OPT_EXA_KEY="Exa.ai API key" \
  --env CONTEXT_OPT_GEMINI_KEY="Gemini API key" \
  --env CONTEXT_OPT_LLM_PROVIDER="LLM provider (e.g., gemini, claude, openai)" \
  --env CONTEXT_OPT_ALLOWED_PATHS="Paths allowed for analysis (e.g., /path/to/projects)"

How to use

The context-optimizer server provides specialized tools that help AI assistants extract targeted information without overwhelming the chat context. It exposes tools for file analysis, terminal command extraction, follow-up questions, and focused web research: you can check for specific functions or imports in files, run commands and extract concise results, continue discussions about prior terminal outputs, and perform quick or deep web research to inform coding decisions. When integrated with an MCP client, the server is invoked via a simple server label (e.g., context-optimizer), and its tools become available to the assistant through structured tool calls. To get started, install the server globally, configure environment keys for your LLM provider, and reference the server in your MCP client configuration as shown in the Quick Start section of the README.

How to install

Prerequisites:

  • Node.js v18 or newer
  • npm (bundled with Node.js)
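To confirm the Node.js prerequisite, the version check can be sketched as a small POSIX-shell snippet (the version string below is a placeholder; in practice substitute the output of node --version):

```shell
# Extract the major version from a Node.js version string and compare to 18.
# "v18.19.0" is a placeholder; in practice use: ver="$(node --version)"
ver="v18.19.0"
major="${ver#v}"        # strip the leading "v"
major="${major%%.*}"    # keep only the major component
if [ "$major" -ge 18 ]; then
  echo "Node.js version OK"
else
  echo "Node.js v18 or newer is required"
fi
```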

Installation steps:

  1. Install the MCP server globally: npm install -g context-optimizer-mcp-server

  2. Set up environment variables (examples):

     export CONTEXT_OPT_LLM_PROVIDER="gemini"
     export CONTEXT_OPT_GEMINI_KEY="your-gemini-api-key"
     export CONTEXT_OPT_EXA_KEY="your-exa-api-key"
     export CONTEXT_OPT_ALLOWED_PATHS="/path/to/your/projects"

  3. Run the server (via MCP client configuration as shown in the README): npx -y context-optimizer-mcp-server

  4. Connect from your MCP client by referencing the server label (e.g., in claude_desktop_config.json or mcp.json):

     "context-optimizer": {
       "command": "context-optimizer-mcp"
     }

Note: If you prefer not to use npx, you can adapt your MCP client configuration to run the installed binary or entrypoint directly, via your package manager or the npm global bin path.
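Putting the steps above together, a claude_desktop_config.json entry that launches the server via npx might look like the sketch below (all key values are placeholders; the top-level mcpServers key follows Claude Desktop's configuration convention):

```json
{
  "mcpServers": {
    "context-optimizer": {
      "command": "npx",
      "args": ["-y", "context-optimizer-mcp-server"],
      "env": {
        "CONTEXT_OPT_LLM_PROVIDER": "gemini",
        "CONTEXT_OPT_GEMINI_KEY": "your-gemini-api-key",
        "CONTEXT_OPT_EXA_KEY": "your-exa-api-key",
        "CONTEXT_OPT_ALLOWED_PATHS": "/path/to/your/projects"
      }
    }
  }
}
```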

Additional notes

Tips and common configurations:

  • Ensure your LLM provider keys are kept secure and not exposed in logs or shared configurations.
  • The server is configured entirely through environment variables; no configuration files are required.
  • When testing, you can run npm test as described in the README to verify functionality, including LLM integration if you provide API keys.
  • For security, the server includes path validation, command filtering, and session management. Review and adjust allowed paths and command policies according to your environment.
  • If you encounter issues with CLI availability, confirm the global npm bin path is in your system PATH and that the installation completed successfully.
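For the last tip, the PATH check can be sketched as a small POSIX-shell helper (the directories below are illustrative; in practice pass "$(npm config get prefix)/bin" and "$PATH" as the arguments):

```shell
# Print "yes" if a directory appears as an exact entry in a PATH-style list.
contains_dir() {
  case ":$2:" in
    *":$1:"*) echo yes ;;
    *)        echo no ;;
  esac
}

# Illustrative values; in practice: contains_dir "$(npm config get prefix)/bin" "$PATH"
contains_dir "/usr/local/bin" "/usr/bin:/usr/local/bin:/bin"   # yes
contains_dir "/opt/npm/bin"   "/usr/bin:/usr/local/bin:/bin"   # no
```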
