context-optimizer
A Model Context Protocol (MCP) server that provides context-optimization tools for AI coding assistants, including GitHub Copilot, Cursor AI, Claude Desktop, and other MCP-compatible assistants, enabling them to extract targeted information instead of flooding their context with large terminal outputs and files.
```shell
claude mcp add --transport stdio malaksedarous-context-optimizer-mcp-server npx -y context-optimizer-mcp-server \
  --env CONTEXT_OPT_EXA_KEY="Exa.ai API key" \
  --env CONTEXT_OPT_GEMINI_KEY="Gemini API key" \
  --env CONTEXT_OPT_LLM_PROVIDER="LLM provider (e.g., gemini, claude, openai)" \
  --env CONTEXT_OPT_ALLOWED_PATHS="Paths allowed for analysis (e.g., /path/to/projects)"
```
How to use
This MCP server, context-optimizer, exposes specialized tools for file analysis, terminal command extraction, follow-up questions, and focused web research, so AI assistants can pull in targeted information without overwhelming the chat context. You can use the tools to check for specific functions or imports in files, run commands and extract concise results, continue discussions about earlier terminal output, and perform quick or deep web research to inform coding decisions. When integrated with an MCP client, the server is referenced by a simple server label (e.g., context-optimizer), and its tools become available to the assistant through structured tool calls. To get started, install the server globally, configure environment variables for your LLM provider, and reference the server in your MCP client configuration as shown in the Quick Start section of the README.
How to install
Prerequisites:
- Node.js v18 or newer
- npm (bundled with Node.js)
Installation steps:

1. Install the MCP server globally:

   ```shell
   npm install -g context-optimizer-mcp-server
   ```

2. Set up environment variables (examples):

   ```shell
   export CONTEXT_OPT_LLM_PROVIDER="gemini"
   export CONTEXT_OPT_GEMINI_KEY="your-gemini-api-key"
   export CONTEXT_OPT_EXA_KEY="your-exa-api-key"
   export CONTEXT_OPT_ALLOWED_PATHS="/path/to/your/projects"
   ```

3. Run the server (via MCP client configuration as shown in the README):

   ```shell
   npx -y context-optimizer-mcp-server
   ```

4. Connect from your MCP client by referencing the server label (e.g., in claude_desktop_config.json or mcp.json):

   ```json
   "context-optimizer": {
     "command": "context-optimizer-mcp"
   }
   ```
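For reference, a complete claude_desktop_config.json entry might look like the following sketch. The environment variable names come from this README; the `mcpServers` structure is the standard Claude Desktop config shape, and the exact command name and values depend on your installation:

```json
{
  "mcpServers": {
    "context-optimizer": {
      "command": "context-optimizer-mcp",
      "env": {
        "CONTEXT_OPT_LLM_PROVIDER": "gemini",
        "CONTEXT_OPT_GEMINI_KEY": "your-gemini-api-key",
        "CONTEXT_OPT_EXA_KEY": "your-exa-api-key",
        "CONTEXT_OPT_ALLOWED_PATHS": "/path/to/your/projects"
      }
    }
  }
}
```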
Note: If you prefer not to use npx, you can adapt your MCP configuration to run the installed binary or entrypoint directly, once it is available via your package manager or the npm global bin path.
Additional notes
Tips and common configurations:
- Ensure your LLM provider keys are kept secure and not exposed in logs or shared configurations.
- Configuration is done entirely through environment variables; no config files are required.
- When testing, you can run npm test as described in the README to verify functionality, including LLM integration if you provide API keys.
- For security, the server includes path validation, command filtering, and session management. Review and adjust allowed paths and command policies according to your environment.
- If you encounter issues with CLI availability, confirm the global npm bin path is in your system PATH and that the installation completed successfully.
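To confirm the npm global bin directory is on your PATH, a quick sketch (assuming a POSIX shell and an existing npm installation):

```shell
# Derive npm's global bin directory from its configured prefix
# (assumes npm is installed and on PATH)
NPM_BIN="$(npm config get prefix)/bin"

# POSIX-compatible check for whether that directory is already on PATH
case ":$PATH:" in
  *":$NPM_BIN:"*) echo "npm global bin ($NPM_BIN) is on PATH" ;;
  *)              echo "npm global bin ($NPM_BIN) is NOT on PATH" ;;
esac
```

If the directory is missing, add it to PATH in your shell profile (e.g., `export PATH="$NPM_BIN:$PATH"`), then reinstall or re-run the server.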
Related MCP Servers
systemprompt-code-orchestrator
MCP server for orchestrating AI coding agents (Claude Code CLI & Gemini CLI). Features task management, process execution, Git integration, and dynamic resource discovery. Full TypeScript implementation with Docker support and Cloudflare Tunnel integration.
mcp-memory-keeper
MCP server for persistent context management in AI coding assistants
sharedcontext
MCP server that gives AI coding assistants persistent cross-client memory. Facts and conversations stored in SQLite, encrypted with AES-256-GCM, synced to Arweave. No server, no account, recoverable with a 12-word phrase.
cco
Real-time audit and approval system for Claude Code tool calls.
mcp-install-instructions-generator
Generate MCP Server Installation Instructions for Cursor, Visual Studio Code, Claude Code, Claude Desktop, Windsurf, ChatGPT, Gemini CLI and more
wormhole
Wormhole: Collaborative AI Workflow Manager🌀