Context-Engine
Context-Engine MCP - Agentic Context Compression Suite
claude mcp add --transport stdio context-engine-ai-context-engine npx -y context-engine-ai-context-engine \
  --env CTX_HOST="optional custom host for MCP endpoints" \
  --env CTX_API_KEY="optional API key for hosted Context Engine services"
How to use
Context Engine provides AI agent skills that empower MCP-based coding assistants with semantic code search, memory, and symbol intelligence. The server exposes 30+ MCP tools for semantic search, symbol_graph navigation (finding callers, callees, definitions, and import relationships), memory storage and recall across sessions, and cross-repo analysis.

Typical workflows use search as the default tool to route queries to the best backend (semantic search, Q&A, symbol graphs, or diff/history queries), and leverage memory_store and memory_find to retain context across interactions. The server also supports batch queries (batch_search, batch_symbol_graph, batch_graph_query) for efficiency, cross_repo_search for multi-repo codebases, and pattern_search to discover structural patterns such as retry loops or singletons across languages.

To use Context Engine, start the MCP server and then invoke the provided MCP tools from your AI assistant or integration layer. The included SKILL files and rules guide your assistant in issuing the right tool calls for code search, understanding code structure, and maintaining context over time.
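Under the hood, each of these tools is invoked through MCP's JSON-RPC 2.0 protocol over stdio. The sketch below shows roughly what a raw call to the search tool looks like on the wire; the `query` argument name is an assumption, so check the server's tools/list response for the actual schema.

```shell
# Hedged sketch of an MCP tools/call request for the `search` tool.
# The argument schema ({"query": ...}) is an assumption, not confirmed
# by the package docs; your MCP client normally builds this for you.
SEARCH_CALL='{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"search","arguments":{"query":"where is retry logic implemented?"}}}'

# Illustration only -- a client would pipe this to the running server, e.g.:
#   echo "$SEARCH_CALL" | npx -y context-engine-ai-context-engine
echo "$SEARCH_CALL"
```

In practice you never hand-craft these messages; the assistant's MCP integration layer serializes tool calls for you, which is why the skills and rules files matter more than the wire format.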
How to install
Prerequisites:
- Node.js (LTS version) and npm installed on your machine
- Optional: a hosted Context Engine service or API key if you plan to connect to hosted capabilities
Installation and startup (using npm/npx):
- Install Node.js and npm from https://nodejs.org/
- Start the MCP server directly via npx (as defined in mcp_config):
npx -y context-engine-ai-context-engine
This will pull the Context Engine MCP server package and launch the server for MCP tool usage. If prompted for authentication or API keys for hosted features, provide them as environment variables or via your host configuration.
- If you prefer a local install, you can install the package globally and then run it from the command line:
npm install -g context-engine-ai-context-engine
context-engine-ai-context-engine
- Verify the server is listening on the expected MCP endpoint (default typically http://localhost:8000, or as configured by the package).
Note: If the project uses a different startup script or a custom entrypoint, replace the commands above with the specific path to server.js or the appropriate module invocation as described in the package documentation.
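The hosted-feature environment variables from the `claude mcp add` command above can also be exported before launching the server manually. A minimal sketch, assuming CTX_HOST and CTX_API_KEY are the only variables needed; both values shown are placeholders:

```shell
# Hedged sketch: export hosted-service settings before launching the server.
# Both values are placeholders -- substitute your actual host and key.
export CTX_HOST="https://context-engine.example.com"  # hypothetical hosted endpoint
export CTX_API_KEY="your-api-key-here"                # keep out of source control

# Then launch the server (commented out here so the sketch is inert):
#   npx -y context-engine-ai-context-engine
```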
Additional notes
Tips and common questions:
- If you operate behind a proxy or corporate network, ensure npm/npx can access the registry and the package. Set HTTP_PROXY/HTTPS_PROXY environment variables if needed.
- The server exposes a rich set of MCP tools (search, symbol_graph, memory_store, memory_find, cross_repo_search, pattern_search, search_commits_for, change_history_for_path, batch_* variants). Use the default search tool to route queries to the most suitable backend automatically.
- For hosted usage, you may need an API key or authentication; keep credentials in environment variables (e.g., CTX_API_KEY) and avoid committing them to source control.
- If you run into memory or performance issues when indexing large codebases, consider using batch queries and cross_repo_search with boundary tracing to optimize token usage and results.
- The repository note indicates the original source code may be unavailable; rely on the MCP bridge and the provided skills to integrate Context Engine with your assistant workflows.
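The proxy tip above can be sketched as follows; the proxy URL is a placeholder for your corporate proxy, and npm also honors its own `proxy`/`https-proxy` config keys as a persistent alternative:

```shell
# Hedged sketch: route npm/npx registry traffic through a corporate proxy.
# The proxy URL is a placeholder.
export HTTP_PROXY="http://proxy.corp.example:3128"
export HTTPS_PROXY="http://proxy.corp.example:3128"

# Equivalent persistent npm configuration (commented out so the sketch is inert):
#   npm config set proxy "$HTTP_PROXY"
#   npm config set https-proxy "$HTTPS_PROXY"
```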
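As a sketch of the batch-query tip above, several related queries can be folded into a single batch_search tool call to cut round trips on large codebases. The `queries` argument name is an assumption; verify the real parameter names against the server's tools/list output.

```shell
# Hedged sketch of a batch_search tools/call request. The {"queries": [...]}
# argument shape is an assumption about the tool's schema.
BATCH_CALL='{"jsonrpc":"2.0","id":2,"method":"tools/call","params":{"name":"batch_search","arguments":{"queries":["auth middleware","rate limiter","retry loop"]}}}'
echo "$BATCH_CALL"
```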
Related MCP Servers
dbhub
Zero-dependency, token-efficient database MCP server for Postgres, MySQL, SQL Server, MariaDB, SQLite.
octocode
MCP server for semantic code research and real-time context generation using LLM patterns | Search naturally across public and private repos based on your permissions | Transform any accessible codebase(s) into AI-optimized knowledge for simple and complex flows | Find real implementations and live docs from anywhere
bytechef
Open-source, AI-native, low-code platform for API orchestration, workflow automation, and AI agent integration across internal systems and SaaS products.
OpenContext
A personal context store for AI agents and assistants—reuse your existing coding agent CLI (Codex/Claude/OpenCode) with built‑in Skills/tools and a desktop GUI to capture, search, and reuse project knowledge across agents and repos.
penpot
Penpot MCP server
smart-coding
An extensible Model Context Protocol (MCP-Local-MRL-RAG-AST) server that provides intelligent semantic code search for AI assistants. Built with local AI models, inspired by Cursor's semantic search.