
contextplus

Semantic Intelligence for Large-Scale Engineering. Context+ is an MCP server designed for developers who demand 99% accuracy. By combining Tree-sitter AST parsing, Spectral Clustering, and Obsidian-style linking, Context+ turns a massive codebase into a searchable, hierarchical feature graph.

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio forloopcodes-contextplus bunx contextplus \
  --env OLLAMA_API_KEY="YOUR_OLLAMA_API_KEY" \
  --env OLLAMA_CHAT_MODEL="gemma2:27b" \
  --env OLLAMA_EMBED_MODEL="nomic-embed-text"
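After adding the server, you can confirm it was registered with the Claude Code CLI's `mcp` subcommands (a quick sanity check; exact output varies by CLI version):

```shell
# List all configured MCP servers, then show this server's details.
claude mcp list
claude mcp get forloopcodes-contextplus
```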

How to use

Context+ is an MCP server that provides semantic code understanding and navigation over large codebases. It combines structural analysis (Tree-sitter AST), semantic search, memory graphs, and Obsidian-style linking to produce a searchable, hierarchical feature graph of your code.

The server exposes a suite of tools organized into five categories: Discovery, Analysis, Code Ops, Version Control, and Memory & RAG. Typical usage is to run the server via a command like bunx contextplus (or npx -y contextplus) and then interact with it through your MCP client by issuing the available tool commands. This lets you discover code structure, perform semantic searches, trace symbol usage, run static analyses, and manage a linked graph of concepts and code artifacts.

Tools such as get_context_tree, semantic_code_search, and create_relation let you explore meaning rather than exact strings, while propose_commit and get_feature_hub support safe code changes and navigation hubs, respectively. Memory graph operations build richer connections between concepts, files, and symbols, improving long-term code comprehension across large projects.
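Because the server speaks MCP over stdio, you can smoke-test it outside any IDE by piping a raw JSON-RPC initialize request to it. This is a sketch assuming contextplus implements the standard MCP handshake; the protocolVersion value follows the MCP specification revision:

```shell
# Minimal stdio smoke test: send an MCP `initialize` request and read the reply.
# Assumes bunx is installed and contextplus follows the standard MCP handshake.
echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"probe","version":"0.0.0"}}}' \
  | bunx contextplus
```

A well-behaved server should answer with a JSON-RPC result describing its capabilities; if nothing comes back, check that the environment variables from the install command are set.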

How to install

Prerequisites:

  • Node.js and npm installed on your system.
  • bunx (Bun's package runner) or npx available to launch the server, as used in the Context+ setup command.

Installation steps:

  1. Ensure dependencies are installed: npm install

  2. Build the project (if building from source): npm run build

  3. Start the MCP server in your project directory using the recommended runtime: bunx contextplus

  4. If you prefer to initialize a config file for a specific IDE, use the provided init commands:
     npx -y contextplus init claude
     bunx contextplus init cursor
     npx -y contextplus init opencode

  5. Add your config to your IDE or MCP runtime as described in the README (e.g., .mcp.json for Claude Code, .vscode/mcp.json for VS Code, etc.).
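If you would rather write the config by hand, a typical .mcp.json entry for Claude Code looks roughly like this. This is a sketch: the server name and env values are taken from the install command above, but check the Context+ README for the exact schema it expects:

```json
{
  "mcpServers": {
    "forloopcodes-contextplus": {
      "command": "bunx",
      "args": ["contextplus"],
      "env": {
        "OLLAMA_API_KEY": "YOUR_OLLAMA_API_KEY",
        "OLLAMA_CHAT_MODEL": "gemma2:27b",
        "OLLAMA_EMBED_MODEL": "nomic-embed-text"
      }
    }
  }
}
```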

Additional notes

Tips and common issues:

  • Ensure your Ollama embeddings service is running and accessible if you rely on Ollama-based embedding models.
  • Set OLLAMA_API_KEY securely in your environment and never commit it to source control.
  • If you see environment-related errors, verify that Bunx/Node runtime can access the required environment variables and that the contextplus binary/module is correctly installed.
  • The MCP config supports multiple targets (claude, cursor, vscode, windsurf, opencode). Use the init command corresponding to your IDE to generate the correct config file in the expected location.
  • For large codebases, the semantic navigation features rely on embeddings and clustering; ensure your hardware (RAM/CPU) is adequate for the dataset size you’re indexing.
  • When using memory graph features, periodically prune stale links to keep the graph responsive and relevant.
  • If you switch IDEs, reuse the same underlying MCP server by pointing the different IDEs to the same mcp.json or by following the specific per-IDE config instructions.
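The secrets tip above can be made concrete: keep the key in a git-ignored .env file and export it into the shell that launches the server. A minimal POSIX-shell sketch; adapt the filename and launch command to your setup:

```shell
# Store the key in a local .env file that git never sees.
echo 'OLLAMA_API_KEY=your-key-here' >> .env
echo '.env' >> .gitignore

# Export every variable defined in .env, then start the server with it set.
set -a; . ./.env; set +a
bunx contextplus
```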
