context42
Your coding standards, always in context. An open-source MCP server with semantic search over your docs. 100% offline.
claude mcp add --transport stdio context42-io-context42 uvx context42-io
How to use
Context42 is an MCP server that gives your AI assistant local access to your own coding standards and documentation. It indexes your sources, stores vector embeddings locally, and exposes a search tool over MCP, so you can query your docs with semantic relevance and priority-based weighting. After adding sources and indexing them, you start the MCP server and use the included search tool from your client to retrieve relevant content chunks for your prompts. The server runs entirely offline; your data never leaves your machine. Context42 exposes a single search endpoint that returns each match's text, source, file path, similarity score, and a priority flag to help you balance personal notes against reference materials.
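To illustrate how a client might use that priority flag, here is a minimal sketch in Python. The result dictionaries and the 1.25 boost factor are hypothetical illustrations shaped after the fields described above, not part of Context42 itself:

```python
# Hypothetical client-side re-ranking of search results.
# The result shape (text, source, file, score, is_priority) mirrors the
# fields described above; the boost factor is an arbitrary example value.

def rerank(results, priority_boost=1.25):
    """Sort results by similarity score, boosting priority sources."""
    def weighted(r):
        boost = priority_boost if r["is_priority"] else 1.0
        return r["score"] * boost
    return sorted(results, key=weighted, reverse=True)

results = [
    {"text": "Use 4-space indents.", "source": "team-style",
     "file": "style.md", "score": 0.80, "is_priority": True},
    {"text": "PEP 8 indentation rules.", "source": "pep8",
     "file": "pep8.md", "score": 0.90, "is_priority": False},
]

ranked = rerank(results)
# With the boost, the personal "team-style" note (0.80 * 1.25 = 1.0)
# outranks the higher-scoring reference chunk (0.90).
```

The weighting scheme is your choice; the server only supplies the flag.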
Usage proceeds in three steps: add your documentation sources, index them, and serve the MCP endpoint that your AI assistant or application queries. The quick-start commands below mirror this workflow: add a local folder containing your standards, index its content into searchable chunks, and run the server to answer searches over MCP.
How to install
Prerequisites:
- Python 3.11 or newer
- Access to install Python packages (pip, pipx, or uv)
Installation options:
- Using pipx (recommended):
pipx install context42-io
- Using uv (uvx runs the tool in an ephemeral environment, no permanent install):
uvx context42-io
- Using pip (installs into the current Python environment):
pip install context42-io
After installation, you can start using the CLI (examples assume the default executable name provided by the package):
c42 add <path-to-docs> --name <source-name> [--priority <value>]
c42 index
c42 serve
Prerequisite notes:
- The first run may download model files for embeddings; make sure you have sufficient disk space and, if behind a proxy, configure your environment accordingly.
Additional notes
- Context42 stores data locally (embeddings, indexed chunks) using LanceDB; no data leaves your machine.
- You can change the embedding model via the C42_EMBEDDING_MODEL environment variable (default: BAAI/bge-small-en-v1.5). Changing the model requires re-indexing all sources.
- Data storage locations vary by OS (see the README under Data Storage) and can be customized by setting the C42_DATA_DIR env var if needed.
- For Claude Desktop integration, you can configure the context42 MCP server with the provided example by setting command to c42 and args to serve; restart Claude Desktop after changes.
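A Claude Desktop configuration along those lines might look like the following sketch; the "context42" server name is an assumption, and the exact file location depends on your OS (see Claude Desktop's documentation):

```json
{
  "mcpServers": {
    "context42": {
      "command": "c42",
      "args": ["serve"]
    }
  }
}
```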
- The MCP tool exposed by Context42 supports search(query: string, top_k?: int) -> SearchResult[]; each result includes text, source, file, score, priority, and is_priority.
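For readers integrating programmatically, the result shape listed above can be sketched as a plain Python dataclass. The field names follow the list above; the types are assumptions, and this is not a class shipped by Context42:

```python
from dataclasses import dataclass

# Sketch of the search result shape described above. Field types are
# assumptions inferred from the field names, not the library's own API.
@dataclass
class SearchResult:
    text: str          # matched content chunk
    source: str        # name of the source the chunk came from
    file: str          # path of the file within the source
    score: float       # similarity score from the vector search
    priority: int      # priority value assigned to the source
    is_priority: bool  # whether the source is flagged as priority

r = SearchResult(text="Prefer composition.", source="notes",
                 file="design.md", score=0.87, priority=10,
                 is_priority=True)
```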
- If you run into issues, common checks include ensuring the sources are accessible, re-indexing after changing embedding models, and verifying that the server is actively serving via the CLI.
Related MCP Servers
MCP-Bridge
A middleware that provides an OpenAI-compatible endpoint capable of calling MCP tools.
mcp-google-ads
An MCP tool that connects Google Ads with Claude AI, Cursor, and others, letting you analyze your advertising data through natural-language conversations. The integration gives you access to campaign information, performance metrics, keyword analytics, and ad management through simple chat with Claude, Cursor, or Windsurf.
mcp-rquest
An MCP server providing realistic browser-like HTTP request capabilities with accurate TLS/JA3/JA4 fingerprints for bypassing anti-bot measures. It also supports converting PDF and HTML documents to Markdown for easier processing by LLMs.
asterisk
Asterisk Model Context Protocol (MCP) server.
classmcp
MCP server for AI-assisted CSS development. 77% token savings with semantic class patterns. Supports Tailwind, Bootstrap, UnoCSS, Tachyons.
rednote-analyzer
An MCP server that lets AI assistants search, analyze, and generate Xiaohongshu (小红书) content with real-time data via browser automation.