context-lens
Semantic search knowledge base for MCP-enabled AI assistants. Index local files or GitHub repos, query with natural language. Built on LanceDB vector storage. Works with Claude Desktop, Cursor, and other MCP clients.
claude mcp add --transport stdio cornelcroi-context-lens uvx context-lens
How to use
Context Lens provides a self-contained MCP server that builds a local, searchable knowledge base from your content using LanceDB as the embedded vector store. It supports semantic search across code, docs, contracts, and text by converting content into meaningful vector representations and performing concept-based retrieval rather than simple keyword matching. You can interact with it through any MCP client (for example Claude Desktop, Continue.dev, or custom MCP integrations) by pointing the client at the context-lens server using the uvx-based command in your MCP configuration. The server exposes capabilities for indexing content, performing semantic queries, and retrieving relevant chunks that form the basis of answers, all while keeping data local on your machine. The embedding model uses a local pipeline to generate vectors, ensuring privacy and zero external API calls.
To use it, connect via an MCP client with the appropriate server specification. For example, in Kiro IDE or Cursor, you’ll configure the mcpServers entry for context-lens with command uvx and the argument context-lens. Once connected, you can add content to the knowledge base, perform semantic searches, and have the agent retrieve relevant chunks and respond with contextual answers. The system supports asking questions like how a particular authentication flow works across code and docs, and it will surface meaningfully related content rather than just exact word matches.
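The concept-based retrieval described above can be sketched in a few lines. This is a toy illustration only, not Context Lens's actual implementation: the `embed()` below is a hypothetical bag-of-words stand-in for the real local embedding model, and a real model would also place related concepts (e.g. "login" and "authentication") close together in vector space.

```python
import math
import re
from collections import Counter

def embed(text: str) -> Counter:
    # Toy "embedding": a bag-of-words count vector. A real embedding
    # model maps semantically related text to nearby dense vectors.
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def search(query: str, docs: list[str]) -> list[tuple[float, str]]:
    # Rank documents by similarity to the query, best match first.
    q = embed(query)
    return sorted(((cosine(q, embed(d)), d) for d in docs), reverse=True)

docs = [
    "def login(user, password): verify credentials and issue a token",
    "chunk size and overlap control how documents are split",
]
results = search("how does login verify a password", docs)
```

Even with this crude stand-in, the authentication-related snippet ranks first because scoring is based on vector similarity over the whole text rather than a single exact keyword match.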
How to install
Prerequisites:
- Python 3.11 or higher installed on your system
- The uvx command, which ships with the uv Python package manager, or the ability to install it via your preferred method
Installation steps:
- Install Python 3.11+ from https://www.python.org/downloads/
- Install uvx if you don't already have it. uvx is provided by the uv package manager, so installing uv gives you the command. Example (adjust to your environment):
  pipx install uv  # or follow the uv installation guide for your platform
- Ensure uvx is available in your PATH
- Optionally install Context Lens from PyPI. This is not strictly required, since uvx can download the context-lens package on demand the first time it runs:
  pip install context-lens
- Start the MCP server using uvx:
  uvx context-lens
- Verify the server starts and that your MCP clients can reach it. The server communicates over stdio, so there is no network port to check; consult your MCP client's documentation for pointing it at the context-lens server (e.g., through the .mcp.json files used by Kiro, Cursor, or Claude Desktop).
Optional configuration (recommended):
- Create or edit your MCP client configuration (e.g., .mcp.json) to include:

  {
    "mcpServers": {
      "context-lens": {
        "command": "uvx",
        "args": ["context-lens"]
      }
    }
  }

- If your setup supports autoApprove actions, you can enable them by adding, e.g., "autoApprove": ["list_documents", "search_documents"] to the context-lens entry.
Note: The exact installation steps for uvx can vary by environment. If your system provides a packaged uvx or a containerized option, prefer that method and adapt the CLI command accordingly.
Additional notes
Tips and common issues:
- Data locality: Context Lens stores embeddings and the LanceDB database on disk. Ensure the storage path is writable and has enough space for your content and embeddings.
- Model and performance: The embedding model produces 384-dimensional vectors and runs locally. If you experience performance issues, consider adjusting the chunk size (default 1000 characters) and overlap (default 200 characters) to suit your content and hardware.
- Content parsing: The system uses language-aware parsers (e.g., AST for Python, structural parsing for JSON, header-based splitting for Markdown). Ensure your content is in a readable format for best results.
- Content indexing workflow: You can index local files or public GitHub repositories. To index a new repository, add its path/URL to the LanceDB-backed knowledge base using your MCP client tools.
- Security and privacy: All data remains local. If you share a machine or repository, consider encrypting the LanceDB store or restricting access to the host.
- Environment variables: Tokens and other environment-based configuration can be passed via your MCP client or your shell environment. The default setup shown here requires no environment variables beyond what your client needs to connect.
- Registry and verification: Context Lens is published under the MCP registry as io.github.cornelcroi/context-lens. See REGISTRY.md for installation verification guidance.
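The chunk-size and overlap settings above follow the usual sliding-window pattern: consecutive chunks share a fixed number of trailing/leading characters so that content near a boundary still appears intact in at least one chunk. A minimal sketch of that idea, using the stated defaults (1000-character chunks, 200-character overlap) — note that Context Lens's real splitter is language-aware, so this fixed-size version is an illustration only:

```python
def chunk_text(text: str, chunk_size: int = 1000, overlap: int = 200) -> list[str]:
    """Split text into fixed-size chunks where each chunk overlaps
    the previous one by `overlap` characters."""
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    step = chunk_size - overlap  # how far the window advances each time
    chunks = []
    for start in range(0, len(text), step):
        chunks.append(text[start:start + chunk_size])
        if start + chunk_size >= len(text):
            break  # final window already covers the end of the text
    return chunks

# 2500 characters -> windows starting at 0, 800, and 1600
text = "".join(str(i % 10) for i in range(2500))
chunks = chunk_text(text)
```

Larger overlap improves recall for content straddling chunk boundaries at the cost of index size and indexing time, which is why tuning both values to your content and hardware is worthwhile.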