context-harness
Local-first context ingestion and retrieval for AI tools. SQLite + embeddings + MCP server for Cursor & Claude.
claude mcp add --transport stdio parallax-labs-context-harness ctx serve mcp
How to use
Context Harness exposes an MCP-compatible HTTP server that lets tools like Cursor and Claude query a local, searchable context store built from your connectors (filesystems, Git repos, S3, Lua scripts, etc.). Start the server with the provided CLI command; it serves the endpoint at a local address and port, http://127.0.0.1:7331/mcp by default. The /mcp endpoint accepts JSON-RPC-style requests to fetch context, run searches, and retrieve documents; REST-style equivalents are available under /tools. To integrate with Cursor, add a server entry to .cursor/mcp.json with the context-harness server name and URL, so your AI tools can query your local data store as part of their toolset.
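As a sketch, a JSON-RPC request to the local endpoint could look like the following; tools/list is the standard MCP method for enumerating a server's tools, but the exact tool set returned is server-specific:

```shell
# Query the local MCP endpoint over HTTP (assumes the server is already
# running via `ctx serve mcp` on the default address and port).
curl -s -X POST http://127.0.0.1:7331/mcp \
  -H 'Content-Type: application/json' \
  -d '{"jsonrpc": "2.0", "id": 1, "method": "tools/list", "params": {}}' \
  || echo "server not reachable at 127.0.0.1:7331"
```

The response is a JSON-RPC result object describing the available tools, which you can then invoke via tools/call or the REST endpoints under /tools.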
How to install
Prerequisites:
- A supported platform (Linux, macOS, Windows) with a compatible shell.
- Optional: Rust toolchain if you plan to build from source.
- Internet access to download binaries or dependencies.
Install from releases (recommended):
- Download the latest release binary for your OS from the project releases page.
- Place the binary in your PATH, e.g.: curl -L <release-url> | tar xz && sudo mv ctx /usr/local/bin/
- Verify installation: ctx --version
Install from source (advanced):
- Ensure Rust and Cargo are installed.
- From a clone of the repository, install the binary: cargo install --path crates/context-harness
- Verify installation: ctx --version
Run the MCP server:
- Start the server (example): ctx serve mcp
- Then configure your client (e.g., Cursor) to point to http://127.0.0.1:7331/mcp by adding:
  {
    "mcpServers": {
      "context-harness": { "url": "http://127.0.0.1:7331/mcp" }
    }
  }
Notes:
- If you customize the port or host, ensure your clients and .cursor/mcp.json reflect the new URL.
- You can run in a development/sandbox mode or point to a production-like setup as needed.
Additional notes
- The MCP server exposes context via HTTP; typical workflows involve building an index via ctx init, ctx sync, and then starting the MCP server with ctx serve mcp.
- Ensure your embeddings pipeline (local embeddings, Ollama, or OpenAI) is configured in your ctx.toml to enable semantic retrieval where needed.
- If you encounter networking or CORS issues, verify that the MCP server host/port are accessible from your client environment and that firewalls allow traffic to the configured port.
- For production deployments, consider running behind a reverse proxy and using TLS; adjust the server URL in your client configurations accordingly.
- The default API surface includes /mcp (JSON-RPC), /tools/search (REST), and /tools/get (REST); familiarize yourself with these endpoints for integration with tools like Cursor and Claude.
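For the embeddings pipeline, the relevant section of ctx.toml might look something like the fragment below. The table and key names here are illustrative assumptions, not the tool's confirmed schema; check the project's configuration reference for the real keys:

```toml
# Hypothetical ctx.toml fragment -- section and key names are assumptions.
[embeddings]
provider = "ollama"                  # or "local", "openai"
model = "nomic-embed-text"
endpoint = "http://127.0.0.1:11434"  # Ollama's default local address
```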
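Putting the workflow above together, a minimal end-to-end session might look like the sketch below. The request body fields passed to /tools/search ("query", "limit") are assumptions for illustration; consult the server's tool schema for the exact field names:

```shell
# Typical workflow: initialize a store, ingest sources, serve, then query.
# (Guarded so the sketch is a no-op if the ctx binary is not installed.)
if command -v ctx >/dev/null 2>&1; then
  ctx init          # create the local store and config
  ctx sync          # ingest/refresh content from configured connectors
  ctx serve mcp &   # start the MCP server on 127.0.0.1:7331
fi

# REST-style search against the running server; the body shape is an
# assumption -- check the tool schema for confirmed field names.
curl -s -X POST http://127.0.0.1:7331/tools/search \
  -H 'Content-Type: application/json' \
  -d '{"query": "authentication flow", "limit": 5}' \
  || echo "server not reachable at 127.0.0.1:7331"
```

The same search is available through the /mcp JSON-RPC endpoint via a tools/call request; the REST route is simply the more convenient form for scripting.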
Related MCP Servers
mcp-pinecone
Model Context Protocol server for reading from and writing to Pinecone. Rudimentary RAG.
Pare
Dev tools, optimized for agents. Structured, token-efficient MCP servers for git, test runners, npm, Docker, and more.
omega-memory
Persistent memory for AI coding agents
knowledgegraph
MCP server for enabling persistent knowledge storage for Claude through a knowledge graph with multiple storage backends and fuzzy search
cursor-feedback-extension
Save your Cursor monthly quota! Unlimited AI interactions in one conversation via MCP feedback loop.
shodan
Shodan MCP server for Claude, Cursor & VS Code. 20 tools for passive reconnaissance, CVE/CPE intelligence, DNS analysis, and device search. 4 tools work free without an API key. OSINT and vulnerability research from your IDE.