
context-harness

Local-first context ingestion and retrieval for AI tools. SQLite + embeddings + MCP server for Cursor & Claude.

Installation
Run the following command in your terminal to add the MCP server to Claude Code:

claude mcp add --transport stdio parallax-labs-context-harness ctx serve mcp

How to use

Context Harness exposes an MCP-compatible HTTP server that lets tools like Cursor and Claude query a local, searchable context store built from your connectors (filesystems, Git repos, S3, Lua scripts, etc.).

Run the MCP server via the provided CLI command; it serves an HTTP endpoint at a local address and port (by default http://127.0.0.1:7331/mcp). The MCP API accepts JSON-RPC-style requests to fetch context, run searches, and retrieve documents. You can also use the REST-style endpoints under /tools for search and retrieval.

To integrate with Cursor, add a server entry to .cursor/mcp.json with the context-harness server name and URL, so your AI tools can query your local data store as part of their toolset.
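A minimal sketch of the .cursor/mcp.json entry described above, assuming the default host and port and the server name context-harness (adjust both to match your setup and Cursor version):

```json
{
  "mcpServers": {
    "context-harness": {
      "url": "http://127.0.0.1:7331/mcp"
    }
  }
}
```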

How to install

Prerequisites:

  • A supported platform (Linux, macOS, Windows) with a compatible shell.
  • Optional: Rust toolchain if you plan to build from source.
  • Internet access to download binaries or dependencies.

Install from releases (recommended):

  1. Download the latest release binary for your OS from the project releases page.
  2. Place the binary in your PATH, e.g.:
     curl -L <release-url> | tar xz
     sudo mv ctx /usr/local/bin/
  3. Verify installation: ctx --version

Install from source (advanced):

  1. Ensure Rust and Cargo are installed.
  2. From a clone of the repository, install the binary: cargo install --path crates/context-harness
  3. Verify installation: ctx --version

Run the MCP server:

  ctx serve mcp

By default this serves the MCP endpoint at http://127.0.0.1:7331/mcp.

Notes:

  • If you customize the port or host, ensure your clients and .cursor/mcp.json reflect the new URL.
  • You can run in a development/sandbox mode or point to a production-like setup as needed.

Additional notes

  • The MCP server exposes context via HTTP; typical workflows involve building an index via ctx init, ctx sync, and then starting the MCP server with ctx serve mcp.
  • Ensure your embeddings pipeline (local embeddings, Ollama, or OpenAI) is configured in your ctx.toml to enable semantic retrieval where needed.
  • If you encounter networking or CORS issues, verify that the MCP server host/port are accessible from your client environment and that firewalls allow traffic to the configured port.
  • For production deployments, consider running behind a reverse proxy and using TLS; adjust the server URL in your client configurations accordingly.
  • The default API surface includes /mcp (JSON-RPC), /tools/search (REST), and /tools/get (REST); familiarize yourself with these endpoints for integration with tools like Cursor and Claude.
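To illustrate the /mcp endpoint described above, here is a small Python sketch that builds and sends a JSON-RPC 2.0 request. This is an assumption-laden example, not the project's documented client API: the tool name "search" is inferred from the /tools/search REST endpoint, and the "tools/call" method follows the standard MCP convention; query tools/list on your server to confirm the actual tool names.

```python
import json
import urllib.request

# Default endpoint from the notes above; change if you customized host/port.
MCP_URL = "http://127.0.0.1:7331/mcp"


def make_search_request(query: str, request_id: int = 1) -> bytes:
    """Build a JSON-RPC 2.0 'tools/call' payload.

    The tool name 'search' is a hypothetical guess based on the
    /tools/search REST endpoint; verify it via a 'tools/list' call.
    """
    payload = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": "search", "arguments": {"query": query}},
    }
    return json.dumps(payload).encode("utf-8")


def send(body: bytes) -> dict:
    """POST the payload to the MCP endpoint and decode the JSON response."""
    req = urllib.request.Request(
        MCP_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())


# Usage (requires a running server started with `ctx serve mcp`):
#   result = send(make_search_request("embedding pipeline"))
```

The same search can be issued against the REST-style /tools/search endpoint instead; the JSON-RPC form is what MCP clients such as Cursor and Claude use internally.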
