limps

limps is your Local Intelligent MCP Planning Server, shared across AI assistants. No subscriptions, no cloud: it runs entirely on your machine. Version-control your planning docs in git. No more context drift: one shared source of truth across Claude, Cursor, Copilot, and any MCP-compatible tool.
```shell
claude mcp add --transport stdio paulbreuler-limps npx -y @sudosandwich/limps server start \
  --env PORT="4269" \
  --env LIMPS_LOG="<verbosity of limps logs (optional)>"
```
How to use
limps is a local MCP server that provides a planning and document management layer for AI assistants. It runs entirely on your machine and points at any folder you choose, indexing Markdown documents and enabling clients to read, update, or create plans, tasks, and docs via a unified HTTP transport. Once the server is running, clients connect to http://127.0.0.1:4269/mcp and start exchanging MCP payloads. The CLI tools that ship with limps (like creating plans, listing docs, or updating task statuses) operate against the same local index so multiple clients can share a single source of truth. You can configure limps per-project with per-folder configs, and you can drive tooling via the MCP APIs exposed over HTTP for Cursor, Claude, Codex, or any MCP-compatible client.
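The MCP exchange over HTTP is plain JSON-RPC 2.0. As a rough sketch (the `clientInfo` values and advertised capabilities below are placeholders, not something limps prescribes), the first request a client POSTs to http://127.0.0.1:4269/mcp looks roughly like this:

```python
import json

# Minimal JSON-RPC 2.0 "initialize" request, the first message an MCP
# client sends when it connects to a server endpoint such as
# http://127.0.0.1:4269/mcp. Field values here are illustrative.
initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",  # an MCP protocol revision date
        "capabilities": {},               # client advertises its features here
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

# Serialize for the POST body; a real client sends this with
# Content-Type: application/json and reads the JSON-RPC response.
body = json.dumps(initialize_request)
print(body[:60])
```

After a successful handshake, the client can list and call the server's tools (plan creation, doc search, task updates) through the same request/response shape.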
Key capabilities include: document CRUD with full-text search across your Markdown folder, plan and agent workflows with status tracking and scoring, next-task suggestions with bias tuning, sandboxed document processing helpers, and a knowledge graph to help reason about entities and relationships. Limps also supports extensions and keeps the data on your disk (no cloud by default). To start using it, install the CLI globally, initialize in your target folder, and then start the HTTP daemon. Clients can then print a client config snippet and connect to the limps MCP endpoint.
How to install
Prerequisites:
- Node.js (LTS version) and npm installed on your system
- Access to install global npm packages
Step-by-step installation:

1. Install the limps CLI globally:

   ```shell
   npm install -g @sudosandwich/limps
   ```

2. Initialize limps in your documents folder (choose or create a folder to index):

   ```shell
   mkdir -p ~/Documents/my-planning-docs
   cd ~/Documents/my-planning-docs
   limps init
   ```

3. Start the HTTP MCP daemon:

   ```shell
   limps server start
   ```

4. If needed, print the client config for an MCP client (e.g., Claude) and paste it into your client configuration:

   ```shell
   limps config print --client claude-code
   ```
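For reference, MCP client configs generally follow the `mcpServers` shape sketched below; the exact snippet that `limps config print` emits may differ, and the server name `limps` here is just a placeholder key:

```json
{
  "mcpServers": {
    "limps": {
      "url": "http://127.0.0.1:4269/mcp"
    }
  }
}
```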
Notes:
- The server runs HTTP on port 4269 by default. You can override with an environment variable or a config option if supported by your setup.
- The limps CLI is used for both server management and MCP tooling; you do not need any separate server script.
Additional notes
Tips and common considerations:
- No git repo required: limps indexes any folder with Markdown files, not just those in a repository.
- If you change the indexed folder or configuration, restart the MCP server to refresh the index and tools.
- The HTTP transport is the recommended method for MCP clients. If you upgrade from an older stdio-based setup, update client configs to use the HTTP transport snippet generated by limps config print.
- The server and tools are designed for local-first usage. Limps emphasizes keeping data on disk and avoiding external network calls unless you opt into version checks or other optional features.
- For troubleshooting, check the limps server status and logs, ensure that port 4269 is not blocked by a firewall, and verify that the client config uses the correct HTTP URL: http://127.0.0.1:4269/mcp
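As a quick connectivity check (a hedged sketch, not a limps-provided tool), you can probe the endpoint from Python; a refused connection usually means the daemon isn't running or is listening on a different port:

```python
import socket

def endpoint_reachable(host: str = "127.0.0.1", port: int = 4269,
                       timeout: float = 1.0) -> bool:
    """Return True if something is accepting TCP connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if endpoint_reachable():
    print("limps endpoint is accepting connections on 127.0.0.1:4269")
else:
    print("nothing listening on 127.0.0.1:4269; is `limps server start` running?")
```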
Related MCP Servers
Wax
Sub-Millisecond RAG on Apple Silicon. No Server. No API. One File. Pure Swift
ai-trader
Backtrader-powered backtesting framework for algorithmic trading, featuring 20+ strategies, multi-market support, CLI tools, and an integrated MCP server for professional traders.
mcp
🤖 Taskade MCP · Official MCP server and OpenAPI to MCP codegen. Build AI agent tools from any OpenAPI API and connect to Claude, Cursor, and more.
snippy
🧩 Build AI-powered MCP Tools with Azure Functions, Durable Agents & Cosmos vector search. Features orchestrated multi-agent workflows using OpenAI.
mcp-js
MCP server that exposes YepCode processes as callable tools for AI platforms. Securely connect AI assistants to your YepCode workflows, APIs, and automations.
rod
Model Context Protocol Server of Rod