# runi

See the truth about your APIs.

runi is an open-source API client that verifies what AI generates. When 46% of developers don't trust AI output (Stack Overflow 2025 Developer Survey), you need a second opinion. runi is that opinion.
```shell
claude mcp add --transport stdio paulbreuler-runi node dist/server.js \
  --env RUNI_ENV="development" \
  --env RUNI_PORT="3000"
```
## How to use
runi provides an MCP-backed control surface for interacting with and verifying API integrations. The MCP server component exposes event-driven UI actions, real-time streaming of canvas state via SSE, and AI activity/provenance signals so agent-driven actions remain observable. Use the server to bind requests and AI-generated code to your API specifications, perform drift detection against OpenAPI or other specs, and inspect verifications that confirm AI-generated integrations actually conform to your declared API surface. The project emphasizes a local-first, git-friendly workflow where your data stays on your machine and can be versioned alongside your code.
Once running, you can interact with the MCP surface to trigger UI actions, observe real-time state changes, and review provenance signals generated by AI-assisted actions. Use drift detection to compare your requests against bound specs, and use the AI verification workflow to paste in AI-generated code and validate it against those specs. This helps catch mismatches, deprecated fields, and spec drift before they reach production.
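As an illustration of the kind of mismatch drift detection can surface (a hypothetical sketch, not runi's actual API), the check below compares a request's query parameters against the parameters declared in an OpenAPI operation:

```typescript
// Hypothetical sketch: flag request query parameters that a bound
// OpenAPI operation does not declare. Names and shapes here are
// illustrative; runi's real drift detection works against your bound specs.
type OpenApiOperation = { parameters?: { name: string; in: string }[] };

function findUndeclaredParams(
  op: OpenApiOperation,
  requestQuery: Record<string, string>
): string[] {
  const declared = new Set(
    (op.parameters ?? [])
      .filter((p) => p.in === "query")
      .map((p) => p.name)
  );
  // Anything in the request that the spec does not declare is drift.
  return Object.keys(requestQuery).filter((name) => !declared.has(name));
}

// Example: the spec declares only `limit`; the request also sends `pageSize`.
const op: OpenApiOperation = {
  parameters: [{ name: "limit", in: "query" }],
};
console.log(findUndeclaredParams(op, { limit: "10", pageSize: "5" }));
// reports ["pageSize"] as undeclared
```

The same idea generalizes to response fields and deprecated properties: the spec is the source of truth, and anything the AI-generated request adds beyond it is a candidate for review.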
## How to install
Prerequisites:
- Git installed
- A supported runtime (Node.js environment assumed for this MCP server example; adapt if you’re using a container or other runtime)
- Optional: Docker if you prefer containerized setup
Install and run (Node-based workflow):

- Clone the repository:

  ```shell
  git clone https://github.com/paulbreuler/runi.git
  cd runi
  ```

- Install dependencies:

  ```shell
  npm install
  ```

- Build (if using a build step):

  ```shell
  npm run build
  ```

- Run the MCP server:

  ```shell
  npm run start
  ```
Alternatively, if you prefer Docker:

- Build the image (example):

  ```shell
  docker build -t runi:latest .
  ```

- Run the container (example):

  ```shell
  docker run -it -p 3000:3000 runi:latest
  ```
Configure environment variables as needed (see additional notes):
- RUNI_ENV: development|production
- RUNI_PORT: port number for the MCP server
- Other feature flags can be added as needed by editing the environment or config files.
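For example, environment variables can be set inline when starting the server (a config fragment assuming the variables above; check the repo's config references for the full list of flags):

```shell
# Run the MCP server on a non-default port in production mode
RUNI_ENV=production RUNI_PORT=8080 npm run start
```
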
## Additional notes
Tips and caveats:
- Since runi emphasizes local-first operation, ensure your configuration and data directories are backed up with your project files.
- If you encounter drift-related alerts, verify the bound spec (OpenAPI or similar) your requests are associated with and refresh bindings as specs evolve.
- The MCP surface supports real-time updates via SSE; ensure your network allows EventSource connections if you’re running behind proxies.
- If you plan to deploy in production, consider securing the MCP endpoints and enabling proper authentication, since MCP surfaces can control UI actions and reflect AI provenance signals.
- Typical environment variables you may encounter include RUNI_ENV, RUNI_PORT, and feature-toggle flags documented in the repo’s config references.
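When debugging SSE behind a proxy, it can help to look at the raw `text/event-stream` frames. The parser below is a minimal sketch of the standard SSE wire format (events separated by blank lines, payloads on `data:` lines); the payload contents shown are placeholders, not runi's documented event schema:

```typescript
// Minimal SSE frame parser sketch: extracts `data:` payloads from a
// text/event-stream chunk. The wire format is standard SSE; the payloads
// themselves are runi-specific and merely illustrative here.
function parseSseChunk(chunk: string): string[] {
  return chunk
    .split("\n\n") // events are separated by a blank line
    .map((evt) =>
      evt
        .split("\n")
        .filter((line) => line.startsWith("data:"))
        .map((line) => line.slice(5).trim())
        .join("\n")
    )
    .filter((data) => data.length > 0);
}

// Example chunk as it might appear on the wire:
const chunk = 'data: {"canvas":"updated"}\n\ndata: {"ai":"provenance"}\n\n';
console.log(parseSseChunk(chunk));
// yields ['{"canvas":"updated"}', '{"ai":"provenance"}']
```

If a proxy buffers or rewrites the stream, the payloads you see here will diverge from what the server sent, which is a quick way to confirm an EventSource connectivity problem.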