SpecLoom
Spec-Driven Development Framework for Human-AI Design & Development using V-Model
claude mcp add --transport stdio kpruntov-specloom npx -y specloom serve
How to use
SpecLoom is an MCP server and CLI that enforces the V-Model for AI-generated code, acting as a guardian to prevent hallucinations, ensure traceability, and require that every line of code maps to a documented requirement. It exposes a set of MCP tools designed to work with AI agents to manage context, design, code, and verification across a project. When integrated with agents such as Gemini CLI, Claude Desktop, Cursor, Windsurf, or Cline, SpecLoom provides structured Context Bundles (Requirements + Design + Code) and applies a four-eyes review process to maintain enterprise-grade compliance. The available MCP tools are:
- loom_next: project planning prompts
- loom_context: specs and code retrieval for a task
- loom_validate: quality and consistency checks
- loom_verify: traceability and requirement-conformance verification
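Once the server is connected, an agent invokes these tools through the standard MCP `tools/call` request. The request below is an illustrative sketch only: the argument name `task_id` is an assumption, not a documented parameter of loom_context.

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "loom_context",
    "arguments": {
      "task_id": "TASK-042"
    }
  }
}
```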
How to install
Prerequisites:
- Node.js and npm installed on your system
Step-by-step installation:
- Install the SpecLoom CLI globally via npm:
npm install -g specloom
- Confirm installation:
specloom --version
- Start the MCP server. If you use the provided MCP config, your MCP orchestrator will invoke it for you as:
npx -y specloom serve
- If you’re integrating with an agent, add the SpecLoom MCP server under the mcpServers configuration (see mcp_config).
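For agents that read an `mcpServers` block (Claude Desktop, Cursor, Cline, and similar), a minimal configuration could look like the following sketch. The server key `specloom` is an arbitrary label; the command and arguments mirror the `npx -y specloom serve` invocation above.

```json
{
  "mcpServers": {
    "specloom": {
      "command": "npx",
      "args": ["-y", "specloom", "serve"]
    }
  }
}
```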
Additional notes
Tips and considerations:
- Integration: SpecLoom is designed to work with various AI agents (Gemini CLI, Claude Desktop, Cursor, Windsurf, Cline). Use the loom_* tools to guide agents through context, design, and verification workflows.
- Four-Eyes: Ensure proper identity separation to prevent self-approval of code; configure roles and review processes in your workspace.
- Traceability: SpecLoom stores artifacts as JSON within your repository to maintain a Git-native workflow and audit trails.
- Deployment: If you need to run in containers or CI, adapt the mcp_config to your environment (e.g., docker or uvx/python runners) as needed.
- Troubleshooting: If MCP routing fails, verify that the host process has network access to the agent endpoints and that the specloom package can be resolved from the npm registry via npx.
- Documentation: Refer to the included docs (docs/ manual, architecture, methodology) for deeper integration guidance and best practices.
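To illustrate the JSON artifacts mentioned above, a traceability record might map a requirement to the design element, code, and tests that realize it. This structure is purely hypothetical: field names such as `requirement_id` and `implemented_by` are assumptions, not SpecLoom's actual schema. Consult the included docs for the real artifact format.

```json
{
  "requirement_id": "REQ-101",
  "summary": "User passwords must be stored hashed",
  "design_ref": "design/auth.md#password-storage",
  "implemented_by": ["src/auth/hash.ts"],
  "verified_by": ["tests/auth/hash.test.ts"],
  "status": "verified"
}
```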
Related MCP Servers
gemini-cli
An open-source AI agent that brings the power of Gemini directly into your terminal.
context7
Context7 MCP Server -- Up-to-date code documentation for LLMs and AI code editors
mcp-spec-driven-development
Spec-Driven Development MCP Server, not just Vibe Coding
pluggedin-app
The Crossroads for AI Data Exchanges. A unified, self-hostable web interface for discovering, configuring, and managing Model Context Protocol (MCP) servers—bringing together AI tools, workspaces, prompts, and logs from multiple MCP sources (Claude, Cursor, etc.) under one roof.
spec-coding
Transform feature ideas into production-ready code through systematic Spec-Driven Development
mcp-arangodb
This is a TypeScript-based MCP server that provides database interaction capabilities through ArangoDB. It implements core database operations and allows seamless integration with ArangoDB through MCP tools. You can use it with the Claude app and with MCP-compatible VS Code extensions such as Cline.