claude-prompts
MCP prompt template server: hot-reload, thinking frameworks, quality gates
claude mcp add --transport stdio minipuft-claude-prompts npx -y claude-prompts@latest
How to use
Claude Prompts MCP Server provides hot-reloadable, modular prompt tooling for Claude environments. It lets you manage prompts, chains, gates, and structured thinking patterns (such as CAGEERF and ReACT-style workflows) with MCP-aware resources. The server can be consumed by Claude Desktop, Cursor, Windsurf, Zed, and other MCP clients, and supports custom resources directories via MCP_RESOURCES_PATH. By loading prompts and chains through the MCP server, you can iterate rapidly, test updated prompts in real time, and apply gate reviews to ensure quality before execution. The server also supports chain orchestration with explicit steps and gates between steps, so you can build complex multi-step tasks with robust gating and evaluation behavior.
To use it, configure your MCP client to point at the claude-prompts server (commonly via npx claude-prompts@latest or through your client’s MCP configuration). You can leverage OpenCode hooks, Gemini resources, and Claude Desktop integration to test prompts live and push fixes back into the running MCP server. If you prefer local development, you can run the server from source, adjust prompts and hooks, rebuild, and have the changes immediately reflected in Claude’s workflow via the MCP transport interface.
How to install
Prerequisites:
- Node.js and npm installed on your system
- Basic familiarity with MCP configuration and how to point clients to an MCP server
Installation steps (stable usage):

1. Prerequisites check
- Ensure Node.js (>= 14) and npm are installed
- Optional: internet access for fetching npm packages

2. Run the MCP server via npx
- This pulls the latest claude-prompts package and starts an MCP-compatible server process:

```shell
# Start the Claude Prompts MCP server (no local install required)
npx -y claude-prompts@latest
```

3. (Optional) Local development workflow
- Clone the repository, install dependencies, and build:

```shell
git clone https://github.com/minipuft/claude-prompts.git
cd claude-prompts/server
npm install
npm run build
# Run the built server (example)
node dist/index.js --transport=stdio
```

4. Point your MCP clients to the server
- In your MCP client configuration, specify the server as:

```json
{
  "mcpServers": {
    "claude-prompts": {
      "command": "npx",
      "args": ["-y", "claude-prompts@latest"]
    }
  }
}
```

5. Optional: customize resource paths via MCP_RESOURCES_PATH
- You can provide a custom resources directory (prompts, gates, methodologies, styles) by setting the environment variable MCP_RESOURCES_PATH in your MCP config:

```json
{
  "mcpServers": {
    "claude-prompts": {
      "command": "npx",
      "args": ["-y", "claude-prompts@latest"],
      "env": {
        "MCP_RESOURCES_PATH": "/path/to/your/resources"
      }
    }
  }
}
```
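Because relative paths can resolve against the npm cache when the server is launched via npx, an absolute MCP_RESOURCES_PATH is the safest choice. One way to guarantee that is to generate the config snippet from the shell so the path is expanded before it lands in the file. This is only a sketch: the output filename `mcp-config.json` is an arbitrary example, and you would paste the result into whatever config file your MCP client actually reads.

```shell
# Illustrative only: emit an MCP config with an absolute resources path.
# Your MCP client's real config file location varies per client.
RESOURCES_DIR="$(pwd)/resources"
mkdir -p "$RESOURCES_DIR"
cat > mcp-config.json <<EOF
{
  "mcpServers": {
    "claude-prompts": {
      "command": "npx",
      "args": ["-y", "claude-prompts@latest"],
      "env": { "MCP_RESOURCES_PATH": "$RESOURCES_DIR" }
    }
  }
}
EOF
cat mcp-config.json
```

Because the heredoc delimiter is unquoted, `$RESOURCES_DIR` is expanded when the file is written, so the stored path is always absolute regardless of where npx later runs.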
Additional notes
Tips and considerations:
- MCP_RESOURCES_PATH is recommended for centralized, token-efficient access to prompts, gates, and methodologies. You can also customize MCP_PROMPTS_PATH, MCP_GATES_PATH, and MCP_METHODOLOGIES_PATH for targeted overrides.
- When using npx, paths resolve relative to the npm cache. For stability, prefer absolute paths for MCP_RESOURCES_PATH.
- The server supports multiple transport options (default stdio, streamable-http); specify --transport when running from source if needed.
- Claude Desktop users can leverage an NPX-based configuration or the self-contained .mcpb bundle for easy deployment. Other MCP clients (Cursor, Windsurf, Zed, etc.) can be configured similarly via their mcp.json/MCP config paths.
- Development hooks can automatically fix common issues (e.g., missing chain steps, gating results) and help keep prompts aligned with the intended MCP workflow.
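The resources directory that MCP_RESOURCES_PATH (or the per-category overrides above) points at is a set of per-category folders. A minimal scaffold might look like the following; the folder names follow the categories named in the install section, and the `~/claude-resources` location is an arbitrary example, not a path the server requires:

```shell
# Scaffold a resources tree for MCP_RESOURCES_PATH (or the per-category
# overrides MCP_PROMPTS_PATH, MCP_GATES_PATH, MCP_METHODOLOGIES_PATH).
RESOURCES="$HOME/claude-resources"   # prefer an absolute path when using npx
mkdir -p "$RESOURCES/prompts" "$RESOURCES/gates" \
         "$RESOURCES/methodologies" "$RESOURCES/styles"
ls "$RESOURCES"
```

With the tree in place, point MCP_RESOURCES_PATH at `$RESOURCES` for centralized access, or use the individual override variables to redirect a single category.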
Related MCP Servers
neurolink
Universal AI Development Platform with MCP server integration, multi-provider support, and a professional CLI. Build, test, and deploy AI applications with multiple AI providers.
mobile
A Model Context Protocol (MCP) server that provides mobile automation capabilities.
mcp-agent
Lightweight, focused utilities to manage connections and execute MCP tools with minimal integration effort. Use it to directly call tools or build simple agents within your current architecture.
memory
An MCP (Model Context Protocol) server providing long-term memory for LLMs.
obsidian
MCP server for git-backed Obsidian vaults. Access and manage notes through Claude, ChatGPT, and other LLMs with automatic git sync. Supports local (stdio/HTTP) and remote (AWS Lambda) deployment.
Email MCP server with full IMAP + SMTP support — read, search, send, manage, and organize email from any AI assistant via the Model Context Protocol