PRD
Flagship Model Context Protocol server for generating Product Requirement Documents (PRDs) from codebase context.
claude mcp add --transport stdio saml1211-prd-mcp-server npx -y prd-creator-mcp \
  --env OPENAI_MODEL="default" \
  --env PRD_TEMPLATE="standard" \
  --env OPENAI_API_KEY="your-openai-api-key" \
  --env ANTHROPIC_MODEL="claude-1.3" \
  --env LOCAL_MODEL_PATH="/path/to/local-model" \
  --env ANTHROPIC_API_KEY="your-anthropic-api-key" \
  --env GOOGLE_GEMINI_API_KEY="your-google-gemini-api-key"
How to use
This MCP server is a specialized tool for creating Product Requirements Documents (PRDs) over the MCP protocol. It generates PRDs from product descriptions, user stories, and requirements, using multiple AI providers, with template-based generation as a fallback. Request a complete PRD with the generate_prd tool, validate the resulting document with validate_prd, and manage templates and provider configurations at runtime through the server's other tools. Supported providers are OpenAI, Google Gemini, Anthropic Claude, and local models; the server falls back gracefully to templates when no provider is available. Integration snippets show how to connect the server to clients such as Claude Desktop, Glama.ai, Cursor, Roo Code, and Cline for seamless MCP-based workflows.
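For illustration, a generate_prd call is an ordinary MCP tools/call request. The envelope below follows the standard MCP JSON-RPC shape; the argument names (productDescription, userStories, requirements, provider) are assumptions for this sketch, so check the server's tool schema (via tools/list) for the exact field names.

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "generate_prd",
    "arguments": {
      "productDescription": "A mobile app for tracking household energy usage",
      "userStories": ["As a homeowner, I want to see daily usage so I can spot waste"],
      "requirements": ["Support at least 12 months of history"],
      "provider": "openai"
    }
  }
}
```

The response carries the generated PRD as tool result content, which you can then pass to validate_prd.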
How to install
Prerequisites:
- Node.js v16 or higher
- npm or yarn
Install from source:
- Clone the repository:
git clone https://github.com/Saml1211/prd-mcp-server.git
cd prd-mcp-server
- Install dependencies:
npm install
- Build the project:
npm run build
- Run locally:
npm start
- For development with hot reload:
npm run dev
Alternative: run directly via npx, as in the quick-start command above (no local install needed):
npx -y prd-creator-mcp
Docker (optional):
docker pull saml1211/prd-creator-mcp
docker run -i --rm saml1211/prd-creator-mcp
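If you prefer Docker Compose, a minimal sketch looks like the following. The service and variable names mirror the quick-start command; whether the image reads all of these variables is an assumption to verify against the repository's .env.example.

```yaml
# docker-compose.yml (sketch; stdio-based MCP servers are usually
# launched per-session by the client, so run with: docker compose run --rm prd-creator)
services:
  prd-creator:
    image: saml1211/prd-creator-mcp
    stdin_open: true        # equivalent of `docker run -i`
    environment:
      OPENAI_API_KEY: "your-openai-api-key"
      ANTHROPIC_API_KEY: "your-anthropic-api-key"
      GOOGLE_GEMINI_API_KEY: "your-google-gemini-api-key"
```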
Additional notes
Tips and notes:
- Provide your API keys and model preferences through a .env file (see .env.example in the repo) or via the update_provider_config MCP tool at runtime. The server merges protocol/tool updates with environment variables, giving precedence to runtime changes.
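A minimal .env might look like the fragment below, using the same variable names as the quick-start command; consult .env.example in the repo for the full, authoritative list.

```shell
# .env — provider credentials and model preferences
OPENAI_API_KEY=your-openai-api-key
OPENAI_MODEL=default
ANTHROPIC_API_KEY=your-anthropic-api-key
ANTHROPIC_MODEL=claude-1.3
GOOGLE_GEMINI_API_KEY=your-google-gemini-api-key
LOCAL_MODEL_PATH=/path/to/local-model
PRD_TEMPLATE=standard
```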
- The default PRD workflow generates PRDs using AI providers, with a template-based fallback when providers are unavailable. You can customize provider options (temperature, maxTokens, etc.) per request.
- If integrating with external MCP clients, use the provided integration snippets for Claude Desktop, Glama.ai, Cursor, Roo Code, and Cline to add your server under a named MCP server entry (e.g., prd-creator).
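For example, a Claude Desktop entry in claude_desktop_config.json typically looks like this (the "prd-creator" name matches the suggestion above; which env vars you pass depends on the providers you use):

```json
{
  "mcpServers": {
    "prd-creator": {
      "command": "npx",
      "args": ["-y", "prd-creator-mcp"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key"
      }
    }
  }
}
```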
- Ensure you have the required environment variables set or provide credentials through the MCP tooling to avoid startup failures.
- For troubleshooting, use the health_check and get_logs tools to diagnose provider availability and review recent activity.
Related MCP Servers
utcp-specification
The specification for the Universal Tool Calling Protocol
outline
It's an MCP server... for Outline (the documentation platform!)
CanvasMCPClient
Canvas MCP Client is an open-source, self-hostable dashboard application built around an infinite, zoomable, and pannable canvas. It provides a unified interface for interacting with multiple MCP (Model Context Protocol) servers through a flexible, widget-based system.
company-docs
AI-powered company knowledge MCP. Unified place for internal policies, values, documentation, and governance. Agents can search, cite, and answer questions using real company docs.
docmole
Dig through any documentation with AI - MCP server for Claude, Cursor, and other AI assistants
obsidian
MCP server for Obsidian vault management - enables Claude and other AI assistants to read, write, search, and organize your notes