MCP Prompt Engine
Register the server with Claude Code:

```sh
claude mcp add --transport stdio vasayxtx-mcp-prompt-engine docker run -i ghcr.io/vasayxtx/mcp-prompt-engine
```
How to use
MCP Prompt Engine is a Go-based MCP server that loads prompts written in Go text/template syntax and exposes them to MCP clients. It supports reusable partials, automatic exposure of template arguments as prompt arguments for dynamic input, and JSON parsing of argument values into booleans, numbers, arrays, and objects. After starting the server (via Docker, as shown under installation), use the CLI and the prompts directory to define and test templates. The built-in validate and render capabilities let you verify syntax and preview how a prompt renders with given arguments before connecting your MCP client to serve responses to end users.
From the CLI, you can list available prompts, render a chosen template with arguments, and validate templates for syntax errors. The engine injects environment variables as fallbacks for missing template arguments and hot-reloads prompts, so changes are picked up without restarting the server. This makes it well suited to local development and to integration with MCP clients such as Claude Code, Gemini CLI, or VS Code Copilot.
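The JSON argument parsing described above can be illustrated in plain Go: string-valued arguments (as an MCP client would send them) are JSON-decoded into rich types and then fed to a standard text/template. This is a minimal sketch of the mechanism, not the engine's actual code; the template and argument names are hypothetical.

```go
package main

import (
	"encoding/json"
	"fmt"
	"strings"
	"text/template"
)

// render decodes string arguments into rich types (boolean, array, ...)
// and executes a Go text/template with them, mimicking how a prompt
// engine could turn client-supplied strings into typed template data.
func render() string {
	const prompt = `Review the {{.language}} code.{{if .strict}} Be strict.{{end}}
Focus areas:{{range .areas}}
- {{.}}{{end}}`

	// Arguments as they might arrive from a client: all strings.
	raw := map[string]string{
		"language": `"Go"`,                 // JSON string
		"strict":   `true`,                 // JSON boolean
		"areas":    `["errors","naming"]`,  // JSON array
	}
	args := make(map[string]any, len(raw))
	for k, v := range raw {
		var parsed any
		if err := json.Unmarshal([]byte(v), &parsed); err != nil {
			parsed = v // not valid JSON: fall back to the raw string
		}
		args[k] = parsed
	}

	var b strings.Builder
	tmpl := template.Must(template.New("review").Parse(prompt))
	if err := tmpl.Execute(&b, args); err != nil {
		panic(err)
	}
	return b.String()
}

func main() {
	fmt.Println(render())
}
```

With JSON parsing disabled (the engine's --disable-json-args mode), `strict` would stay the literal string `"true"`, which Go templates still treat as truthy but which cannot be ranged over or compared numerically; decoding first is what enables real booleans, numbers, arrays, and objects.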
How to install
Prerequisites:
- Docker installed on your machine (or a Go toolchain if you prefer to build from source)
- Optional: Docker Compose if you want to orchestrate multiple services
Installation steps (Docker):
- Ensure Docker is running.
- Pull and run the pre-built image:

```sh
# Pull and run the pre-built image from GHCR
docker run -i --rm \
  -v /path/to/your/prompts:/app/prompts:ro \
  -v /path/to/your/logs:/app/logs \
  ghcr.io/vasayxtx/mcp-prompt-engine
```
- Verify that the server starts and exposes its MCP interface. To customize prompts, mount your prompts directory at /app/prompts and a logs directory at /app/logs.
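For MCP clients configured via a JSON file (for example, Claude Desktop), the same Docker invocation can be registered as a stdio server. The server name and host paths below are placeholders; adjust them to your setup:

```json
{
  "mcpServers": {
    "prompt-engine": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "-v", "/path/to/your/prompts:/app/prompts:ro",
        "ghcr.io/vasayxtx/mcp-prompt-engine"
      ]
    }
  }
}
```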
Alternative: Build from source (Go) and run locally
- Install Go (1.20+ recommended).
- Clone the repository:

```sh
git clone https://github.com/vasayxtx/mcp-prompt-engine.git
cd mcp-prompt-engine
```

- Build the binary:

```sh
make build
```

- Run the binary (adjust paths as needed):

```sh
./mcp-prompt-engine
```
Additional notes
Tips:
- Docker is the recommended path for a quick start; it mirrors the deployment approach used in the README examples.
- When mounting prompts, ensure your templates use the expected directory structure (prompts/ with .tmpl files and optional partials starting with an underscore).
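As a sketch of that structure (file names are hypothetical, and the exact partial-inclusion syntax may differ from this plain Go text/template form; consult the project README):

```
prompts/
├── _header.tmpl       # partial (underscore prefix), not exposed as a prompt
└── code_review.tmpl   # prompt exposed to MCP clients

# code_review.tmpl might look like:
{{template "_header" .}}
Review the following {{.language}} code and focus on {{.focus}}.
```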
- If JSON arguments are not being parsed into rich types (booleans, numbers, arrays, objects), make sure the server is not started with --disable-json-args.
- Use the CLI's list and validate features to iterate quickly on templates: list gives a summary, and validate <template> catches syntax errors.
- For production deployments, consider persisting logs to a host directory and setting environment variables for default argument fallbacks if needed.
Related MCP Servers
weather
A lightweight Model Context Protocol (MCP) server that enables AI assistants like Claude to retrieve and interpret real-time weather data.
sandbox
A Model Context Protocol (MCP) server that enables LLMs to run ANY code safely in isolated Docker containers.
github-brain
An experimental GitHub MCP server with local database.
mcp-tts
MCP Server for Text to Speech
tasker
An MCP server for Android's Tasker automation app.
kai
An MCP Server for Kubernetes