

MCP Prompt Engine

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio vasayxtx-mcp-prompt-engine docker run -i ghcr.io/vasayxtx/mcp-prompt-engine

How to use

The MCP Prompt Engine is a Go-based MCP server that loads prompts written in Go text/template syntax and exposes them to MCP clients. It supports reusable partials, automatic exposure of template arguments for dynamic input, and JSON argument parsing for booleans, numbers, arrays, and objects. After starting the server (via Docker, as shown in Installation), define and test templates in your prompts directory using the CLI. Use the built-in validate and render capabilities to check syntax and preview how prompts will look with given arguments, then connect your MCP client to render prompts for end users.
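To make the template model concrete, here is a minimal sketch of the Go text/template syntax the engine loads, combining a reusable partial with JSON-style arguments. The template text and the argument names (`name`, `items`) are illustrative, not taken from the project's own examples:

```go
package main

import (
	"fmt"
	"strings"
	"text/template"
)

// renderDemo parses a prompt plus a reusable partial and renders them
// with JSON-style arguments (a string and an array), mirroring the kind
// of rendering the engine performs for .tmpl files.
func renderDemo() (string, error) {
	const partial = `{{define "greeting"}}Hello, {{.name}}!{{end}}`
	const prompt = `{{template "greeting" .}} You have {{len .items}} items.`

	tmpl, err := template.New("demo").Parse(partial)
	if err != nil {
		return "", err
	}
	if _, err := tmpl.Parse(prompt); err != nil {
		return "", err
	}

	// Arguments as they might arrive after JSON parsing.
	args := map[string]any{
		"name":  "Alice",
		"items": []any{"a", "b", "c"},
	}
	var b strings.Builder
	if err := tmpl.Execute(&b, args); err != nil {
		return "", err
	}
	return b.String(), nil
}

func main() {
	out, err := renderDemo()
	if err != nil {
		panic(err)
	}
	fmt.Println(out)
}
```

Because arguments are decoded as rich JSON values rather than flat strings, template actions like `len` and `range` work directly on array arguments.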

From the CLI, you can list available prompts, render a chosen template with arguments, and validate templates for syntax errors. The engine automatically injects environment variables as fallbacks for template arguments, and it hot-reloads prompts so changes are picked up without restarting the server. This makes it well suited to local development and to MCP clients such as Claude Code, Gemini CLI, and VS Code Copilot.

How to install

Prerequisites:

  • Docker installed on your machine (or build/run alternatives if you prefer not to use Docker)
  • Optional: Docker Compose if you want to orchestrate multiple services

Installation steps (Docker):

  1. Ensure Docker is running.
  2. Pull and run the pre-built image:
# Pull and run the pre-built image from GHCR
docker run -i --rm \
  -v /path/to/your/prompts:/app/prompts:ro \
  -v /path/to/your/logs:/app/logs \
  ghcr.io/vasayxtx/mcp-prompt-engine
  3. Verify the server starts and exposes its MCP interface. To customize prompts, mount your prompts directory at /app/prompts and your logs directory at /app/logs.

Alternative: Build from source (Go) and run locally

  1. Install Go (1.20+ recommended).
  2. Clone the repository:
git clone https://github.com/vasayxtx/mcp-prompt-engine.git
cd mcp-prompt-engine
  3. Build the binary:
make build
  4. Run the binary (adjust paths as needed):
./mcp-prompt-engine

Additional notes

Tips:

  • Docker usage is the recommended path for quick start; it mirrors the deployment approach used in the README examples.
  • When mounting prompts, ensure your templates use the expected directory structure (prompts/ with .tmpl files and optional partials starting with an underscore).
  • JSON argument parsing is enabled by default; if booleans, numbers, arrays, or objects are arriving as plain strings, make sure your command is not passing --disable-json-args.
  • Use the validate and list features of the CLI to quickly iterate on templates: list for a summary, and validate <template> to catch syntax errors.
  • For production deployments, consider persisting logs to a host directory and setting environment variables for default argument fallbacks if needed.
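Tying the tips above together, a minimal prompts directory might look like this (the file names and the partial name are illustrative assumptions, following only the underscore-prefix convention noted above):

```
prompts/
├── _header.tmpl       # partial (underscore prefix), not exposed as a prompt
└── code_review.tmpl   # exposed to MCP clients as a prompt
```

A prompt such as code_review.tmpl can then include the partial via Go's {{template}} action; consult the project's README for the exact name under which partials are registered.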
