prompt-decorators
A standardized framework for enhancing how LLMs process and respond to prompts through composable decorators, featuring an official open standard specification and Python reference implementation with MCP server integration.
claude mcp add --transport stdio synaptiai-prompt-decorators uvx prompt-decorators
How to use
Prompt Decorators provides a Python framework and MCP server for applying composable prompt decorators to LLM interactions. The MCP server exposes decorator discovery, parameter validation, and runtime application, so MCP-enabled clients such as Claude Desktop can request decorated prompts and receive adjusted outputs. Instead of writing lengthy, model-specific instructions, you compose prompts with concise, declarative annotations, drawing on the built-in decorator registry, dynamic loading from definition files, and automatic documentation generation. This makes prompt engineering more modular, reusable, and portable across models and platforms. To use the server, connect your MCP client to it and include the desired decorators (and their parameters) with the prompt; the server transforms the prompt before it is sent to the LLM, and the model's response reflects the decorator-induced transformations.
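The Prompt Decorators specification uses a `+++Name(param=value)` annotation syntax. As a rough illustration of the idea only (a minimal sketch in plain Python, not the library's actual implementation or registry logic), a decorator prefix can be parsed off the prompt and rendered into a plain-language instruction:

```python
import re

# Minimal sketch of parsing +++Decorator(param=value) annotations.
# The transformation below is illustrative only; the real package
# resolves decorators against its registry of definitions.
DECORATOR_RE = re.compile(r"^\+\+\+(\w+)(?:\(([^)]*)\))?\s*\n")

def parse_decorators(prompt: str):
    """Split leading +++Decorator annotations from the prompt body."""
    decorators = []
    rest = prompt
    while (m := DECORATOR_RE.match(rest)):
        raw = m.group(2)
        params = dict(
            p.split("=", 1) for p in raw.split(",") if "=" in p
        ) if raw else {}
        decorators.append((m.group(1), params))
        rest = rest[m.end():]
    return decorators, rest

def apply_decorators(prompt: str) -> str:
    """Replace annotations with plain-language instructions (toy transform)."""
    decorators, body = parse_decorators(prompt)
    instructions = [
        f"Apply {name} behavior with {params}." if params
        else f"Apply {name} behavior."
        for name, params in decorators
    ]
    return "\n".join(instructions + [body])

decorated = "+++Reasoning(depth=comprehensive)\nExplain how DNS resolution works."
print(apply_decorators(decorated))
```

The point of the annotation style is that the same short prefix works across models; the server, not the user, owns the mapping from annotation to model-specific instruction text.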
How to install
Prerequisites:
- Python 3.11 or newer
- Internet access to install packages
Step 1: Set up a Python virtual environment (optional but recommended)
- python3 -m venv env
- source env/bin/activate (Linux/macOS) or .\env\Scripts\activate (Windows)
Step 2: Install the Prompt Decorators package from PyPI
- python -m pip install --upgrade pip
- pip install prompt-decorators
Step 3: Run the MCP server (via uvx, the tool runner from the uv package manager)
- uvx prompt-decorators
Step 4: Verify the server is running
- When launched with uvx as above, the server communicates over stdio rather than a network port, so there is no HTTP endpoint to probe. Instead, register it with an MCP client (see the claude mcp add command above) and confirm that the decorator tools are listed there.
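As an alternative to the claude mcp add command shown at the top of this page, MCP clients that read a JSON configuration file (such as Claude Desktop's claude_desktop_config.json) can launch the server over stdio with an entry along these lines (the server name "prompt-decorators" is arbitrary; adjust for your environment):

```json
{
  "mcpServers": {
    "prompt-decorators": {
      "command": "uvx",
      "args": ["prompt-decorators"]
    }
  }
}
```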
Notes:
- If the server supports running over a network transport in addition to stdio, consult the package documentation for how to configure host and port (for example via environment variables).
- Consider installing additional runtime dependencies if you plan to load custom decorator definitions from files.
Additional notes
Tips and common considerations:
- The MCP server relies on a registry of decorators; you can add or override decorators by providing decorator definition files and enabling dynamic loading.
- Ensure your environment runs Python 3.11 or newer, per the project's stated requirements.
- When deploying, consider containerizing the server (e.g., Docker) if you need isolated environments or multi-service orchestration.
- If you encounter import or runtime errors, check that the Python environment has access to all dependencies and that decorator definition files are valid JSON/YAML as required by the registry.
- Environment variables may be used to configure logging levels and other server settings; consult the project documentation for the supported variables and their defaults.
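The dynamic-loading idea in the notes above can be sketched as follows. This is a conceptual illustration using an invented JSON shape: the field names "name" and "template" are assumptions for the sketch, not the actual registry schema used by prompt-decorators.

```python
import json
from pathlib import Path
from string import Template

# Conceptual sketch of loading decorator definitions from a JSON file
# and applying one by name. Field names ("name", "template") are
# illustrative assumptions, not the package's real schema.

def load_registry(path: Path) -> dict[str, Template]:
    """Read decorator definitions and index them by name."""
    definitions = json.loads(path.read_text())
    return {d["name"]: Template(d["template"]) for d in definitions}

def decorate(registry: dict[str, Template], name: str, prompt: str, **params) -> str:
    """Render a decorator's instruction template and prepend it to the prompt."""
    instruction = registry[name].substitute(params)
    return f"{instruction}\n{prompt}"

# Example definition file content:
# [{"name": "StepByStep", "template": "Think step by step, with $depth detail."}]
```

Validating definition files at load time (for example, rejecting entries with missing fields) is what surfaces the "invalid JSON/YAML" errors mentioned above before they become runtime failures.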
Related MCP Servers
jupyter
🪐 🔧 Model Context Protocol (MCP) Server for Jupyter.
falcon
Connect AI agents to CrowdStrike Falcon for automated security analysis and threat hunting
beemcp
BeeMCP: an unofficial Model Context Protocol (MCP) server that connects your Bee wearable lifelogger to AI via the Model Context Protocol
MCP-MultiServer-Interoperable-Agent2Agent-LangGraph-AI-System
This project demonstrates a decoupled real-time agent architecture that connects LangGraph agents to remote tools served by custom MCP (Modular Command Protocol) servers. The architecture enables a flexible and scalable multi-agent system where each tool can be hosted independently (via SSE or STDIO), offering modularity and cloud-deployable execut
Helios
An AI IDE secure coding MCP service
mcp-raganything
API/MCP wrapper for RagAnything