ntfy
An MCP (Model Context Protocol) server designed to interact with the ntfy push notification service. It enables LLMs and AI agents to send notifications to your devices with extensive customization options.
claude mcp add --transport stdio cyanheads-ntfy-mcp-server node /path/to/ntfy-mcp-server/dist/index.js \
  --env NODE_ENV="production" \
  --env LOG_LEVEL="info" \
  --env NTFY_BASE_URL="https://ntfy.sh" \
  --env NTFY_DEFAULT_TOPIC="your_default_topic"
How to use
This MCP server provides a bridge between LLM agents and the ntfy push notification service. By implementing the MCP protocol, it exposes a tool named send_ntfy that allows clients to publish notifications to ntfy topics with rich formatting and features. The server is built with TypeScript and leverages the @modelcontextprotocol/sdk for MCP compatibility, while integrating with ntfy to deliver messages to devices or apps subscribed to a topic. You can configure a default topic and logging behavior via environment variables to control how notifications are routed and logged. After connecting an MCP client (such as Claude Desktop, Cline, or another MCP client), you can invoke send_ntfy to craft messages that include priority levels, emoji tags, clickable actions, file attachments, delayed delivery, and Markdown content. The ntfy integration makes it easy to push timely alerts and status updates to multiple devices through ntfy's pub-sub model.
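As an illustration, a client call to send_ntfy might carry arguments shaped roughly like the sketch below. The field names (topic, message, title, priority, tags, delay, markdown) are assumptions inferred from the feature list above, not the server's actual schema; check the tool's published input schema from your MCP client.

```typescript
// Hypothetical argument shape for the send_ntfy tool. Field names are
// illustrative assumptions based on the features described above,
// not the server's actual schema.
interface SendNtfyArgs {
  topic: string;      // target ntfy topic (server may fall back to NTFY_DEFAULT_TOPIC)
  message: string;    // notification body
  title?: string;     // optional notification title
  priority?: number;  // 1 (min) to 5 (max)
  tags?: string[];    // emoji tags, e.g. ["warning", "rocket"]
  delay?: string;     // delayed delivery, e.g. "30m"
  markdown?: boolean; // render the message as Markdown
}

// Small helper that rejects out-of-range priorities before making the call.
function buildSendNtfyArgs(args: SendNtfyArgs): SendNtfyArgs {
  if (args.priority !== undefined && (args.priority < 1 || args.priority > 5)) {
    throw new Error(`priority must be between 1 and 5, got ${args.priority}`);
  }
  return args;
}

const call = buildSendNtfyArgs({
  topic: "server-alerts",
  message: "Deployment **finished** successfully",
  title: "Deploy status",
  priority: 4,
  tags: ["white_check_mark"],
  markdown: true,
});
```

Validating locally (as buildSendNtfyArgs does for the 1-5 priority range) gives clearer errors than a round-trip failure from the server.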
How to install
Prerequisites:
- Node.js v16+ installed on the host
- npm or yarn
- Git (optional for cloning the repository)
Option A: Install via npm (recommended for distribution)
- Install globally: npm install -g ntfy-mcp-server
- Start the server (example): ntfy-mcp-server
Option B: Install from source
- Clone the repository: git clone https://github.com/cyanheads/ntfy-mcp-server.git
- Change into the project directory: cd ntfy-mcp-server
- Install dependencies: npm install
- Build the project (compiles the TypeScript sources into dist/): npm run build
- Run the server (example): npm start
Configuration notes:
- Create a .env file based on the provided example and adjust NTFY_BASE_URL, NTFY_DEFAULT_TOPIC, LOG_LEVEL, and NODE_ENV as needed.
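A minimal .env might look like the fragment below. The variable names are the ones referenced above; the values are illustrative placeholders, assuming the public ntfy.sh instance.

```shell
# .env — illustrative values; adjust for your deployment
NODE_ENV=production
LOG_LEVEL=info
NTFY_BASE_URL=https://ntfy.sh
NTFY_DEFAULT_TOPIC=your_default_topic
```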
Additional notes
- The server exposes a default ntfy topic via NTFY_DEFAULT_TOPIC; set this in your .env to ensure notifications have a target if one isn’t specified by the client.
- Adjust LOG_LEVEL to control verbosity (debug, info, warn, error) for troubleshooting.
- If running from source, ensure the dist/index.js path in the MCP configuration points to the compiled entry file after building.
- The ntfy integration supports message prioritization (1-5), emoji tags, clickable actions and buttons, file attachments, delayed delivery, and Markdown formatting. Validate your requests against MCP client capabilities to ensure proper formatting.
- For production deployments, consider hosting ntfy-mcp-server behind a process manager (e.g., systemd, PM2) and enabling appropriate security filters.
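For context on what the server does on your behalf: publishing to ntfy is a plain HTTP POST, with the features above expressed as request headers. The sketch below builds those headers (Title, Priority, Tags, X-Delay, Click, Markdown follow ntfy's documented HTTP API); it is not the server's actual implementation, just a minimal illustration.

```typescript
// Illustrative sketch of a raw ntfy publish; not the server's actual code.
// Header names follow ntfy's HTTP publishing API.
interface NtfyOptions {
  title?: string;
  priority?: number;  // 1-5
  tags?: string[];    // comma-joined into the Tags header
  delay?: string;     // e.g. "30m"
  click?: string;     // URL opened when the notification is tapped
  markdown?: boolean;
}

function ntfyHeaders(opts: NtfyOptions): Record<string, string> {
  const h: Record<string, string> = {};
  if (opts.title) h["Title"] = opts.title;
  if (opts.priority) h["Priority"] = String(opts.priority);
  if (opts.tags && opts.tags.length > 0) h["Tags"] = opts.tags.join(",");
  if (opts.delay) h["X-Delay"] = opts.delay;
  if (opts.click) h["Click"] = opts.click;
  if (opts.markdown) h["Markdown"] = "yes";
  return h;
}

// Publishing is then a single POST with the message as the body, e.g.:
// await fetch(`${baseUrl}/${topic}`, { method: "POST", body: message, headers: ntfyHeaders(opts) });

const headers = ntfyHeaders({ title: "Disk alert", priority: 5, tags: ["warning", "computer"] });
```

The send_ntfy tool wraps this kind of request so the LLM client never has to construct headers itself.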
Related MCP Servers
obsidian
Obsidian Knowledge-Management MCP (Model Context Protocol) server that enables AI agents and development tools to interact with an Obsidian vault. It provides a comprehensive suite of tools for reading, writing, searching, and managing notes, tags, and frontmatter, acting as a bridge to the Obsidian Local REST API plugin.
Remote
A type-safe solution for remote MCP communication, enabling effortless integration and centralized management of Model Context Protocol servers.
mcp-streamable-http
Example implementation of MCP Streamable HTTP client/server in Python and TypeScript.
mcp-ts-template
TypeScript template for building Model Context Protocol (MCP) servers. Ships with declarative tools/resources, pluggable auth, multi-backend storage, OpenTelemetry observability, and first-class support for both local and edge (Cloudflare Workers) runtimes.
Matryoshka
MCP server for token-efficient large document analysis via the use of REPL state
perplexity
A Perplexity API MCP server that unlocks Perplexity's search-augmented AI capabilities for LLM agents. Features robust error handling, secure input validation, and transparent reasoning with the showThinking parameter.