gistpad
An MCP server for managing your personal knowledge, daily notes, and reusable prompts via GitHub Gists
claude mcp add --transport stdio lostintangent-gistpad-mcp --env GITHUB_TOKEN="<YOUR_GITHUB_TOKEN_WITH_GIST_SCOPE>" -- npx -y gistpad-mcp
How to use
GistPad MCP provides a server that integrates your GitHub Gists into MCP-enabled clients. It exposes a suite of tools to manage gists, their files, comments, and associated resources, along with optional prompts and daily notes. With this MCP, you can list, create, update, or delete gists, modify gist files and descriptions, comment on gists, and fetch summaries or specific content. It also exposes your gists as resources via the gist:/// URI scheme, enabling clients to interact with gists without executing tools. If you enable flags like --archived, --starred, or --daily, your resources will include archived or starred gists and daily notes, respectively, and the CLI will provide corresponding tools.
To use it, configure the MCP client to point at the gistpad-mcp CLI (as shown in the example config). Once running, you can issue commands to list gists, get contents, create or update files, add comments, and manage prompts. For example, you can ask your MCP client to list all gists, create a new gist with a description and initial files, or refresh the gist list after external edits so your client stays in sync.
How to install
Prerequisites:
- Node.js and npm installed on your system (Node.js 14+ recommended).
- A GitHub personal access token with the gist scope, stored as an environment variable (GITHUB_TOKEN) when running the server.
Install or verify Node.js and npm:
- Visit https://nodejs.org and install the recommended LTS version.
- Verify with: node -v and npm -v
Run the MCP server using npx (no global install required):
- Create a config.json (or the MCP config file your environment expects) containing:

  {
    "mcpServers": {
      "gistpad": {
        "command": "npx",
        "args": ["-y", "gistpad-mcp"],
        "env": { "GITHUB_TOKEN": "<YOUR_PAT>" }
      }
    }
  }
- Start the server with the appropriate runner in your environment (the npx command will fetch the gistpad-mcp package and start the MCP server).
Alternative: install globally and run directly
- npm install -g gistpad-mcp
- Run the gistpad-mcp command directly, with GITHUB_TOKEN (and any optional flags) set as in the npx configuration above.
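The global-install route can be sketched as follows; the token value is a placeholder, and a POSIX shell is assumed:

```shell
# Install the gistpad-mcp CLI globally (one-time setup)
npm install -g gistpad-mcp

# Run the server with the token in the environment;
# <YOUR_PAT> is a placeholder for a token with the gist scope
GITHUB_TOKEN="<YOUR_PAT>" gistpad-mcp
```

In practice your MCP client launches this command for you, so the environment variable usually lives in the client's config file rather than your shell profile.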
Verify operation
- Ensure the server starts and that its tools and resources appear in your MCP client.
- If you modify gists outside of MCP, use the refresh_gists tool to refresh the in-client cache.
Additional notes
Tips and common considerations:
- Environment variable: GITHUB_TOKEN must have gist scope for full access; never expose it in client-side code.
- The server caches gist lists and refreshes on changes or roughly hourly; use the refresh_gists tool if external edits occur and you don't see updates yet.
- You can enable optional capabilities by including flags in your CLI invocation (e.g., --archived, --starred, --daily, --prompts, --markdown) to expose additional tools and resources.
- Resources are exposed via gist:/// URIs, and you can subscribe to changes if your MCP client supports resource subscriptions.
- Prompts are stored in a gist called 💬 Prompts and can include placeholders like {{argument}} for dynamic input.
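As a sketch, the optional flags go into the args array of the config shown earlier; the particular combination below is illustrative, not required:

```json
{
  "mcpServers": {
    "gistpad": {
      "command": "npx",
      "args": ["-y", "gistpad-mcp", "--starred", "--daily"],
      "env": { "GITHUB_TOKEN": "<YOUR_PAT>" }
    }
  }
}
```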
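A file in the 💬 Prompts gist might look like the following; the filename and argument names here are made up for illustration:

```markdown
<!-- summarize.md — a file in the 💬 Prompts gist -->
Summarize the following text in three bullet points,
keeping the tone {{tone}}:

{{input}}
```

When the client invokes the prompt, each {{argument}} placeholder is filled in with a value supplied at call time.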
Related MCP Servers
context7
Context7 MCP Server -- Up-to-date code documentation for LLMs and AI code editors
obsidian-tools
Add Obsidian integrations like semantic search and custom Templater prompts to Claude or any MCP client.
MiniMax-JS
Official MiniMax Model Context Protocol (MCP) JavaScript implementation that provides seamless integration with MiniMax's powerful AI capabilities including image generation, video generation, text-to-speech, and voice cloning APIs.
mcp-bundler
Is the MCP configuration too complicated? You can easily share your own simplified setup!
akyn-sdk
Turn any data source into an MCP server in 5 minutes. Build AI-agents-ready knowledge bases.
promptboard
The Shared Whiteboard for Your AI Agents via MCP. Paste screenshots, mark them up, and share with AI.