backlog
Helps coding agents and developers keep track of a project's backlog by storing tasks as Markdown files in Git.
claude mcp add --transport stdio veggiemonk-backlog \
  --env BACKLOG_REPO="path/to/your/git-backed-backlog-repo" \
  -- docker run -i ghcr.io/veggiemonk/backlog:latest mcp
(or mount the repository as a volume into the container)
How to use
Backlog exposes an MCP server that lets AI agents interact with a Markdown-based, Git-backed task backlog through a standardized tool interface. The server stores all tasks as readable Markdown files inside a Git repository, providing an offline-first workflow and zero-configuration setup. Agents can use the MCP prompts to create, query, and update tasks, navigate hierarchical task structures, and leverage AI-friendly commands that integrate with the backlog CLI. The included instructions command and the standard MCP pipeline enable agents to plan, generate tasks, and synchronize progress directly in the repository.
To use the server via MCP, run backlog as a container or a native binary as described under your chosen installation option, then connect your MCP-enabled agent to the backlog MCP endpoint. The agent can issue commands to list tasks, create new tasks with metadata, set relationships between tasks, and search tasks using the --query flag. The design focuses on AI-friendly task management, allowing agents to reason about tasks, update progress, and maintain consistency through Git commits.
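The storage model described above, one task per Markdown file versioned in Git, can be sketched by hand. The tasks/ layout, file name, and fields below are illustrative only, not backlog's actual schema:

```shell
set -e
repo=$(mktemp -d)                       # throwaway stand-in for a backlog repository
cd "$repo"
git init -q
git config user.name "demo"
git config user.email "demo@example.com"

# A task is just a human-readable Markdown file tracked in git.
mkdir -p tasks
cat > tasks/T-001-setup-ci.md <<'EOF'
# T-001: Set up CI
status: todo
EOF

git add tasks
git commit -q -m "task: add T-001"
git log --oneline                       # the task's full history lives in git
```

Because every change is an ordinary commit, humans and agents alike can audit, diff, and revert task updates with standard Git tooling.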
How to install
Prerequisites:
- Git (installed and accessible in your environment)
- Go (for building backlog locally) or Docker (recommended for MCP deployment)
Installation options:
Option A: Build from source (Go)
- git clone https://github.com/veggiemonk/backlog
- cd backlog
- go build .
- ./backlog mcp
Option B: Install via Go (binary)
- go install github.com/veggiemonk/backlog@latest
- backlog mcp
Option C: Use Docker (recommended for MCP usage)
- docker pull ghcr.io/veggiemonk/backlog:latest
- docker run -i --rm -v /path/to/your/git-backed-repo:/repo -w /repo ghcr.io/veggiemonk/backlog:latest mcp (use -i without -t: allocating a TTY interferes with the stdio MCP transport)
Notes:
- For Docker, mount your backlog Git repository into the container so all tasks are stored in Git.
- Ensure you have a repository initialized to back backlog files and that Git is configured (user name/email).
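A minimal readiness check for the notes above, verifying the directory is an initialized Git work tree with a commit identity set. A temporary directory stands in for your repository, and the name/email values are placeholders:

```shell
repo=$(mktemp -d)                       # stand-in for your backlog repository
cd "$repo"
git init -q
git config user.name  "Your Name"       # identity used for backlog's commits
git config user.email "you@example.com"

git rev-parse --is-inside-work-tree     # prints "true" once initialized
git config user.email                   # confirms the configured identity
```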
Additional notes
- Backlog stores tasks as Markdown files inside a Git repository, enabling full human-readable auditing and easy AI prompting for planning and updates.
- When using MCP, you can leverage the backlog CLI to perform operations outside of MCP as well, providing a flexible workflow for both humans and AI agents.
- If you encounter conflicts in task IDs or Markdown formatting, rely on backlog's conflict resolution features and keep tasks grouped under consistent headings for maintainability.
- Environment variable placeholders can be used to configure repository paths, Git credentials, or agent-specific prompts when running inside containers.
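As a sketch of the environment-variable approach, assuming BACKLOG_REPO (shown in the install command above) plus Git's standard identity variables; all values here are placeholders:

```shell
export BACKLOG_REPO="/repo"                  # path as seen inside the container (via the -v mount)
export GIT_AUTHOR_NAME="Backlog Agent"       # standard Git identity variables, so commits
export GIT_AUTHOR_EMAIL="agent@example.com"  # made in-container are properly attributed
export GIT_COMMITTER_NAME="$GIT_AUTHOR_NAME"
export GIT_COMMITTER_EMAIL="$GIT_AUTHOR_EMAIL"
echo "$BACKLOG_REPO"
```

These can then be forwarded into the container by name, e.g. docker run -e BACKLOG_REPO -e GIT_AUTHOR_NAME -e GIT_AUTHOR_EMAIL ... so that host configuration and container configuration stay in sync.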