mcproc
A Model Context Protocol (MCP) server for comfortable background process management for AI agents.
Register it with Claude Code:
claude mcp add --transport stdio neptaco-mcproc mcproc mcp serve
How to use
mcproc is an MCP server that centralizes and manages long-running background processes for AI agents and developers. It exposes a unified CLI and MCP interface, so agents can start, monitor, and control development servers, build watchers, daemons, and other background tasks while you retain full visibility and control from the terminal. The server provides tools to start, stop, restart, and inspect processes, robust log handling with search and filtering, and integration with version managers for different runtimes. Once registered as an MCP server, AI agents can invoke start_process, stop_process, restart_process, list_processes, get_process_logs, search_process_logs, and get_process_status to orchestrate development workflows.
With mcproc, you get:
- Unified process management across AI agents and CLI users
- Intelligent log capture, persistence, and powerful search with regex and time-based filtering
- Project-aware grouping and real-time log following
- Support for toolchains and version managers (nvm, asdf, mise, rbenv, etc.)
- Convenient commands for starting daemons, cleaning up a project, and inspecting process state
Developers can also monitor processes locally via the CLI, using commands like mcproc logs -f and mcproc ps to keep visibility while AI agents operate in the background.
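A typical local monitoring session might look like the following sketch. The subcommands (ps, logs -f) come from the text above; the process name backend is a hypothetical example, and the exact output format will differ:

```shell
# List processes mcproc is managing for the current project
mcproc ps

# Follow logs for a named process in real time
# ("backend" is a placeholder for whatever name the process was started under)
mcproc logs -f backend
```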
How to install
Prerequisites:
- A supported system (macOS, Linux, or Windows with WSL) with a suitable Rust toolchain installed for building from source.
- The protobuf compiler (protoc), required when building from source to generate protocol bindings.
Install via Homebrew (macOS and Linux):
# Add the tap
brew tap neptaco/tap
# Install mcproc
brew install mcproc
Build from source:
git clone https://github.com/neptaco/mcproc.git
cd mcproc
cargo build --release
# Optional: install to PATH
cargo install --path mcproc
If you installed a prebuilt binary (for example, via Homebrew), you can skip the build steps and run mcproc commands directly.
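After installing by either method, you can sanity-check that the binary resolves on your PATH. The --version flag is an assumption based on common CLI conventions, not confirmed by the text above:

```shell
# Confirm mcproc is on the PATH and prints a version string
# (--version is assumed; mcproc --help also works as a smoke test)
mcproc --version
```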
Additional notes
- mcproc is designed to be run as an MCP server so AI agents can manipulate long-running processes through the MCP interface. Register it with your MCP client as shown in the README example.
- To synchronize on readiness, pass a --wait-for-log pattern when starting a process; mcproc waits until a matching log line appears before reporting the process as ready.
- Environment variables for started processes are supported via the -e/--env flag (KEY=VAL). Use this to customize runtimes per project.
- The tool supports multiple projects and automatic cleanup of child processes when stopping a parent process, as well as a dedicated daemon for background management.
- If you run into permission issues, ensure mcproc has appropriate read/write permissions in the directories used for logs and state (often within XDG base directories).
- When integrating with MCP clients, you can place mcproc into your mcpServers configuration as shown in the README example to enable discoverability and control from AI agents.
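Putting the flags from the notes above together, a start invocation might look like the following sketch. The -e/--env and --wait-for-log flags are taken from the notes; the process name, command string, --cmd flag, and log pattern are hypothetical, so check mcproc start --help for the exact argument layout:

```shell
# Hypothetical invocation: start a named dev server with an env var set,
# waiting until a readiness pattern appears in its logs before returning.
# (--cmd, the name "web", and the pattern are illustrative assumptions.)
mcproc start web --cmd "npm run dev" -e PORT=3000 --wait-for-log "Listening on"
```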
Related MCP Servers
claude-talk-to-figma
A Model Context Protocol (MCP) that allows Claude Desktop and other AI tools (Claude Code, Cursor, Antigravity, etc.) to read, analyze, and modify Figma designs
agentql
Model Context Protocol server that integrates AgentQL's data extraction capabilities.
ollama
An MCP Server for Ollama
cursor-rust-tools
An MCP server that allows the LLM in Cursor to access Rust Analyzer, crate docs, and Cargo commands.
statelessagent
Your AI forgets everything between sessions. SAME fixes that. Local-first, no API keys, single binary.
voice-status-report
A Model Context Protocol (MCP) server that provides voice status updates using OpenAI's text-to-speech API.