mcp-server-runner
A WebSocket server implementation for running Model Context Protocol (MCP) servers. This application enables MCP servers to be accessed via WebSocket connections, facilitating integration with web applications and other network-enabled clients.
claude mcp add --transport stdio yonaka15-mcp-server-runner cargo run \
  --env ARGS="optional-args-for-mcp-server" \
  --env HOST="0.0.0.0" \
  --env PORT="8080" \
  --env PROGRAM="path/to/mcp-server-executable" \
  --env CONFIG_FILE=""
How to use
MCP Server Runner is a WebSocket bridge that lets clients connect to an MCP server implementation through a WebSocket interface. It launches the MCP server process, forwards messages between the WebSocket client and the MCP server, and handles lifecycle events such as startup, errors, and shutdown. The runner supports a single connected client at a time and passes through standard error output from the MCP server for visibility in logs. To use it, configure which MCP server executable to run (via environment variables or a config file) and start the runner. Clients connect over WebSocket (for example ws://<host>:<port>) and exchange MCP messages as defined by the Model Context Protocol, enabling web apps to interact with MCP servers without embedding the server directly in the frontend.
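Once connected, a client speaks ordinary MCP JSON-RPC over the WebSocket, starting with an `initialize` request. As a minimal sketch, the helper below builds such a message; the field names follow the MCP specification, while the `protocolVersion` string and `clientInfo` values are placeholders, not values mandated by this runner:

```python
import json

def make_initialize_request(request_id: int = 1) -> str:
    """Build the JSON-RPC 2.0 `initialize` request an MCP client sends
    as its first message after the WebSocket connection opens.
    protocolVersion and clientInfo here are illustrative placeholders."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            "protocolVersion": "2024-11-05",
            "capabilities": {},
            "clientInfo": {"name": "example-client", "version": "0.1.0"},
        },
    })

# The resulting string is sent as a single WebSocket text frame;
# the runner forwards it verbatim to the MCP server's stdin.
msg = make_initialize_request()
print(msg)
```

The runner itself does not interpret these messages; it only relays frames between the client and the MCP server process.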
How to install
Prerequisites:
- Rust toolchain (Rust 1.70 or higher) with cargo installed
- An MCP server executable to launch from the runner (or a path to build one)
Installation steps:
- Clone the repository:
  git clone <repository-url>
  cd mcp-server-runner
- Build the runner (Rust/Cargo):
  cargo build --release
  The compiled binary will be in target/release/
- Prepare a sample MCP server executable path. If you don’t have one yet, build or place your MCP server binary somewhere accessible and note its path.
- Run with environment configuration (example):
  export PROGRAM=/path/to/mcp-server-executable
  export ARGS="--option1 value1 --option2 value2"
  export HOST=0.0.0.0
  export PORT=8080
  cargo run --release
- Alternative: provide a JSON config file and point the runner at it via the CONFIG_FILE environment variable (as described in the README):
  CONFIG_FILE=config.json cargo run --release
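The README does not reproduce the config file schema here, so the following is only a hypothetical sketch mirroring the environment variables (PROGRAM, ARGS, HOST, PORT); the actual field names may differ, so check the repository's example config before relying on this:

```json
{
  "program": "/path/to/mcp-server-executable",
  "args": ["--option1", "value1", "--option2", "value2"],
  "host": "0.0.0.0",
  "port": 8080
}
```

When both a config file and environment variables are available, verify which one the runner gives precedence to before deploying.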
Additional notes
- The runner currently supports a single WebSocket client connection at a time; plan for multi-client support if needed.
- If you’re exposing the WebSocket server publicly, consider placing a reverse proxy (e.g., Nginx) with TLS termination in front of port 8080.
- Any environment variables prefixed with PIN or SECRET should be handled carefully; avoid logging sensitive values.
- The MCP server executable’s stdout/stderr will be logged by the runner; use this for debugging MCP-side issues.
- If you rely on a configuration file, ensure CONFIG_FILE points to valid JSON with the expected structure; with environment-based configuration, make sure the MCP server binary specified via PROGRAM/ARGS can actually be found and executed.
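The reverse-proxy tip above can be sketched as an Nginx server block; this is a minimal example assuming the runner listens on localhost:8080, and the hostname and certificate paths are placeholders you must replace:

```nginx
# Terminate TLS and proxy WebSocket traffic to the runner on 127.0.0.1:8080.
# server_name and certificate paths are placeholders.
server {
    listen 443 ssl;
    server_name mcp.example.com;

    ssl_certificate     /etc/ssl/certs/mcp.example.com.pem;
    ssl_certificate_key /etc/ssl/private/mcp.example.com.key;

    location / {
        proxy_pass http://127.0.0.1:8080;
        # Required for the WebSocket upgrade handshake:
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
        proxy_read_timeout 300s;  # keep long-lived connections open
    }
}
```

Clients would then connect to wss://mcp.example.com/ instead of exposing port 8080 directly.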
Related MCP Servers
goose
An open source, extensible AI agent that goes beyond code suggestions - install, execute, edit, and test with any LLM
cunzhi
Say goodbye to the frustration of AI terminating early; helps AI stay persistent
probe
AI-friendly semantic code search engine for large codebases. Combines ripgrep speed with tree-sitter AST parsing. Powers AI coding assistants with precise, context-aware code understanding.
mcp-center
A centralized platform for managing and connecting MCP servers. MCP Center provides a high-performance proxy service that enables seamless communication between MCP clients and multiple MCP servers.
backlog-rust
MCP server for Backlog, project management service.
perplexity-web-api
🔍 Perplexity AI MCP without API key