
MCP-Bridge

Middleware that provides an OpenAI-compatible endpoint capable of calling MCP tools

Installation
To add the MCP server to Claude Code, run this command in your terminal:
claude mcp add --transport stdio secretiveshell-mcp-bridge uvx mcp-server-fetch

How to use

MCP-Bridge acts as a mediator between the OpenAI API and MCP (Model Context Protocol) tools. It exposes MCP tools to clients through an OpenAI-compatible REST interface and can optionally stream results through an SSE bridge. You can list the MCP tools available on the server and invoke them through ordinary OpenAI-style chat requests; the bridge resolves MCP tool calls automatically and routes them to the underlying MCP servers. This lets you leverage multiple MCP tools without integrating each one directly into your client or model prompts. The server supports both non-streaming and streaming chat completions that use MCP tools, provides endpoints to manage and fetch tool definitions, and handles authentication via API keys if enabled in the config.
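Because the bridge injects the tool definitions itself, a client request needs no tool schemas at all. A minimal sketch, assuming the bridge listens on localhost:9090 and serves the standard /v1/chat/completions path, and using a hypothetical model name (all assumptions; adjust to your config):

```python
import json
import urllib.request

# Assumption: bridge host/port and path; change to match your deployment.
BRIDGE_URL = "http://localhost:9090/v1/chat/completions"

def build_request(prompt: str, model: str = "local-model") -> dict:
    """Build a plain OpenAI-style chat completion payload.

    Note there is no "tools" key: MCP-Bridge attaches the schemas for every
    configured MCP server before forwarding the request to the inference
    engine, then folds the tool results back into the response.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_request("Fetch https://example.com and summarize it")

# Uncomment to send the request to a running bridge:
# req = urllib.request.Request(
#     BRIDGE_URL,
#     data=json.dumps(payload).encode(),
#     headers={"Content-Type": "application/json"},
# )
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
print(payload["messages"][0]["role"])  # → user
```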

To use it, configure the MCP servers you want to expose (for example, a fetch tool) in config.json, start the service (via docker-compose or a manual uv/uvx run), and send requests to the bridge through the OpenAI-compatible interface. The bridge attaches tool definitions for all configured MCP tools to incoming requests, forwards them to your inference engine, and incorporates the tool results back into the response. You can also connect external clients via the SSE bridge endpoint to receive live tool information and results.
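A config.json sketch for the fetch-tool example might look like the following. The mcp_servers entries follow the command/args shape the README describes; the inference_server key and its fields are assumptions, so check the README's example config for the exact names:

```json
{
  "inference_server": {
    "base_url": "http://localhost:8000/v1",
    "api_key": "dummy"
  },
  "mcp_servers": {
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    }
  }
}
```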

If you are using the Docker deployment, ensure your inference engine (e.g., vLLM) supports tool calls. The docs note that vLLM is compatible and that Open Web UI can interact with MCP-enabled endpoints via this bridge. You can test connectivity against the SSE endpoint (http://yourserver:8000/mcp-server/sse) with a client such as wong2/mcp-cli.

How to install

Prerequisites:

  • Docker and/or Python environment as desired (see options below)
  • Access to the repository containing MCP-Bridge
  • Optional: an inference engine with tool call support (e.g., vLLM)

Option A: Docker (recommended)

  1. Clone the repository:
git clone <repo-url>
cd <repo-directory>
  2. Edit compose.yml to mount or reference config.json as needed (see README for examples). You may provide the config via a file mount, an HTTP URL, or an inline JSON environment variable per the examples in the README.
  3. Start the service:
docker-compose up --build -d
  4. Verify the service is running and accessible at the configured host/port (default 0.0.0.0:9090 or as set in your config).
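The compose.yml edit in step 2 might look like the sketch below. The service name, build context, container path, and port mapping are illustrative, not taken from the repo:

```yaml
services:
  mcp-bridge:
    build: .
    ports:
      - "9090:9090"        # match the host/port in your config.json
    volumes:
      - ./config.json:/mcp_bridge/config.json   # file-mount option
    # The README also shows supplying the config as an HTTP URL or as an
    # inline JSON environment variable; see its examples for the exact keys.
```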

Option B: Manual installation (no Docker)

  1. Clone the repository:
git clone <repo-url>
cd <repo-directory>
  2. Install dependencies (Python environment as per the uv/uvx workflow):
uv sync
  3. Create a config.json file in the repository root (see README for an example).
  4. Run the application:
uv run mcp_bridge/main.py
  5. Access the API at your configured host/port and test tool availability via the provided endpoints.
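To check tool availability, you can query the bridge's tool-listing endpoint. The /mcp/tools path below is an assumption; since the bridge is a FastAPI-style service, its interactive docs (typically at /docs) list the exact routes:

```python
import json
import urllib.request

def tools_url(host: str = "localhost", port: int = 9090) -> str:
    # Assumption: /mcp/tools is the tool-listing route; verify against /docs.
    return f"http://{host}:{port}/mcp/tools"

url = tools_url()

# Uncomment against a running bridge:
# with urllib.request.urlopen(url) as resp:
#     print(json.load(resp))
print(url)  # → http://localhost:9090/mcp/tools
```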

Note: The README shows example JSON for config and demonstrates how to embed config via environment variables in Docker, so adapt steps to your deployment preference.

Additional notes

Tips and caveats:

  • The recommended deployment method is Docker, but a manual Python/uvx setup is supported if you prefer. Ensure your inference engine supports tool calls.
  • You can enable API key authentication in config.json under security.auth, and then pass the Authorization: Bearer <key> header with requests.
  • To add new MCP servers, edit config.json and add entries under mcp_servers with command and args, as shown in the examples (the provided config uses uvx to run the fetch tool).
  • The SSE bridge endpoint is at /mcp-server/sse and can be used by external chat apps that support MCP. Tools can also be tested using mcp-cli over SSE.
  • If you encounter issues with tool discovery, verify that the MCP server definitions are correctly loaded in config.json and that the bridge has network access to the MCP servers.
  • If you plan to use Open Web UI or Claude-like clients, you can rely on the bridge to adapt OpenAI-style requests into MCP tool calls and handle results transparently.
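If API key authentication is enabled under security.auth, each request must carry the bearer header. A minimal sketch, with a hypothetical key and an assumed host/port/path:

```python
import json
import urllib.request

API_KEY = "my-secret-key"  # hypothetical; must match a key in security.auth

headers = {
    "Content-Type": "application/json",
    "Authorization": f"Bearer {API_KEY}",  # required when auth is enabled
}

# Uncomment against a running bridge (host/port/path are assumptions):
# req = urllib.request.Request(
#     "http://localhost:9090/v1/chat/completions",
#     data=json.dumps({
#         "model": "local-model",
#         "messages": [{"role": "user", "content": "hello"}],
#     }).encode(),
#     headers=headers,
# )
# print(urllib.request.urlopen(req).status)
print(headers["Authorization"])  # → Bearer my-secret-key
```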
