nchan-mcp-transport
The best way to deploy an MCP server: a high-performance WebSocket/SSE transport layer & gateway for Anthropic's MCP (Model Context Protocol), powered by Nginx, Nchan, and FastAPI.
```sh
claude mcp add --transport stdio conechoai-nchan-mcp-transport \
  docker compose up -d \
  --env DOCKER_COMPOSE_FILE="docker-compose.yml"
```
How to use
Nchan MCP Transport provides a real-time API gateway that lets MCP clients (such as Claude) communicate with your tools over WebSocket or Server-Sent Events (SSE). It is built on Nginx + Nchan for high-performance pub/sub and uses FastAPI for backend logic and OpenAPI tooling. This setup allows you to register tools with Python decorators, expose an OpenAPI service to Claude, and deploy GPTs Actions to MCP servers with a CLI.

To use it, start the Docker deployment as described in the installation steps, then define your tools with the provided @server.tool() decorator in Python. You can expose an OpenAPI service so Claude can discover and call your endpoints, and optionally deploy GPTs Actions to MCP using the httmcp CLI for rapid integration.

The gateway transparently handles WebSocket and SSE connections, providing live progress updates and real-time event delivery to Claude clients.
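The decorator-based tool registration described above follows a common Python pattern. The sketch below is a generic illustration of that pattern only; the `ToolRegistry` class and its method names are hypothetical stand-ins, not httmcp's actual API:

```python
import asyncio
from typing import Awaitable, Callable, Dict

class ToolRegistry:
    """Hypothetical sketch of decorator-based tool registration,
    illustrating the pattern behind @server.tool()."""

    def __init__(self) -> None:
        self._tools: Dict[str, Callable[..., Awaitable[str]]] = {}

    def tool(self):
        # Return a decorator that records the coroutine under its name.
        def register(fn: Callable[..., Awaitable[str]]):
            self._tools[fn.__name__] = fn
            return fn
        return register

    async def call(self, name: str, **kwargs) -> str:
        # Dispatch a registered tool by name, as the gateway would
        # when an MCP client invokes it.
        return await self._tools[name](**kwargs)

server = ToolRegistry()

@server.tool()
async def echo(text: str) -> str:
    return f"echo: {text}"

print(asyncio.run(server.call("echo", text="hi")))  # echo: hi
```

In the real deployment, registered tools are exposed to MCP clients over the Nchan-backed WebSocket/SSE channels rather than called directly.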
How to install
Prerequisites:
- Docker and Docker Compose installed
- Python 3.9+ (for local development if you choose to run outside Docker)
- Optional: access to an OpenAPI spec for automatic tool exposure
Installation steps:
- Clone the repository (or prepare your deployment directory):

  ```sh
  git clone https://github.com/yourusername/nchan-mcp-transport.git
  cd nchan-mcp-transport
  ```

- Install the Python SDK (if you plan to run locally):

  ```sh
  python -m pip install httmcp
  ```

- Build and run with Docker Compose (as per Quickstart):

  ```sh
  docker-compose up -d
  ```

- Optional: run a quick test by starting a minimal MCP server in Python and registering a tool:

  ```python
  from httmcp import HTTMCPServer

  server = HTTMCPServer()

  @server.tool()
  async def ping() -> str:
      return "pong"
  ```

  Run the server according to your environment (note: this step is for local development and testing).

- If you have an OpenAPI spec, you can expose it via the FastAPI app as shown in the README examples, then connect Claude to your OpenAPI-powered MCP gateway.
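To illustrate the idea behind the OpenAPI step above, the sketch below derives tool-like definitions from a fragment of an OpenAPI document. It is a simplified stand-in for the bridge, not httmcp's actual implementation, and the example spec is invented:

```python
from typing import Dict, List

def tools_from_openapi(spec: Dict) -> List[Dict]:
    """Derive simple tool definitions (name, description, params)
    from the paths of an OpenAPI document."""
    tools = []
    for path, ops in spec.get("paths", {}).items():
        for method, op in ops.items():
            tools.append({
                "name": op.get("operationId", f"{method}_{path}"),
                "description": op.get("summary", ""),
                "params": [p["name"] for p in op.get("parameters", [])],
            })
    return tools

# Minimal example spec (hypothetical endpoint for illustration only).
spec = {
    "openapi": "3.0.0",
    "paths": {
        "/weather": {
            "get": {
                "operationId": "get_weather",
                "summary": "Current weather for a city",
                "parameters": [{"name": "city", "in": "query"}],
            }
        }
    },
}

print(tools_from_openapi(spec))
```

The real bridge does considerably more (schema validation, request dispatch), but the mapping from OpenAPI operations to callable tools is the core idea that lets Claude discover your endpoints.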
Additional notes
Tips and considerations:
- This project emphasizes real-time, low-latency MCP transport via WebSocket/SSE; ensure your Nginx/Nchan deployment is correctly configured in the Docker image used.
- The OpenAPI bridge allows you to automatically generate MCP tools from an OpenAPI spec, simplifying tool exposure to Claude.
- The HTTMCP CLI enables one-click deployment of GPTs Actions to the MCP server; install it with `pip install "httmcp[cli]"` (quoting the extras so your shell does not expand the brackets) and run the provided command to publish actions.
- If you encounter connection drops or long-running tasks, verify your Docker resource limits and network configuration; the architecture is designed for high concurrency but requires proper container sizing.
- Environment variables can be used to tweak endpoints, OpenAPI publication targets, and Docker behavior (example: OPENAPI_URL, PUBLISH_SERVER).
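As a sketch of the last point, a Docker Compose override might wire in the example variables named above. The service name and values here are hypothetical; the exact variables your deployment honors depend on the image and its configuration:

```yaml
# Hypothetical docker-compose override; OPENAPI_URL and PUBLISH_SERVER
# are the example variables mentioned above, with placeholder values.
services:
  gateway:
    environment:
      OPENAPI_URL: "https://example.com/openapi.json"
      PUBLISH_SERVER: "http://nchan:80"
```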