replicate-streamable

MCP server for interacting with public models on Replicate. Written in TypeScript for Node.js, using the Hono web framework.

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio iceener-replicate-streamable-mcp-server node src/server.js \
  --env PORT="3000" \
  --env API_KEY="your-random-auth-token" \
  --env REPLICATE_API_TOKEN="r8_your_replicate_token_here"

How to use

Replicate MCP Server provides two focused tools tailored for AI image generation workflows: search_models and generate_image. It exposes a compact API surface designed for LLM-powered agents: you can discover supported image-generation models along with their exact input schemas, then run predictions to generate or edit images. The server authenticates clients with a simple Bearer token or an Api-Key header, and it communicates with Replicate using your API token to fetch model schemas and perform predictions. When integrating with an assistant, you should predefine preferred models and common parameters (such as aspect ratios) to minimize back-and-forth tool calls and produce markdown-ready results with expiring image URLs.
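The authentication described above can be sketched as a direct HTTP call to the MCP endpoint. This is a minimal illustration, assuming the local development defaults (port 3000, the `/mcp` path) and a placeholder token; the JSON-RPC envelope follows the standard MCP `tools/list` request.

```typescript
// Minimal sketch of an authenticated request to the local MCP endpoint.
// The /mcp path, port, and Bearer/Api-Key auth options come from this
// README; the token value is a placeholder.
const MCP_URL = "http://127.0.0.1:3000/mcp";
const API_KEY = "your-random-auth-token";

// Standard JSON-RPC 2.0 envelope asking the server to list its tools.
const toolsListRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/list",
  params: {},
};

async function listTools(): Promise<unknown> {
  const res = await fetch(MCP_URL, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // The server also accepts an Api-Key header instead of Bearer auth.
      Authorization: `Bearer ${API_KEY}`,
    },
    body: JSON.stringify(toolsListRequest),
  });
  return res.json();
}
```

MCP clients such as Claude Code send these envelopes for you; the raw form is mainly useful for debugging with curl or fetch.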

To use the tools, first search models via search_models to retrieve models and their required inputs. Then invoke generate_image with the chosen model and an input payload that matches the model’s schema. The response includes a generated image URL (expiring in a short window) and a summary of the run parameters, enabling you to relay results back to users in a concise, human-friendly format. The design emphasizes schema-awareness and clarity, so the agent always receives concrete parameter details for each model.
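The two-step flow above can be sketched as a pair of `tools/call` payloads. The argument names (`query`, `model`, `input`) and the example model slug are assumptions for illustration; confirm the real parameter names against the schemas that `search_models` returns.

```typescript
// Hypothetical tools/call payload for discovering models.
const searchCall = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "search_models",
    arguments: { query: "image editing" }, // hypothetical argument name
  },
};

// Hypothetical tools/call payload for running a prediction. The model
// slug and input fields are examples; use values from search_models.
const generateCall = {
  jsonrpc: "2.0",
  id: 2,
  method: "tools/call",
  params: {
    name: "generate_image",
    arguments: {
      model: "black-forest-labs/flux-schnell",
      input: { prompt: "a lighthouse at dawn", aspect_ratio: "16:9" },
    },
  },
};
```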

How to install

Prerequisites:

  • Node.js environment (or Bun) and a Git client
  • A Replicate account with an API token
  • (Optional) Cloudflare Workers environment if you plan to deploy remotely

Step-by-step:

  1. Clone the repository

    git clone https://github.com/iceener/replicate-streamable-mcp-server.git
    cd replicate-streamable-mcp-server

  2. Install dependencies

    If using Bun (recommended by the project):

    bun install

    If using npm/yarn as an alternative:

    npm install

  3. Create and configure environment

    • Copy the example env file:

    cp env.example .env

    Then open .env in your editor and ensure the following variables are set:

    PORT=3000
    API_KEY=your-random-auth-token
    REPLICATE_API_TOKEN=<your-replicate-api-token>

  4. Run the server (local development)

    bun dev

    MCP endpoint will be available at: http://127.0.0.1:3000/mcp

  5. Optional: test with MCP Inspector or a client configuration

    bunx @modelcontextprotocol/inspector

    Connect to: http://localhost:3000/mcp (local)

  6. If you plan to deploy to Cloudflare Workers, follow the Cloudflare deployment steps in the README (wrangler setup, secrets, and deployment commands).

Additional notes

Tips and common considerations:

  • Use a strong API_KEY to prevent unauthorized access to the MCP interface.
  • Keep your REPLICATE_API_TOKEN secret; do not expose it in client configurations.
  • The mcpServers configuration shown in the README demonstrates a minimal setup. You can customize the server name and the command/args if you adapt to Bun-based execution or different runtimes.
  • When testing locally, ensure PORT and HOST are reachable by your client integration so that MCP requests are routed correctly.
  • The HTTP endpoints support POST /mcp for JSON-RPC 2.0 calls and GET /health for liveness checks.
  • If you switch between local development and Cloudflare Worker deployments, be mindful of CORS and authentication headers in client requests.
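The two HTTP endpoints noted above can be exercised with a short sketch. This assumes the local development base URL; GET /health needs no authentication payload and simply reports liveness.

```typescript
// Base URL assuming the local development defaults from this README.
const BASE_URL = "http://127.0.0.1:3000";

// Build the health-check URL for a given base.
function healthUrl(base: string): string {
  return `${base}/health`;
}

// Returns true when the server answers the liveness check with 2xx.
async function checkHealth(): Promise<boolean> {
  const res = await fetch(healthUrl(BASE_URL));
  return res.ok;
}
```

A failing health check is the quickest way to distinguish a server that is down from one rejecting your Bearer token on POST /mcp.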
