
linkly-ai-mcp-remote-worker

Establishes a tunnel that exposes Linkly AI’s local MCP service to the public network, so that remote MCP clients such as ChatGPT and Claude can invoke it.

Installation
Run this command in your terminal to add the MCP server to Claude Code:

claude mcp add --transport stdio linklyai-linkly-ai-mcp-remote-worker pnpm run dev

How to use

This MCP server implements Linkly AI's Remote MCP Worker, which exposes your local MCP (Model Context Protocol) endpoint to remote MCP clients through a Cloudflare Workers-based tunnel. The worker acts as a reverse proxy, using a WebSocket tunnel to securely bridge requests from remote clients (such as ChatGPT or Claude) to the local MCP server running on your desktop.

Once the worker is deployed and the tunnel is connected, remote MCP clients POST to /mcp on the worker URL. The worker forwards each request through the WebSocket to your desktop application, which relays it to your local MCP server (e.g., at 127.0.0.1:60606/mcp). The response flows back along the same path.

To use it, deploy the Cloudflare Worker, note the worker URL, and configure your MCP client to point at that URL with the /mcp path appended. You can also test the connection from Linkly AI Desktop by entering the worker URL (without the https:// prefix) in the MCP settings and verifying via the Test button.
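The request path above can be exercised by hand. The sketch below builds the kind of JSON-RPC 2.0 message a remote MCP client would POST to the worker's /mcp endpoint. The worker hostname is a placeholder assumption, and the curl call is left commented out until you have a deployed worker with a connected desktop tunnel:

```shell
# Placeholder worker URL (assumption) -- substitute your own deployment.
WORKER_URL="https://linkly-ai-mcp-remote.example.workers.dev"

# A minimal JSON-RPC 2.0 request body, as a remote MCP client would send it.
PAYLOAD='{"jsonrpc":"2.0","id":1,"method":"tools/list","params":{}}'
echo "$PAYLOAD"

# Once deployed and the desktop tunnel is connected, POST the same payload to
# the worker, which relays it to your local MCP server (e.g. 127.0.0.1:60606/mcp):
# curl -sS -X POST "$WORKER_URL/mcp" -H 'Content-Type: application/json' -d "$PAYLOAD"
```

The method shown (tools/list) is just a representative MCP method; which methods and protocol versions are accepted depends on your local MCP server.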

What you can do with this MCP server:

  • Expose a local MCP server behind Cloudflare Workers for remote access
  • Use WebSocket-based tunneling with Durable Objects to keep a persistent connection to your desktop
  • Configure MCP clients (Claude Desktop, Cursor, etc.) to target the remote endpoint and load MCP configurations from JSON files
  • Monitor status and health via the worker's /health endpoint and related API endpoints

How to install

Prerequisites:

  • Node.js (recommended latest LTS) and pnpm installed on your development machine
  • A Cloudflare account with Workers access and a domain or zone to deploy to (or use the workers.dev subdomain)
  • Wrangler CLI (via Cloudflare) for local testing and deployment

Installation steps:

  1. Clone the repository:

    git clone https://github.com/LinklyAI/linkly-ai-mcp-remote-worker.git
    cd linkly-ai-mcp-remote-worker

  2. Install dependencies:

    pnpm install

  3. Log in to Cloudflare via Wrangler:

    npx wrangler login

  4. Start the local development server (for testing):

    pnpm run dev

    This will start a local development server (e.g., http://localhost:8787) via Wrangler/Workers

  5. Deploy to Cloudflare Workers (requires a Cloudflare account):

    pnpm run deploy

  6. Retrieve the deployed worker URL from the Cloudflare dashboard or the output after deployment

    Example: https://linkly-ai-mcp-remote.your-namespace.workers.dev

  7. Configure Linkly AI Desktop and MCP clients

    • In Linkly AI Desktop: Settings > MCP, enter the worker URL (without https://), click Test to verify the connection, then Save
    • In Claude Desktop or Cursor, update the MCP configuration JSON to point to https://<worker-url>/mcp
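For step 7, the client-side configuration might look like the following. This is a sketch only: the server name is arbitrary, the hostname is a placeholder, and the exact JSON schema for remote servers (e.g., a url field versus a transport wrapper) varies between MCP clients, so check your client's documentation.

```json
{
  "mcpServers": {
    "linkly-ai-remote": {
      "url": "https://linkly-ai-mcp-remote.your-namespace.workers.dev/mcp"
    }
  }
}
```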

Notes:

  • When testing locally, ensure your local MCP server is running and reachable at the configured address (e.g., 127.0.0.1:60606/mcp)
  • The /health and /mcp endpoints are exposed by the deployed worker for monitoring and proxying MCP requests

Additional notes

Tips and common issues:

  • Authentication: The current worker version may not implement authentication; protect access using Cloudflare Access or API keys if needed for production.
  • WebSocket stability: The Durable Object tunnel can hibernate to save costs; ensure your desktop MCP client maintains a stable WebSocket connection for best reliability.
  • Endpoint configuration: Always include the full remote endpoint URL for MCP clients, with the /mcp path appended.
  • Costs: Cloudflare Durable Objects and WebSocket connections incur costs; review your plan limits for long-running usage.
  • Troubleshooting: If MCP requests fail, check the worker's /health endpoint, verify the WebSocket connection is established, and confirm your local MCP server is listening on the expected port.
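The troubleshooting checks above can be scripted. In this sketch the worker URL is a placeholder assumption and the /health call is commented out until you have a deployment; the local check simply confirms something is accepting connections at the address the desktop app forwards to:

```shell
# Placeholder values (assumptions) -- adjust for your deployment and port.
WORKER_URL="https://linkly-ai-mcp-remote.example.workers.dev"
LOCAL_MCP="http://127.0.0.1:60606/mcp"

# 1. Check the deployed worker's health endpoint (run once deployed):
# curl -sS "$WORKER_URL/health"

# 2. Confirm the local MCP server is reachable before suspecting the tunnel:
if curl -sS --max-time 2 -o /dev/null "$LOCAL_MCP" 2>/dev/null; then
  echo "local MCP endpoint reachable"
else
  echo "local MCP endpoint NOT reachable -- start your desktop MCP server first"
fi
```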
