
mcp-server-client-demo

A streamable-HTTP-based MCP server and client demo with an automatic tool registry, Dockerfile setup, and environment configuration.

Installation
Run the following command in your terminal to add the MCP server to Claude Code:

claude mcp add --transport stdio s1lv3rj1nx-mcp-server-client-demo docker run -i mcp-server-client-demo

How to use

This MCP server provides a stateless, streamable HTTP transport-based implementation of the Model Context Protocol (MCP) with an auto tool registry. It enables LLMs to connect to external data sources and tools in a scalable way. The server registers tools automatically via the @mcp_tool decorator, simplifying how you expose capabilities to clients. You can run the server locally or in Docker, and deploy it to cloud environments as part of a larger MCP-enabled pipeline.
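The auto-registry idea can be sketched in a few lines. This is a hypothetical illustration, not this repo's actual implementation: the names `mcp_tool`, `TOOL_REGISTRY`, and `add_numbers` are assumptions chosen for the example.

```python
# Hypothetical sketch of a decorator-based tool registry; names such as
# mcp_tool and TOOL_REGISTRY are illustrative, not this repo's actual API.
from typing import Callable, Dict

TOOL_REGISTRY: Dict[str, Callable] = {}

def mcp_tool(func: Callable) -> Callable:
    """Register a function as an MCP tool under its own name."""
    TOOL_REGISTRY[func.__name__] = func
    return func

@mcp_tool
def add_numbers(a: int, b: int) -> int:
    """Example tool: add two integers."""
    return a + b

# The server can now discover tools by name and dispatch calls to them:
result = TOOL_REGISTRY["add_numbers"](2, 3)  # 5
```

Because registration happens at import time, adding a capability is just decorating a new function; no separate registration step is needed.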

To use the server, start it using your preferred method (Docker or uv-based local run). Once running, clients (including an OpenAI SDK-based MCP client in this repository) can discover and invoke registered tools, pass context, and stream results back to the client. The client demo in this repo demonstrates how to integrate with an OpenAI-based workflow and interact with the server’s tool registry to perform tasks such as querying external data, invoking functions, and streaming responses in real time.
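On the wire, the streamable HTTP transport carries JSON-RPC 2.0 messages. A minimal sketch of building a `tools/call` request is below; the tool name, endpoint URL, and port are assumptions for illustration, and the actual client in this repo uses the OpenAI SDK rather than hand-built payloads.

```python
# Sketch of a JSON-RPC 2.0 tools/call request as used by MCP.
# The tool name and endpoint below are assumed for illustration.
import json
from typing import Any, Dict

def build_tools_call(tool_name: str,
                     arguments: Dict[str, Any],
                     request_id: int = 1) -> Dict[str, Any]:
    """Build a JSON-RPC 2.0 request invoking an MCP tool."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

payload = build_tools_call("add_numbers", {"a": 2, "b": 3})
body = json.dumps(payload)
# POST `body` to the server's MCP endpoint (e.g. http://localhost:8000/mcp --
# the exact host, port, and path depend on how you run this demo).
```

Because the transport is stateless, each request is self-contained; the server streams the response back over the same HTTP exchange.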

How to install

Prerequisites:

  • Docker (recommended for quick setup) or Python with uv (if you prefer running locally)
  • Git

Option A: Using Docker

  1. Install Docker from https://www.docker.com/
  2. Pull the demo image (or build if you have a Dockerfile in this repo): docker pull mcp-server-client-demo
  3. Run the server: docker run -i mcp-server-client-demo

Option B: Local development with uv (Python)

  1. Install uv: curl -LsSf https://astral.sh/uv/install.sh | sh
  2. Sync dependencies (from the repo root): uv sync
  3. Run the server locally (adjust as needed if there is a specific Python entrypoint in this repo): uv run

Option C: Dockerfile-based deployment (alternative guidance)

  1. Build the image using the repository’s Dockerfile: docker build -t mcp-server-client-demo .
  2. Run the container: docker run -i mcp-server-client-demo

Additional notes


  • This is a demo MCP server; for production, ensure you enable proper authentication, logging, and rate limiting as appropriate for your deployment.
  • The server auto-registers tools via the @mcp_tool decorator. To add new capabilities, declare them in your code with the decorator and rebuild/redeploy.
  • If you encounter networking issues with Docker, ensure the container has network access to required data sources and, if needed, expose the appropriate ports in your deployment configuration.
  • When using the client in this repository, consult the client/README.md for specific usage patterns with the OpenAI SDK and how to invoke registered tools.
  • Environment variables can be used for configuration (e.g., tool endpoints, API keys). Supply them via the mcp_config/env structure in your deployment, with descriptions or placeholder values as needed.
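A minimal sketch of reading such configuration at startup is below. The variable names (MCP_HOST, MCP_PORT, API_KEY) and defaults are assumptions for illustration; match them to the entries you define in your mcp_config/env structure.

```python
# Illustrative configuration loader; the variable names and defaults are
# assumptions, not this demo's documented settings.
import os

MCP_HOST = os.environ.get("MCP_HOST", "0.0.0.0")
MCP_PORT = int(os.environ.get("MCP_PORT", "8000"))
API_KEY = os.environ.get("API_KEY")  # no default: detect a missing secret

if API_KEY is None:
    # In a real deployment you would likely raise instead of printing.
    print("Warning: API_KEY is not set")
```

With Docker, such values are typically passed at run time, e.g. `docker run -i -e API_KEY=... mcp-server-client-demo`.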
