mcp-server-client-demo
A streamable HTTP-based MCP server and client demo with an auto tool registry, a Dockerfile setup, and environment configuration.
To add the server to Claude Code (running it over stdio in Docker): claude mcp add --transport stdio s1lv3rj1nx-mcp-server-client-demo -- docker run -i mcp-server-client-demo
How to use
This MCP server provides a stateless, streamable HTTP transport-based implementation of the Model Context Protocol (MCP) with an auto tool registry. It enables LLMs to connect to external data sources and tools in a scalable way. The server registers tools automatically via the @mcp_tool decorator, simplifying how you expose capabilities to clients. You can run the server locally or in Docker, and deploy it to cloud environments as part of a larger MCP-enabled pipeline.
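The auto registry pattern described above can be sketched as a decorator that records each function in a module-level table. This is an illustrative sketch only; the names (`TOOL_REGISTRY`, the decorator's internals) are assumptions, not this repository's actual `@mcp_tool` implementation.

```python
# Sketch of an auto tool registry: the decorator records each function
# under its own name so the server can discover tools without manual wiring.
from typing import Callable, Dict

TOOL_REGISTRY: Dict[str, Callable] = {}

def mcp_tool(func: Callable) -> Callable:
    """Register a function as an MCP tool keyed by its name."""
    TOOL_REGISTRY[func.__name__] = func
    return func

@mcp_tool
def add_numbers(a: int, b: int) -> int:
    """Example tool: add two integers."""
    return a + b
```

At startup the server can walk `TOOL_REGISTRY` to advertise every decorated function to clients, which is why adding a capability only requires declaring it with the decorator.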
To use the server, start it via Docker or a uv-based local run. Once it is running, clients (including the OpenAI SDK-based MCP client in this repository) can discover and invoke registered tools, pass context, and receive streamed results. The client demo shows how to integrate the server’s tool registry into an OpenAI-based workflow to perform tasks such as querying external data, invoking functions, and streaming responses in real time.
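Under the streamable HTTP transport, a tool invocation is a JSON-RPC 2.0 `tools/call` request posted to the server. The sketch below only builds the request payload to show its shape; the tool name and arguments are placeholders, and the actual endpoint path and headers depend on this repo's server configuration.

```python
# Build (but do not send) a JSON-RPC 2.0 "tools/call" request, the message
# an MCP client posts to a streamable HTTP server to invoke a tool.
import json

def build_tool_call(tool_name: str, arguments: dict, request_id: int = 1) -> str:
    """Serialize an MCP tools/call request as a JSON-RPC 2.0 string."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

payload = build_tool_call("add_numbers", {"a": 2, "b": 3})
```

A client would POST this payload to the server's MCP endpoint and read the response, which may arrive as a single JSON body or as a server-sent event stream.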
How to install
Prerequisites:
- Docker (recommended for quick setup) or Python with uv (if you prefer running locally)
- Git
Option A: Using Docker
- Install Docker from https://www.docker.com/
- Pull the demo image (or build if you have a Dockerfile in this repo): docker pull mcp-server-client-demo
- Run the server: docker run -i mcp-server-client-demo
Option B: Local development with uv (Python)
- Install uv: curl -LsSf https://astral.sh/uv/install.sh | sh
- Sync dependencies (from the repo root): uv sync
- Run the server locally (adjust as needed if there is a specific Python entrypoint in this repo): uv run
Option C: Dockerfile-based deployment (alternative guidance)
- Build the image using the repository’s Dockerfile: docker build -t mcp-server-client-demo .
- Run the container: docker run -i mcp-server-client-demo
Additional notes
- This is a demo MCP server; for production, ensure you enable proper authentication, logging, and rate limiting as appropriate for your deployment.
- The server auto-registers tools via the @mcp_tool decorator. To add new capabilities, declare them in your code with the decorator and rebuild/redeploy.
- If you encounter networking issues with Docker, ensure the container has network access to required data sources and, if needed, expose the appropriate ports in your deployment configuration.
- When using the client in this repository, consult the client/README.md for specific usage patterns with the OpenAI SDK and how to invoke registered tools.
- Environment variables can be added for configuration (e.g., tool endpoints, API keys). Use the mcp_config/env structure in your deployment to provide descriptions or placeholders.
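As a minimal sketch of the environment-variable configuration mentioned above, a server might read its settings like this. The variable names (`MCP_HOST`, `MCP_PORT`, `MCP_API_KEY`) are placeholders, not the keys from this repo's mcp_config/env structure.

```python
# Read deployment configuration from environment variables, with safe
# defaults for non-secret values and no default for the API key.
import os

def load_config() -> dict:
    """Collect server settings from the environment."""
    return {
        "host": os.environ.get("MCP_HOST", "0.0.0.0"),
        "port": int(os.environ.get("MCP_PORT", "8000")),
        # Secrets get no default: a missing key should fail loudly downstream.
        "api_key": os.environ.get("MCP_API_KEY"),
    }
```

Keeping defaults only for non-sensitive values lets the same image run unchanged across environments while forcing secrets to be provided explicitly at deploy time.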