mcp-servers
MCP server from dimetron/mcp-servers
claude mcp add --transport stdio dimetron-mcp-servers -- docker run -i --rm image-name-for-time
How to use
This MCP server collection provides three specialized MCP servers designed to demonstrate core capabilities of the Model Context Protocol within the MCP Playground. The memory server offers a knowledge-graph based persistent memory layer that lets Claude or other agents remember information across conversations. The time server provides timezone-aware time lookups and conversions using IANA names, useful for orchestrating time-relative reasoning. The sequentialthinking server facilitates structured problem solving with step-by-step thinking, revision, and branching to explore alternative solution paths. Each server is designed to integrate with the central Agent Gateway and can be composed with various SDK examples to build end-to-end multi-agent workflows.
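To illustrate the kind of conversion the time server performs, here is a sketch using Python's standard zoneinfo module (illustrative only, not the server's own code; the function name is hypothetical):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Convert a wall-clock time from one IANA zone to another --
# the kind of lookup the time server exposes as an MCP tool.
def convert(ts: str, src: str, dst: str) -> str:
    naive = datetime.fromisoformat(ts)
    localized = naive.replace(tzinfo=ZoneInfo(src))
    return localized.astimezone(ZoneInfo(dst)).isoformat()

print(convert("2024-07-01T09:00:00", "America/New_York", "Europe/Berlin"))
# → 2024-07-01T15:00:00+02:00 (EDT is UTC-4, CEST is UTC+2)
```

Using IANA zone names (rather than fixed offsets) is what lets the server handle daylight-saving transitions correctly.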
To use these servers, deploy them via Docker (as described in the installation instructions), then connect your MCP-enabled agents to the corresponding server endpoints through the MCP Gateway. The Memory server will persist entities, relations, and observations to support longer context retention across sessions. The Time server can answer time queries and perform conversions between time zones. The SequentialThinking server allows agents to decompose problems into steps, revise approaches, and explore branching solutions. When combined with the provided SDK examples (Python, Go, Java) and the Docker-based infrastructure, you can prototype multi-agent collaboration workflows such as fact-checking, marketing strategy generation, and complex reasoning tasks.
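The memory server's knowledge graph can be pictured as entities linked by typed relations, each entity carrying free-text observations. A minimal sketch of that data model (illustrative Python, not the server's actual schema; all names are assumptions):

```python
from dataclasses import dataclass, field

# Illustrative model of the memory server's knowledge graph:
# entities carry observations; relations link entity names.
@dataclass
class Entity:
    name: str
    entity_type: str
    observations: list[str] = field(default_factory=list)

@dataclass
class Relation:
    source: str
    target: str
    relation_type: str

entities = {
    "Alice": Entity("Alice", "person", ["prefers morning meetings"]),
    "Acme": Entity("Acme", "company"),
}
relations = [Relation("Alice", "Acme", "works_at")]

# A lookup an agent might perform in a later session:
employer = next(r.target for r in relations
                if r.source == "Alice" and r.relation_type == "works_at")
print(employer)  # → Acme
```

Because the graph persists across sessions, an agent can recover facts like this long after the conversation that produced them.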
How to install
Prerequisites:
- Docker Desktop 4.43.0+ or Docker Engine
- Docker Compose 2.38.1+ (for Linux Docker Engine users)
- GPU-enabled system recommended for local model inference (if using GPU-backed inference via your chosen backends)
Installation steps:
- Clone or download the MCP Playground repository to your local machine.
- Ensure Docker and Docker Compose are installed and available in your PATH.
- Build and run the multi-component environment:
  docker compose up --build
- If you want to build individual MCP server images manually, build each one from the repository root, passing the corresponding directory as the build context:
  docker build -t mcp/memory docker/mcp/memory
  docker build -t mcp/time docker/mcp/time
  docker build -t mcp/sequentialthinking docker/mcp/sequentialthinking
- Access the Agent Gateway UI at the configured port (as described in the repository's infrastructure docs) and verify that the MCP servers are connected and functional.
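One way to check that a stdio MCP server responds is to send it the JSON-RPC `initialize` request a client issues first. The sketch below builds that payload (the protocolVersion value and client name are assumptions; consult the MCP specification for the current version):

```python
import json

# First message an MCP client sends over stdio (JSON-RPC 2.0).
# protocolVersion shown is an assumption; check the MCP spec.
initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "playground-check", "version": "0.1"},
    },
}
line = json.dumps(initialize)
# Pipe `line` into e.g. `docker run -i mcp/time` and expect a JSON-RPC
# response with a matching "id" and the server's declared capabilities.
print(line)
```

A server that fails to answer this handshake is not correctly wired to its stdio transport.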
Additional notes
- The MCP servers run as Docker containers and are orchestrated via the central Agent Gateway (UI on port 15000, MCP on port 10000 per the project setup).
- When customizing, you may replace image-name-for-memory, image-name-for-time, and image-name-for-sequentialthinking with your actual image tags if you maintain your own builds.
- If you encounter connectivity issues between agents and the gateway, ensure docker networking allows the containers to reach the gateway’s exposed ports and that the MCP endpoints are correctly configured.
- For inference backends, you can switch between Local Models, OpenAI integration, or Docker offload by adjusting environment variables or runtime options as described in the readme section on Inference Options.
- Regularly prune unused images to free up disk space using the provided cleanup commands in the repository.
- Review individual component READMEs for detailed architecture, interaction examples, and customization options.
Related MCP Servers
mcp-vegalite
MCP server from isaacwasserman/mcp-vegalite-server
github-chat
A Model Context Protocol (MCP) server for analyzing and querying GitHub repositories using the GitHub Chat API.
nautex
MCP server for guiding coding agents through an end-to-end requirements-to-implementation-plan pipeline
pagerduty
PagerDuty's official local MCP (Model Context Protocol) server which provides tools to interact with your PagerDuty account directly from your MCP-enabled client.
futu-stock
MCP server for Futu NiuNiu stock
mcp-boilerplate
Boilerplate using one of the 'better' ways to build MCP Servers. Written using FastMCP