MCP-Client-Demo
Demo project for an MCP Spring client/server setup
claude mcp add --transport stdio jsiggelkow-mcp-client-server-demo \
  --env OPENAI_API_KEY="your-openai-api-key" \
  --env SPRING_PROFILES_ACTIVE="default" \
  -- docker compose up -d
How to use
This project demonstrates a simple MCP client/server architecture built with Java 21 and Spring Boot. It shows how a client component connects to a central MCP server over REST endpoints, with Spring AI integration providing AI-assisted capabilities. The demo is packaged to run in Docker; the client is configured to reach the server at the mcp-server hostname on port 8080. After starting the containers, you can interact with the exposed MCP endpoints through the server's REST API and experiment with the AI-powered features that bridge client requests to the server-side logic.
To use the tools and capabilities, start the Docker-based environment from the repository root. The provided docker-compose setup spins up the client and server components in a shared network, with the client using the server URL http://mcp-server:8080 as defined in its configuration. Once running, you can send MCP commands (e.g., request/response flows or task execution) via the server API and observe how the client sends and receives data, with the OpenAI integration enabling AI-assisted decision making on the server side.
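The shared-network topology described above might look like the following docker-compose sketch. Service names, image names, the network name, and the MCP_SERVER_URL variable are assumptions for illustration; the repository's actual docker-compose.yml may differ.

```yaml
# Hypothetical docker-compose.yml sketch: both services on one network,
# so the client can resolve the server as http://mcp-server:8080.
services:
  mcp-server:
    image: jsiggelkow/mcp-server-demo      # assumed image name
    ports:
      - "8080:8080"
    environment:
      OPENAI_API_KEY: ${OPENAI_API_KEY}    # forwarded from the host shell
    networks:
      - mcp-net
  mcp-client:
    image: jsiggelkow/mcp-client-demo      # assumed image name
    environment:
      MCP_SERVER_URL: http://mcp-server:8080   # matches the client configuration
    depends_on:
      - mcp-server
    networks:
      - mcp-net
networks:
  mcp-net:
```

The service name `mcp-server` doubles as the DNS hostname inside the compose network, which is why the client's configured URL works without any host mapping.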
How to install
Prerequisites:
- Docker and Docker Compose installed on your system
- Java 21 and Maven/Gradle are optional for local builds, but the demo is designed to run via Docker
- An OpenAI API key if you wish to enable AI features
Installation steps:
# 1) Clone the repository
git clone <repository-url> && cd <repo-folder>
# 2) Ensure Docker Desktop is running
# 3) Create or update environment/secret values as needed (see notes below)
Configuration steps (OpenAI API key):
# The Spring Boot app reads the OpenAI key from configuration or the
# environment. If using a local config file, place the key where the
# app expects it; docker-compose can also pass it to the containers
# as an environment variable, e.g. in a service's environment section:
OPENAI_API_KEY: your-openai-api-key
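If the key is wired through application configuration instead of the compose file, a minimal sketch might look like this, assuming the standard Spring AI OpenAI starter property (whether this demo uses that exact property is an assumption):

```yaml
# application.yml (sketch): Spring AI resolves the OpenAI key from the
# OPENAI_API_KEY environment variable; no default is provided on purpose.
spring:
  ai:
    openai:
      api-key: ${OPENAI_API_KEY}
```

Keeping the key out of the file and resolving it from the environment avoids committing secrets to the repository.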
Run the application stack:
# From the repository root
docker compose up -d
Verify:
# Check container status
docker ps
# Optional: tail logs if needed
docker logs -f <container-id-or-name>
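Beyond checking container status, you can probe the server directly. The sketch below assumes port 8080 is published to the host and that Spring Boot Actuator is on the classpath; both are assumptions about this demo.

```shell
# Probe the (assumed) Spring Boot Actuator health endpoint on the
# host-mapped port; a healthy server returns an UP status.
curl -s http://localhost:8080/actuator/health
```

If Actuator is not enabled, any known REST endpoint of the server can serve the same smoke-test purpose.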
Additional notes
- The client is configured to reach the server at http://mcp-server:8080 within the Docker network. If you customize hostnames or ports, update the client configuration accordingly.
- OpenAI API key can be provided in application configuration or via environment variables (OPENAI_API_KEY) depending on how you deploy the stack.
- If you encounter network or startup issues, ensure the docker-compose services have healthy dependencies and that the server is reachable at the expected URL within the container network.
- This is a demonstration project; for production, consider adding proper secret management, health checks, and authentication for MCP endpoints.
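The startup-ordering note above can be addressed in compose with a healthcheck, so the client only starts once the server is actually reachable. The endpoint, intervals, and presence of curl in the server image are assumptions:

```yaml
# Hypothetical healthcheck sketch: the client waits for a healthy server.
services:
  mcp-server:
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:8080/actuator/health"]
      interval: 10s
      timeout: 3s
      retries: 5
  mcp-client:
    depends_on:
      mcp-server:
        condition: service_healthy
```

A plain `depends_on` only orders container start, not readiness; the `service_healthy` condition is what makes the client wait for the check to pass.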