
mcp-client-azure

MCP server from sp1986sp/mcp-client-server-azure

Installation
Run the following command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio sp1986sp-mcp-client-server-azure docker run -i mcp-client-azure
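After registering, you can confirm the entry was added. This sketch assumes the `claude mcp list` subcommand is available in your version of Claude Code:

```shell
# List configured MCP servers and look for the newly added entry.
# If it is not found, the fallback message is printed instead.
claude mcp list | grep sp1986sp-mcp-client-server-azure \
  || echo "server not registered yet"
```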

How to use

This MCP server demonstrates a modular, context-aware client-server architecture built with Spring Boot and Spring AI, integrated with Azure OpenAI models. The project consists of two Maven modules: a chat client and an AI-powered server. The client exposes a /chat endpoint and forwards prompts to the server, propagating headers and context. The server registers and executes AI-powered tools (such as user management) and propagates context across asynchronous operations, enabling distributed tracing and consistent user experiences.

To use it, first ensure you have Java 17+, Maven, and an Azure OpenAI key and endpoint. After building, run the server and the client to see a collaborative chat workflow in which the client queries AI-powered tools via the server, with context preserved across threads and requests.

How to install

Prerequisites:

  • Java 17+
  • Maven 3.8+
  • Azure OpenAI API key and endpoint

Install steps:

  1. Clone the repository:

     git clone <repo-url>
     cd mcp-client-server-azure

  2. Build the multi-module project:

     mvn clean install

  3. Run the server (from the project root or using the Maven wrapper):

     cd server
     ./mvnw spring-boot:run

     The server runs on http://localhost:8081 by default.

  4. Run the client (in a new terminal):

     cd ../client
     ./mvnw spring-boot:run

  5. Open the client API and test the chat flow at http://localhost:8080/chat (or the client's configured port).
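Once both modules are running, the chat flow can be exercised from the command line. The JSON payload shape and the extra header below are assumptions for illustration; adjust them to the actual request contract of the client's /chat endpoint:

```shell
# Send a prompt to the client; it forwards the request (with headers
# and context) to the AI-powered server. --max-time keeps the call
# from hanging if the services are not up yet.
curl -s --max-time 10 -X POST http://localhost:8080/chat \
  -H "Content-Type: application/json" \
  -H "X-Request-Id: demo-123" \
  -d '{"message": "List all users"}' \
  || echo "client not reachable on port 8080"
```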

Configuration notes:

  • Update client properties with your Azure OpenAI credentials in client/src/main/resources/application.properties
  • Update server properties as needed in server/src/main/resources/application.properties
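As a minimal sketch, the client's Azure OpenAI settings can be appended from the shell. The spring.ai.azure.openai.api-key and spring.ai.azure.openai.endpoint keys come from this project's notes; the placeholder values are illustrative and must be replaced with your own credentials:

```shell
# Append Azure OpenAI settings to the client's application.properties.
# Placeholder values are illustrative -- substitute your own credentials.
mkdir -p client/src/main/resources
cat >> client/src/main/resources/application.properties <<'EOF'
spring.ai.azure.openai.api-key=your-azure-openai-api-key
spring.ai.azure.openai.endpoint=https://your-resource.openai.azure.com/
EOF

# Verify the keys were written.
grep 'spring.ai.azure.openai' client/src/main/resources/application.properties
```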

Additional notes

Tips and caveats:

  • Ensure your Azure OpenAI API key and endpoint are correctly configured in the client properties (spring.ai.azure.openai.api-key and spring.ai.azure.openai.endpoint).
  • The system propagates context (headers, MDC, locale, and request attributes) across async boundaries, which is crucial for observability and debugging.
  • If you encounter port conflicts, adjust server.port in server/src/main/resources/application.properties.
  • The architecture supports dynamic tool registration and context-aware responses; explore the server's UserService endpoints for CRUD operations and search.
  • For containerized usage, you can adapt the Docker image name to your own registry and adjust volumes/secrets for Azure credentials.
  • If you modify context propagation components, validate that headers and MDC values propagate through async task execution and tool calls.
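For the containerized route mentioned above, credentials can be injected as environment variables instead of baked-in properties. The env var names rely on Spring Boot's relaxed binding, and the image name is illustrative; adapt both to your registry and configuration:

```shell
# Run the image with Azure credentials injected as environment variables.
# SPRING_AI_AZURE_OPENAI_APIKEY maps to spring.ai.azure.openai.api-key
# via Spring Boot's relaxed binding (assumption -- verify against your
# Spring Boot version). The image name is a placeholder.
docker run -i \
  -e SPRING_AI_AZURE_OPENAI_APIKEY="your-azure-openai-api-key" \
  -e SPRING_AI_AZURE_OPENAI_ENDPOINT="https://your-resource.openai.azure.com/" \
  mcp-client-azure \
  || echo "docker not available or image missing"
```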
