
spring-ai-summary

Tags: SpringAI, LLM, MCP, Embedding

Installation
Run this command in your terminal to add the MCP server to Claude Code (the `--` separator ensures the `-i` flag is passed to docker rather than parsed by Claude Code):

claude mcp add --transport stdio java-ai-tech-spring-ai-summary -- docker run -i java-ai-tech/spring-ai-summary:latest

How to use

Spring AI Summary is a modular MCP server example built on the Spring AI framework. It aggregates and demonstrates core Spring AI capabilities across multiple modules, including chat interactions, tool invocation, vector store and database integration, and advanced agent patterns. The server is designed to expose endpoints and utilities through a Spring Boot application, enabling you to experiment with conversation flows, tool calls, and retrieval-augmented generation (RAG) patterns. Once running, you can interact with the provided modules or inspect actuator endpoints to monitor metrics and health as you build AI-enabled applications.

Typical usage involves starting the server (via Docker, or by running the packaged JAR in a Java environment) and then using HTTP clients (curl, HTTPie, Postman) to exercise the available endpoints. The README highlights modules such as spring-ai-chat and spring-ai-tool-calling, which illustrate how to deploy a chat interface and invoke external tools from within a conversational agent. The project also points to a wiki with module-specific READMEs for deeper exploration of configuration and usage, making it a practical reference for implementing MCP Client-Server patterns with Spring AI.
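As a quick smoke test once a module is running, you can query the Spring Boot actuator health endpoint with curl. The port and actuator path below are assumptions (8081 is the default mentioned in the notes; actuator must be enabled in the module's configuration), so check the module's README for the actual values:

```shell
# Base URL of a locally running module; override BASE_URL if yours differs.
BASE_URL="${BASE_URL:-http://localhost:8081}"

# Query the actuator health endpoint; if the server is not up yet,
# fall back to a placeholder so the check still produces output.
response=$(curl -s --max-time 5 "$BASE_URL/actuator/health" || echo '{"status":"UNREACHABLE"}')
echo "$response"
```

A healthy module typically responds with `{"status":"UP"}`; anything else suggests the module has not finished starting or the port is wrong.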

How to install

Prerequisites:

  • JDK 21+ (or any Java 21-compatible runtime)
  • Maven 3.6+ (mvnd recommended for faster builds)
  • Docker (optional, for containerized runs)
  • Internet access for dependencies

Installation steps:

  1. Clone the repository
git clone https://github.com/java-ai-tech/spring-ai-summary.git
cd spring-ai-summary
  2. Build the project (skip tests for faster iteration)
# Using Maven
mvn clean package -DskipTests
  3. Run the application (examples)
  • To run locally without Docker:
# After building, navigate to the desired module and run the Spring Boot app
# Example (adjust the path/module as needed):
java -jar spring-ai-summary/spring-ai-chat/spring-ai-chat-deepseek/target/spring-ai-chat-deepseek-1.0.0.jar
  • To run in a container (Docker):
docker build -t spring-ai-summary:latest .
docker run -i --rm -p 8081:8081 spring-ai-summary:latest
  4. Configure per-module API keys (the README shows an example for a DeepSeek module). Set environment variables in your runtime configuration, such as:
SPRING_AI_DEEPSEEK_API_KEY=your_api_key_here
SPRING_AI_DEEPSEEK_BASE_URL=https://api.deepseek.com
SPRING_AI_DEEPSEEK_CHAT_COMPLETIONS_PATH=/v1/chat/completions
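In a shell session, for example, these variables can be exported before launching the module's jar; exported variables are inherited by the Spring Boot process started from that shell. The key value below is a placeholder:

```shell
# Variable names come from the README; replace the placeholder key with your own
# before launching the module (e.g., with java -jar as shown above).
export SPRING_AI_DEEPSEEK_API_KEY="your_api_key_here"
export SPRING_AI_DEEPSEEK_BASE_URL="https://api.deepseek.com"
export SPRING_AI_DEEPSEEK_CHAT_COMPLETIONS_PATH="/v1/chat/completions"

# Sanity-check that the variables are set and visible to child processes.
env | grep '^SPRING_AI_DEEPSEEK_'
```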

Refer to the wiki and module READMEs for module-specific configuration details and additional environment variables.

Additional notes


  • The project encourages storing API keys in environment variables to avoid leaking secrets in code.
  • If Maven dependency downloads are slow, consider using a regional Maven mirror (e.g., Alibaba Cloud or Tsinghua, for users in China) as noted in the README.
  • Docker usage may require configuring ports (default 8081 for some Spring Boot modules) and exposing necessary service endpoints for interacting with modules like spring-ai-chat-deepseek.
  • The README references a Wiki page with extended guidance; consult it for module-specific setup, examples, and troubleshooting.
  • When running via Docker, ensure you pull the latest image or build locally to reflect recent changes from the repository.
  • If you encounter port conflicts, adjust the server port in application configuration or runtime arguments accordingly.
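For example, a module's port can be overridden without editing configuration files by exporting Spring Boot's standard SERVER_PORT environment variable (relaxed binding for `server.port`, equivalent to passing `--server.port` on the command line) before launching the jar:

```shell
# If 8081 is already taken, pick a free port via Spring Boot's
# SERVER_PORT relaxed binding before starting the module.
export SERVER_PORT=9090
echo "module will listen on port $SERVER_PORT"
```

When running under Docker, you can instead remap the host side of the port binding, e.g. `-p 9090:8081`, and leave the container's configuration untouched.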
