spring-ai-summary
SpringAI,LLM,MCP,Embedding
claude mcp add --transport stdio java-ai-tech-spring-ai-summary docker run -i java-ai-tech/spring-ai-summary:latest
How to use
Spring AI Summary is a modular MCP server example built on the Spring AI framework. It aggregates and demonstrates core Spring AI capabilities across multiple modules, including chat interactions, tool invocation, vector database integration, and advanced agent patterns. The server exposes endpoints and utilities through a Spring Boot application, enabling you to experiment with conversation flows, tool calls, and retrieval-augmented generation (RAG) patterns. Once running, you can interact with the provided modules or inspect actuator endpoints to monitor metrics and health as you build AI-enabled applications.
Typical usage involves starting the server (via Docker, or by running the packaged JAR in a Java environment) and then using HTTP clients (curl, HTTPie, Postman) to exercise the available endpoints. The README highlights modules such as spring-ai-chat and spring-ai-tool-calling, which illustrate how to deploy a chat interface and invoke external tools from within a conversational agent. The project also points to a wiki with module-specific READMEs for deeper exploration of configuration and usage, making it a practical reference for implementing MCP Client-Server patterns with Spring AI.
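For example, once a chat module is up you can exercise it from the command line. This is a hedged sketch: the `/chat` path, port 8081, and payload shape are assumptions, so check the module's README for its actual endpoint contract.

```shell
# Assumed endpoint and payload; verify against the module's README.
PORT=8081
PAYLOAD='{"message": "Summarize what Spring AI tool calling does."}'

# Call the (hypothetical) chat endpoint; falls back to a notice if
# the server is not running yet.
curl -s -X POST "http://localhost:${PORT}/chat" \
  -H 'Content-Type: application/json' \
  -d "$PAYLOAD" || echo "chat endpoint not reachable; is the server running?"

# Spring Boot actuator health check (provided by the actuator starter):
curl -s "http://localhost:${PORT}/actuator/health" || true
```

The same pattern works from HTTPie or Postman; only the request syntax changes.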
How to install
Prerequisites:
- Java JDK 21+ (or a Java 21 compatible runtime)
- Maven 3.6+ (mvnd recommended for faster builds)
- Docker (optional, for containerized runs)
- Internet access for dependencies
Installation steps:
- Clone the repository
git clone https://github.com/java-ai-tech/spring-ai-summary.git
cd spring-ai-summary
- Build the project (skip tests for faster iteration)
# Using Maven
mvn clean package -DskipTests
- Run the application (examples)
- If you prefer running locally without Docker:
# After building, navigate to the desired module and run the Spring Boot app
# Example (adjust path/module as needed):
java -jar spring-ai-chat/spring-ai-chat-deepseek/target/spring-ai-chat-deepseek-1.0.0.jar
- If you prefer containerized execution (Docker):
docker build -t spring-ai-summary:latest .
docker run -i --rm -p 8081:8081 spring-ai-summary:latest
- Configure per-module API keys (example shown in the README for a DeepSeek module). Set environment variables in your runtime configuration, such as:
SPRING_AI_DEEPSEEK_API_KEY=your_api_key_here
SPRING_AI_DEEPSEEK_BASE_URL=https://api.deepseek.com
SPRING_AI_DEEPSEEK_CHAT_COMPLETIONS_PATH=/v1/chat/completions
Refer to the wiki and module READMEs for module-specific configuration details and additional environment variables.
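As a sketch of wiring these variables into a containerized run (the `-e` pass-through flags assume the image reads the same variable names; verify per module):

```shell
# Export the DeepSeek settings from the README example; replace the
# placeholder key with a real one.
export SPRING_AI_DEEPSEEK_API_KEY="your_api_key_here"
export SPRING_AI_DEEPSEEK_BASE_URL="https://api.deepseek.com"
export SPRING_AI_DEEPSEEK_CHAT_COMPLETIONS_PATH="/v1/chat/completions"

# Pass them through to the container (command shown for reference):
#   docker run -i --rm -p 8081:8081 \
#     -e SPRING_AI_DEEPSEEK_API_KEY \
#     -e SPRING_AI_DEEPSEEK_BASE_URL \
#     -e SPRING_AI_DEEPSEEK_CHAT_COMPLETIONS_PATH \
#     spring-ai-summary:latest
```

Exporting the values once and forwarding them by name keeps the secret out of your shell history and out of the image itself.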
Additional notes
Notes and tips:
- The project encourages storing API keys in environment variables to avoid leaking secrets in code.
- If Maven dependency downloads are slow, consider using a regional Maven mirror (e.g., Alibaba Cloud or Tsinghua, as noted in the README).
- Docker usage may require configuring ports (default 8081 for some Spring Boot modules) and exposing necessary service endpoints for interacting with modules like spring-ai-chat-deepseek.
- The README references a Wiki page with extended guidance; consult it for module-specific setup, examples, and troubleshooting.
- When running via Docker, ensure you pull the latest image or build locally to reflect recent changes from the repository.
- If you encounter port conflicts, adjust the server port in application configuration or runtime arguments accordingly.
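For the port-conflict tip, Spring Boot accepts a `--server.port` override at launch, and Docker can remap the host-side port instead. Both commands below are illustrative sketches (the jar path is hypothetical):

```shell
# Option 1: override Spring Boot's embedded server port at launch
# (jar path is illustrative):
#   java -jar spring-ai-chat-deepseek-1.0.0.jar --server.port=9090

# Option 2: leave the container listening on 8081 and remap the host port:
#   docker run -i --rm -p 9090:8081 spring-ai-summary:latest

# Clients then target the remapped host port:
HOST_PORT=9090
echo "http://localhost:${HOST_PORT}/actuator/health"
```

Option 2 is usually simpler with the provided image, since it requires no change to the application configuration.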
Related MCP Servers
FastGPT
FastGPT is a knowledge-based platform built on LLMs. It offers a comprehensive suite of out-of-the-box capabilities, such as data processing, RAG retrieval, and visual AI workflow orchestration, letting you easily develop and deploy complex question-answering systems without extensive setup or configuration.
SearChat
Search + Chat = SearChat (AI chat with search). Supports the OpenAI/Anthropic/VertexAI/Gemini APIs, DeepResearch, the SearXNG metasearch engine, and one-command Docker deployment.
ai4j
A Java SDK for quickly integrating large language models into applications. It unifies models from multiple platforms, such as OpenAI, Zhipu (ChatGLM), DeepSeek, Moonshot (Kimi), Tencent Hunyuan, and 01.AI, behind a single OpenAI-aligned input/output interface that smooths over provider differences. It also optimizes function calling (Tool Call) and RAG invocation, supports vector databases (Pinecone), includes built-in web-search augmentation, and runs on JDK 1.8, giving users a fast path to AI integration.
daan
✨Lightweight LLM Client with MCP 🔌 & Characters 👤
AgentNexus
Multi-Agent, MCP, RAG, Spring AI 1.0.0, ReAct
DeepCo
A Chat Client for LLMs, written in Compose Multiplatform.