
spring-ai-playground

Spring AI Playground is a self-hosted web UI for low-code AI tool development with live MCP server registration. It includes MCP server inspection, agentic chat, and integrated LLM and RAG workflows, enabling real-time experimentation and evolution of tool-enabled AI systems without redeployment.

Installation
Run this command in your terminal to add the MCP server to Claude Code.
claude mcp add --transport stdio spring-ai-community-spring-ai-playground -- docker run -i --rm -p 8282:8282 --name spring-ai-playground -e SPRING_AI_OLLAMA_BASE_URL=http://host.docker.internal:11434 -v spring-ai-playground:/home jmlab/spring-ai-playground:latest

How to use

Spring AI Playground is a self-hosted web UI that brings together large language models, vector databases, prompt engineering, and MCP (Model Context Protocol) integrations in a single, testable environment. It runs on Spring AI and can connect to a local Ollama instance, OpenAI, and other providers, enabling retrieval-augmented generation (RAG) workflows and MCP-based tool integration. The MCP Playground area lets you experiment with model-context sharing, context windows, and external tool calls in workflows that mirror real MCP scenarios, so you can prototype MCP-enabled Spring AI applications and validate MCP transports and tool usage from a unified UI.

Once running, you can interact with the MCP features from the dedicated MCP Playground area, where you can configure model providers, test context handling, and observe how tools and plugins are invoked by your chosen MCP setup. The environment also includes examples and configurations showing how to switch from Ollama to OpenAI or other providers, and how to plug in vector databases for retrieval-based answers. This makes it a practical sandbox for testing MCP integrations and RAG pipelines end-to-end.

How to install

Prerequisites

  • Java 21+ installed (required for building/running the project)
  • Ollama installed and running if you plan to use local LLMs
  • Docker installed and running (recommended for quick start) or a local Java build setup
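Since the build requires Java 21+, a quick preflight check can save a failed build. This is a minimal sketch; the version parsing assumes the usual `java -version` output format.

```shell
#!/bin/sh
# Preflight sketch: check whether Java 21+ is available before building.
# Parsing assumes output like: openjdk version "21.0.1" (an assumption).
if command -v java >/dev/null 2>&1; then
  JAVA_MAJOR=$(java -version 2>&1 | sed -n 's/.*version "\([0-9][0-9]*\).*/\1/p' | head -n 1)
else
  JAVA_MAJOR=0
fi
if [ "${JAVA_MAJOR:-0}" -ge 21 ]; then
  echo "Java $JAVA_MAJOR found"
else
  echo "Java 21+ not found (got major version ${JAVA_MAJOR:-0})"
fi
```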

Install and Run (Docker, Recommended)

  1. Clone the repository and navigate to the project directory:
git clone https://github.com/spring-ai-community/spring-ai-playground.git
cd spring-ai-playground
  2. Build the Docker image (as described in the README):
./mvnw spring-boot:build-image -Pproduction -DskipTests=true
  3. Start the container via Docker (example from README):
docker run -d -p 8282:8282 --name spring-ai-playground \
-e SPRING_AI_OLLAMA_BASE_URL=http://host.docker.internal:11434 \
-v spring-ai-playground:/home \
--restart unless-stopped \
jmlab/spring-ai-playground:latest
  4. Open the UI at http://localhost:8282
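Because `docker run -d` returns immediately while Spring Boot still takes a few seconds to start, a small smoke test confirms the UI is actually reachable. This is a sketch: the retry count and timeout are arbitrary choices, not from the project docs.

```shell
#!/bin/sh
# Smoke test sketch: poll the playground UI on port 8282 after `docker run`.
# Retry count and per-request timeout are illustrative assumptions.
URL="http://localhost:8282"
STATUS=down
for attempt in 1 2 3; do
  if curl -sf -o /dev/null --max-time 2 "$URL"; then
    STATUS=up
    break
  fi
  sleep 1
done
echo "spring-ai-playground is $STATUS"
```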

Alternative Local Run (without Docker)

  1. Build and run the Spring Boot application locally:
./mvnw clean install -Pproduction -DskipTests=true
./mvnw spring-boot:run
  2. Access the application at http://localhost:8282
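When running locally, the same `SPRING_AI_OLLAMA_BASE_URL` variable used in the Docker example can point the app at your Ollama instance. A minimal sketch, assuming Ollama on its default local port 11434 and that you are inside the cloned repository:

```shell
#!/bin/sh
# Sketch: run locally against an explicit Ollama endpoint. The variable name
# is taken from the Docker example above; localhost:11434 is Ollama's default.
export SPRING_AI_OLLAMA_BASE_URL="http://localhost:11434"
if [ -x ./mvnw ]; then
  ./mvnw spring-boot:run
else
  echo "Run this inside the cloned spring-ai-playground directory"
fi
```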

Additional notes

  • The MCP STDIO transport has known limitations when using Docker: STDIO requires direct process-to-process communication, which containerized environments may not reliably provide. For MCP STDIO testing, consider running the application locally without Docker, as described in the README.
  • Data persists inside the Docker volume spring-ai-playground. If you remove the container, data remains in the volume unless you explicitly remove it.
  • The Ollama base URL must correctly point to the Ollama instance. In Docker, this is often http://host.docker.internal:11434, but you may need to adjust for Linux environments (e.g., use the host IP or --network=host).
  • If you switch to OpenAI or other providers, you will need to update dependencies in pom.xml and adjust configuration accordingly in the application.
  • PWA installation requires completing either the Docker or Local installation steps first; the PWA can then be installed as a standalone app on supported devices.
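The Ollama base URL note above is OS-dependent, since `host.docker.internal` only resolves on Docker Desktop (macOS/Windows). A minimal sketch for choosing a value per host OS; the Linux fallback `172.17.0.1` (the default docker0 bridge gateway) is an assumption you should verify on your machine, or use `--network=host` instead.

```shell
#!/bin/sh
# Sketch: pick an Ollama base URL for the container depending on the host OS.
# 172.17.0.1 is the common default docker0 bridge gateway (an assumption --
# check with `ip addr show docker0`), or run with --network=host on Linux.
case "$(uname -s)" in
  Linux) OLLAMA_URL="http://172.17.0.1:11434" ;;
  *)     OLLAMA_URL="http://host.docker.internal:11434" ;;
esac
echo "Use: -e SPRING_AI_OLLAMA_BASE_URL=$OLLAMA_URL"
```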
