spring-ai-playground

A self-hosted web UI that simplifies AI experimentation and testing for Java developers. It provides playgrounds for all major vector databases and MCP tools, supports intuitive interaction with LLMs, and enables rapid development and testing of RAG workflows, MCP integrations, and unified chat experiences.

Installation
Run this command in your terminal to register the MCP server with Claude Code. Note that the STDIO transport needs the container attached to the client's stdin/stdout, so the command uses -i --rm rather than the detached -d/--restart flags used for the standalone web UI:

claude mcp add --transport stdio jm-lab-spring-ai-playground -- \
  docker run -i --rm -p 8282:8282 --name spring-ai-playground \
  -e SPRING_AI_OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  -v spring-ai-playground:/home \
  jmlab/spring-ai-playground:latest

How to use

Spring AI Playground is a self-hosted web UI that provides an integrated environment for experimenting with large language models, vector databases, prompt engineering, and MCP (Model Context Protocol) integrations. It supports multiple model providers (including Ollama locally and major cloud providers) and includes tooling to test retrieval-augmented generation (RAG) workflows, manage model configurations, and evaluate MCP transports. The MCP playground area lets you explore conversations and data flows that involve external tools and context sharing, making it easier to prototype Spring AI-based applications with richer contextual awareness.

To use it with Docker as described, start the container and open the UI in your browser at the exposed port (default: http://localhost:8282). The UI exposes model selection (Ollama or other providers), embeddings, RAG workflows, and MCP-specific panels that show how context is managed and transported between components. You can configure which chat and embedding models are available, adjust default options, and experiment with different MCP transports to see how context is threaded through prompts and tool calls.
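Before picking a chat or embedding model in the UI, it can help to confirm which models your Ollama instance actually serves. A minimal sketch using Ollama's /api/tags endpoint; the grep/cut parsing is an assumption that works for Ollama's flat single-line JSON response, not a general JSON parser:

```shell
# Print one model name per line from Ollama's /api/tags response.
# Assumes the standard response shape: {"models":[{"name":"..."},...]}
list_models() {
  grep -o '"name":"[^"]*"' | cut -d'"' -f4
}

# Usage (commented out so the sketch stays offline-safe):
# curl -s "${SPRING_AI_OLLAMA_BASE_URL:-http://localhost:11434}/api/tags" | list_models
```

Any model name printed here should then be selectable as a chat or embedding model in the playground UI.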

How to install

Prerequisites:

  • Docker installed and running
  • Optional: Java 21+ if building locally from source

Option A: Run with Docker (recommended)

  1. Clone the repository (optional, for reference):
     git clone https://github.com/JM-Lab/spring-ai-playground.git
     cd spring-ai-playground
  2. Build the Docker image and run the container (as shown in the Quick Start):
     ./mvnw spring-boot:build-image -Pproduction -DskipTests=true \
       -Dspring-boot.build-image.imageName=jmlab/spring-ai-playground:latest
     docker run -d -p 8282:8282 --name spring-ai-playground \
       -e SPRING_AI_OLLAMA_BASE_URL=http://host.docker.internal:11434 \
       -v spring-ai-playground:/home \
       --restart unless-stopped \
       jmlab/spring-ai-playground:latest
  3. Open http://localhost:8282 in your browser.
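Since docker run -d returns immediately while the Spring Boot app still takes a few seconds to start, a small poll loop can tell you when the UI is actually reachable. A sketch; the URL and retry count are assumptions to adjust for your setup:

```shell
# Poll a URL until it answers or the retry budget runs out.
# Prints "up" on success, "down" on timeout.
wait_for_ui() {
  url="$1"; tries="${2:-30}"
  i=0
  while [ "$i" -lt "$tries" ]; do
    if curl -fsS -o /dev/null "$url"; then
      echo "up"; return 0
    fi
    i=$((i + 1)); sleep 1
  done
  echo "down"; return 1
}

# Usage: wait_for_ui "http://localhost:8282" && echo "ready to open in browser"
```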

Option B: Run locally from source (without Docker)

  1. Ensure Java 21+ is installed.
  2. Build the project: ./mvnw clean install -Pproduction -DskipTests=true
  3. Run the application locally: ./mvnw spring-boot:run
  4. Open http://localhost:8282 in your browser.

Additional notes

Environment variables and configuration options:

  • SPRING_AI_OLLAMA_BASE_URL must point to your Ollama instance when using Ollama locally or in Docker networking scenarios. The recommended value for Docker runs is http://host.docker.internal:11434, but you may need to adjust for Linux hosts (e.g., using the host IP or --network=host).
  • The MCP transport (STDIO) may require running outside of containers for reliable direct process-to-process communication. If testing MCP STDIO transport, consider running locally instead of in Docker.
  • Data persists in a Docker volume (spring-ai-playground in the example); back up the volume if the data matters. When running locally, data is stored under your project directory unless configured otherwise.
  • If you modify model sources (e.g., switching from Ollama to OpenAI), update dependencies in pom.xml accordingly as described in the OpenAI switching notes in the README.
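The Linux adjustment mentioned above can be sketched as a small helper that picks a default SPRING_AI_OLLAMA_BASE_URL per host OS. The bridge address 172.17.0.1 is an assumption (Docker's default docker0 gateway) and may differ on your system; --network=host with http://localhost:11434 is the alternative:

```shell
# Pick a base URL for reaching a host-side Ollama from inside a container.
# host.docker.internal works on Docker Desktop (macOS/Windows); on plain
# Linux the docker0 bridge gateway (assumed 172.17.0.1) is a common fallback.
ollama_base_url() {
  case "$1" in
    Linux) echo "http://172.17.0.1:11434" ;;
    *)     echo "http://host.docker.internal:11434" ;;
  esac
}

# Usage: docker run ... -e SPRING_AI_OLLAMA_BASE_URL="$(ollama_base_url "$(uname -s)")" ...
```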
