liberty
Demo of Liberty serving as an MCP Server
To register the server with Claude Code as a local (stdio) MCP server:
claude mcp add --transport stdio mbroz2-liberty-mcp-server -- ./mvnw -f mcp-liberty-server/pom.xml liberty:dev
How to use
Liberty acts as an MCP server that exposes a weather forecast tool (getForecast), which MCP clients such as the Quarkus-based chatbot can invoke. The overall flow: the client uses an LLM to interpret user queries and, when weather data is required, calls the Liberty MCP tool, which fetches forecasts from the Open-Meteo API. The Liberty server runs locally (on port 9080 by default), and the chat UI is served by the Quarkus client on port 8080. To use the demo, start the Liberty server, then the Quarkus client, and ask weather questions in the chat UI. The client invokes the MCP tool transparently whenever weather data is needed, and the LLM formats the final answer for the user.
How to install
Prerequisites before you begin:
- Java 17+ installed on your machine
- Maven 3.8.1+ (or rely on the provided Maven wrapper in the repository)
- Ollama installed or access to an OpenAI API key
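Before building, you can confirm the toolchain from a terminal (plain mvn shown for the version check; the repository's ./mvnw wrapper reports the same information):

```shell
# Print the active JDK and Maven versions; Java should report 17 or newer.
java -version 2>&1 | head -n 1
mvn -version 2>&1 | head -n 1
```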
Installation steps:
- Clone the repository (or open the project in your IDE):
  git clone <repo-url>
  cd <repo-root>
- Ensure the Maven wrapper is executable (if it is not already):
  chmod +x mvnw
- Build and run the Liberty MCP Server (weather tool). From the repository root, using the Maven wrapper:
  ./mvnw -f mcp-liberty-server/pom.xml clean package
  Then start the Liberty server in dev mode:
  ./mvnw -f mcp-liberty-server/pom.xml liberty:dev
- Build and run the Quarkus MCP Client (chatbot UI):
  ./mvnw -f mcp-client/pom.xml clean package
  ./mvnw -f mcp-client/pom.xml quarkus:dev
- Verify:
- Liberty server should start on port 9080
- Quarkus client should start on port 8080
- Open http://localhost:8080/ to access the chat UI and test weather questions.
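Once both processes are up, the verification steps above can be sanity-checked from another terminal. The script below prints a status line for each endpoint whether or not it is reachable (the root paths are assumptions; adjust if your context roots differ):

```shell
# Probe the Liberty server (9080) and the Quarkus chat UI (8080).
for url in http://localhost:9080/ http://localhost:8080/; do
  if curl -s -o /dev/null --max-time 5 "$url"; then
    echo "reachable:     $url"
  else
    echo "not reachable: $url"
  fi
done
```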
Notes:
- If you prefer Docker, you can containerize the Liberty server and run with Docker, but this guide uses the provided Maven-based run flow.
- If you don’t have Ollama, you can use OpenAI’s API key for the LLM step in the client.
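The Ollama-vs-OpenAI choice is typically made in the client's configuration rather than in code. As a hedged sketch only — the property names below follow common quarkus-langchain4j conventions and may differ from this repository's actual application.properties:

```properties
# mcp-client/src/main/resources/application.properties (illustrative)
# Option A: local Ollama (the Ollama service must be running); model name is an example.
quarkus.langchain4j.ollama.chat-model.model-id=llama3
# Option B: OpenAI (requires an API key, e.g. via an environment variable)
# quarkus.langchain4j.openai.api-key=${OPENAI_API_KEY}
```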
Additional notes
Tips and common issues:
- Ensure Java 17+ is the active JDK in your environment to avoid compilation/runtime issues.
- If you see port conflicts, stop any other services using ports 8080 or 9080, or reconfigure the client/server ports.
- The Liberty server calls the Open-Meteo API to fetch weather data; ensure outbound access is allowed in your network configuration.
- When using Ollama, verify the Ollama service is running and accessible to the Quarkus client for LLM processing.
- The MCP tool in this demo is getForecast; ensure the client’s MCP configuration points to the Liberty server endpoint as defined by your environment.
- If you modify any pom.xml or plugin versions, re-run the build to ensure all components compile against the updated configuration.
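For the MCP-configuration tip, pointing the client at the Liberty endpoint is also a configuration concern. A hedged sketch of what that wiring often looks like with the quarkus-langchain4j MCP client — the property names, the "weather" client name, and the /mcp/sse path are all assumptions, so check the client's actual configuration:

```properties
# mcp-client/src/main/resources/application.properties (illustrative)
quarkus.langchain4j.mcp.weather.transport-type=http
quarkus.langchain4j.mcp.weather.url=http://localhost:9080/mcp/sse
```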
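For the port-conflict tip above, a quick way to see whether 8080 or 9080 is already taken (uses lsof, which may need to be installed; ss -ltn is a common alternative):

```shell
# Report whether anything is already listening on the demo's two ports.
for p in 8080 9080; do
  if lsof -nP -iTCP:"$p" -sTCP:LISTEN >/dev/null 2>&1; then
    echo "port $p is in use"
  else
    echo "port $p is free"
  fi
done
```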