quarkus-workshop-langchain4j
Quarkus LangChain4j Workshop

Add this server to Claude with:

claude mcp add --transport stdio quarkusio-quarkus-workshop-langchain4j docker run -i quarkus-workshop-langchain4j:latest
How to use
This MCP server hosts the Quarkus LangChain4j workshop: a Java application that demonstrates building AI-infused applications with Quarkus and LangChain4j. The workshop is organized into steps, each presenting a concrete scenario or integration (such as connecting LangChain4j to a reasoning or embedding service) and providing runnable code in its step directory. The application runs locally on port 8080. To work through the workshop, start the server, explore the step-XX folders in order, and interact with the exposed endpoints to view the workshop content and the built-in tooling.
How to install
Prerequisites:
- Docker installed and running
- Basic familiarity with the command line
- Optional: Git to clone the repository
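The prerequisites above can be checked from the command line before starting. A minimal sketch (it only tests that the tools are on the PATH, not their versions):

```shell
#!/bin/sh
# Report whether each prerequisite tool is available on the PATH.
for tool in docker git; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: missing (see prerequisites)"
  fi
done
```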
Installation steps:
- Clone the repository (or download it):
  git clone https://github.com/quarkusio/quarkus-workshop-langchain4j.git
  cd quarkus-workshop-langchain4j
- Build and run the workshop using Docker (preferred if a prebuilt image is available):
  - Ensure Docker is running
  - Start the workshop container: docker run -i quarkus-workshop-langchain4j:latest
- Alternative: if you have the project locally and want to run with Maven (not via Docker):
  - Ensure JDK 17+ is installed
  - Use the Maven wrapper included in the project: ./mvnw quarkus:dev
  - The application will be accessible at http://localhost:8080
Note: If you choose to run locally, you may need to configure environment variables for LangChain4j integration or external services as described in the step documentation.
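As a sketch of such configuration, an OpenAI-backed setup might look like this in src/main/resources/application.properties. The property names follow the quarkus-langchain4j-openai extension; treat the exact keys and model name as assumptions and confirm them against the step documentation:

```properties
# Hypothetical example: wire LangChain4j to OpenAI via an environment variable
quarkus.langchain4j.openai.api-key=${OPENAI_API_KEY}
# Model selection (assumption: your step may prescribe a different model)
quarkus.langchain4j.openai.chat-model.model-name=gpt-4o-mini
```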
Additional notes
Tips and considerations:
- The workshop is organized into step-XX directories. Each step contains its own runnable state and instructions.
- The default running port is 8080; if you need to change it, adjust the server configuration in the step docs or via environment variables as described.
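For example, Quarkus maps the quarkus.http.port property to the QUARKUS_HTTP_PORT environment variable (standard Quarkus convention; verify against the step docs for any step-specific overrides):

```shell
# Start the dev server on port 9090 instead of the default 8080.
QUARKUS_HTTP_PORT=9090
export QUARKUS_HTTP_PORT
echo "dev server will bind to port ${QUARKUS_HTTP_PORT}"
```

Run ./mvnw quarkus:dev in the same shell afterwards so the exported variable is picked up.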
- If you encounter missing dependencies or connection issues with external AI services, verify network access and proxy settings.
- For Docker runs, ensure you have permission to pull images from the registry and that there’s sufficient disk space for the image.
- The LangChain4j integration relies on the appropriate AI model endpoints; check the step docs for required API keys or local mock endpoints.
- When updating the workshop, you may want to rebuild the Docker image or re-run via Maven to pick up changes in the step directories.