sindi-ai
Java Implementation of Anthropic's Model Context Protocol Server
claude mcp add --transport stdio sinditech-sindi-ai-mcp-server -- \
  docker run -i --env MCP_ENV="placeholder or description if needed" \
  sindi-ai-mcp-server:latest
How to use
The sindi-ai MCP server provides a Java-based implementation of the Model Context Protocol (MCP) for integrating AI model interactions within Jakarta EE applications. It exposes MCP features such as Tools, Prompts, Resources, and Resource Templates, allowing you to register these capabilities in your server via simple POJOs or CDI-managed beans. Depending on your choice of transport, the server can be exposed over REST (Jakarta REST) or Servlet (Jakarta Servlet) endpoints, including support for the MCP SSE and Streamable HTTP transports. You can inspect and test these features using the MCP Inspector client after starting the server.
To use this server, first choose a transport runtime to include in your application (do not mix the REST and Servlet runtimes in the same deployment, as their endpoints can conflict). Then define your MCP feature services by annotating methods with @Tool, @ToolArgument, @Prompt, @PromptArgument, @Resource, or @ResourceTemplate as appropriate. Tool methods return String results, Prompt methods can return a PromptMessage or PromptMessage[], and Resource methods produce a ResourceContents[] for binary or text content. Additionally, you can register feature managers (ToolManager, PromptManager, ResourceManager) to programmatically manage features, or inject MCPLogger and MCPContext for enhanced client-session interactions.
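The annotation-driven model described above can be sketched as a plain POJO. The annotation definitions below are minimal stand-ins (only their names come from this README; their attributes are assumptions) so the sketch compiles standalone — in a real deployment you would import the real annotations from the sindi-ai feature modules instead.

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

public class WeatherService {

    // Stand-ins for the sindi-ai annotations; attribute names are illustrative.
    @Retention(RetentionPolicy.RUNTIME) @Target(ElementType.METHOD)
    @interface Tool { String name() default ""; String description() default ""; }

    @Retention(RetentionPolicy.RUNTIME) @Target(ElementType.PARAMETER)
    @interface ToolArgument { String name() default ""; }

    // Tool methods return String results, per the contract described above.
    @Tool(name = "forecast", description = "Returns a toy forecast for a city")
    public String forecast(@ToolArgument(name = "city") String city) {
        return "Sunny in " + city;
    }
}
```

In a real service the class would typically be a CDI-managed bean, and the runtime would discover the annotated methods and expose them as MCP Tools over the chosen transport.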
When you run the bundled examples, you’ll typically have two entry points: a REST-based server for RESTful MCP transport and a Servlet-based server for servlet transport. The MCP Inspector client (npx @modelcontextprotocol/inspector) can be launched from a separate terminal to connect to your running server and test the features via SSE (http://localhost:9080/sse) or Streamable HTTP (http://localhost:9080/mcp).
How to install
Prerequisites
- Java Development Kit (JDK 11 or newer)
- Maven (mvn)
- Optional: Docker if you plan to run a pre-built image
Step 1: Clone the repository
git clone <repository-url>
cd sindi-ai-mcp-server
Step 2: Build the project
mvn clean package -e
Step 3: Choose a transport runtime and run an example
- REST runtime:
cd examples/sindi-ai-mcp-server-rest
mvn clean liberty:dev -e
- Servlet runtime:
cd examples/sindi-ai-mcp-server-servlet
mvn clean liberty:dev -e
Step 4: Run MCP Inspector to test
npx -y @modelcontextprotocol/inspector@latest
If you prefer a Docker-based start, ensure you have a built image and run:
docker run -i sindi-ai-mcp-server:latest
Step 5: Connect inspector to the server
- SSE: http://localhost:9080/sse
- Streamable HTTP: http://localhost:9080/mcp
Note: Do not mix REST and Servlet runtimes in the same deployment as there can be endpoint conflicts.
Additional notes
Tips and common considerations:
- Only enable one transport runtime in a single deployment to avoid endpoint conflicts.
- The MCP features rely on proper annotations (@Tool, @ToolArgument, @Prompt, @PromptArgument, @Resource, @ResourceTemplate). Ensure your feature methods return the expected types (String for tools, PromptMessage(s) for prompts, ResourceContents[] for resources).
- If you need programmatic control, inject the ToolManager, PromptManager or ResourceManager to register features at runtime.
- For production deployments, consider configuring logging via MCPLogger and ensure your environment provides appropriate security for client sessions.
- When testing locally, the MCP Inspector will connect to the SSE endpoint at /sse or the Streamable HTTP endpoint at /mcp. Ensure those endpoints are exposed by your runtime configuration (REST or Servlet).
- Check the documentation for module-specific setup: the core modules (spi, runtime, features) and the two runtime implementations (runtime-rest and runtime-servlet) have separate configuration options and examples.
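The programmatic route mentioned above (injecting a ToolManager to register features at runtime) can be sketched as follows. The real ToolManager interface is not documented here, so the register/call shape below is an assumption for illustration only — consult the sindi-ai runtime module for the actual API; only the manager name comes from this README.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

public class ProgrammaticTools {

    // Minimal stand-in for the injected ToolManager; the real API will differ.
    static class ToolManager {
        private final Map<String, Function<String, String>> tools = new HashMap<>();

        // Register a named tool whose handler maps a String argument to a String result.
        void register(String name, Function<String, String> handler) {
            tools.put(name, handler);
        }

        // Invoke a registered tool by name.
        String call(String name, String arg) {
            return tools.get(name).apply(arg);
        }
    }

    public static void main(String[] args) {
        ToolManager manager = new ToolManager(); // normally obtained via CDI injection
        manager.register("echo", input -> "echo: " + input);
        System.out.println(manager.call("echo", "hello"));
    }
}
```

The point of the sketch is the pattern: rather than annotating methods, you hand the manager a name and a handler at runtime, which is useful when the set of tools is only known dynamically (e.g. loaded from configuration).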