quarkus-basic-tool

MCP server from burrsutter/quarkus-mcp-server-basic-tool

Installation

Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio burrsutter-quarkus-mcp-server-basic-tool \
  --env OPENAI_API_KEY="your_openai_api_key_here" \
  -- jbang --java 21 --quiet org.acme:basic-tool:1.0.0-SNAPSHOT:runner

How to use

This MCP server provides a basic tool runner built from the Quarkus-based MCP server example. It is launched via the jbang entry point org.acme:basic-tool:1.0.0-SNAPSHOT:runner and is designed to work together with the MCP CLI (mcp-cli) and the uv runtime for orchestration. To run it, install the prerequisites listed below, configure your OpenAI API key, and start the server, which is named burrhello. The server then carries out the tasks and workflows defined by the basic-tool runner, with OpenAI models available through the provider integration configured in your mcp-cli setup.

After starting, you can manage and interact with the MCP server through mcp-cli and the uv runtime, provisioning, syncing, and running tasks against the burrhello server with the configured OpenAI model (e.g., gpt-4o-mini).
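The step list below covers installation in detail; as a sketch, the final invocation can be wrapped in a tiny helper so the same command works for other server definitions (the mcp_cmd helper and its arguments are illustrative, not part of the repository):

```shell
# mcp_cmd prints the mcp-cli invocation for a given server and model,
# so the command can be inspected before running it
# (e.g. via `eval "$(mcp_cmd burrhello gpt-4o-mini)"`).
mcp_cmd() {
  printf 'uv run mcp-cli --server %s --provider openai --model %s\n' "$1" "$2"
}

# For the burrhello server from this guide:
mcp_cmd burrhello gpt-4o-mini
# → uv run mcp-cli --server burrhello --provider openai --model gpt-4o-mini
```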

How to install

Prerequisites:

  • Java 21 JDK (or a compatible JDK for the jbang runner)
  • Maven wrapper (mvnw) if you’re building from source
  • jbang (for running the tool via the provided entry point)
  • uv (for the MCP runtime/CLI interactions)
  • Git

Installation steps:

  1. Install jbang (if not already installed):

    • curl -Ls https://sh.jbang.dev | bash -s - app setup
  2. Install uv (if not already installed):

    • macOS/Linux: brew install uv
    • or use the official installer: curl -LsSf https://astral.sh/uv/install.sh | sh
  3. Clone the MCP server repository and build the project (example uses Maven wrapper):

    • git clone <repository-url>
    • cd <project-directory>
    • ./mvnw install
  4. Install and configure the MCP CLI (mcp-cli) if you haven’t:

  5. Set your OpenAI API key (required for the OpenAI provider integration):

    • export OPENAI_API_KEY=your_openai_api_key
  6. Configure the MCP server in mcp_config.json (example shown in the README):

    • Ensure the mcpServers section contains the burrhello server with the jbang runner
  7. Run the MCP server via uv using the configured server name:

    • uv run mcp-cli --server burrhello --provider openai --model gpt-4o-mini
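Step 6 above references mcp_config.json; a sketch of what the burrhello entry might look like is shown below. The field names follow the common mcpServers convention — check the repository README for the authoritative example, and replace the placeholder key with your own:

```json
{
  "mcpServers": {
    "burrhello": {
      "command": "jbang",
      "args": ["--java", "21", "--quiet", "org.acme:basic-tool:1.0.0-SNAPSHOT:runner"],
      "env": {
        "OPENAI_API_KEY": "your_openai_api_key_here"
      }
    }
  }
}
```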

Additional notes

  • Ensure OPENAI_API_KEY is exported in the environment where uv and the MCP server run, as the provider relies on this key for API calls.
  • The burrhello server uses jbang to execute the Quarkus-based basic-tool runner. If you encounter classpath or Java version issues, verify that your Java version matches the one used to build the tool (Java 21 in this example).
  • If you repackage or update the basic-tool, remember to adjust the version in the Maven coordinates (org.acme:basic-tool:1.0.0-SNAPSHOT:runner).
  • When using mcp-cli with OpenAI, ensure the model you pass (e.g., gpt-4o-mini) is available under your OpenAI subscription and provider configuration.
  • If you see connectivity or runtime errors from uv, check your environment variables, network access to OpenAI, and that the burrhello server definition in mcp_config.json matches the name passed to mcp-cli.
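As a minimal pre-flight check for the first and last points, the following reports whether the key is visible to the current shell before you launch anything (the check itself is an illustrative sketch):

```shell
# Report whether OPENAI_API_KEY is visible in the current environment;
# run this in the same shell that will start uv / mcp-cli.
if [ -n "${OPENAI_API_KEY:-}" ]; then
  key_status="set"
else
  key_status="missing"
fi
echo "OPENAI_API_KEY is $key_status"
```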
