ibm-odm-decision
An MCP server that integrates with the IBM Decision Server Runtime to retrieve and invoke decision services.
claude mcp add --transport stdio decisionsdev-ibm-odm-decision-mcp-server uvx --from git+https://github.com/DecisionsDev/ibm-odm-decision-mcp-server start --url http://localhost:9060/res
How to use
This MCP server exposes IBM ODM decisions (rulesets) as tools for AI assistants and orchestration platforms. With the Decision MCP Server you can store ODM resources locally and configure authentication to securely access the RES console and the Decision Server Runtime. The server is designed to integrate with Watson Orchestrate, Claude Desktop, Cursor AI, and other automation tools, allowing decisions to be invoked as tools within workflows. With the built-in storage and flexible authentication options (Zen API Key, Basic Auth, OpenID Connect), you can centrally manage decision logic and expose it to end users and bots as reusable capabilities.
To use it, start the MCP server and point your client (AI assistant or workflow orchestrator) to the server URL (for example, http://localhost:9060/res). The server handles authentication according to your environment (Cloud Pak, Kubernetes/OpenShift, or Docker/local) and supports separate authentication schemes for the RES console and the Decision Server Runtime, provided both accept the configured credentials. When integrated with Claude Desktop or Watson Orchestrate, ODM decisions appear as tools that can be invoked with specific parameters, enabling dynamic decision execution within larger automation pipelines.
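For Claude Desktop specifically, the claude mcp add command shown above corresponds to an entry like the following in claude_desktop_config.json. This is a sketch: the "ibm-odm-decision" key is an arbitrary name chosen here, and the URL should match your actual deployment.

```json
{
  "mcpServers": {
    "ibm-odm-decision": {
      "command": "uvx",
      "args": [
        "--from",
        "git+https://github.com/DecisionsDev/ibm-odm-decision-mcp-server",
        "start",
        "--url",
        "http://localhost:9060/res"
      ]
    }
  }
}
```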
How to install
Prerequisites:
- Python 3.13 or higher
- uv (see installation options below)
Option A: Install and run with uv (recommended)
- Install uv:
  - macOS/Linux: curl -LsSf https://astral.sh/uv/install.sh | sh
  - Windows: powershell -c "irm https://astral.sh/uv/install.ps1 | iex"
  - Alternative (via pip): pip install uv
- Run the MCP server directly from GitHub using uvx:
  uvx --from git+https://github.com/DecisionsDev/ibm-odm-decision-mcp-server start --url http://localhost:9060/res
  The uvx command automatically downloads and installs the package, resolves its dependencies, and starts the server.
Option B: Install via uv from a local environment
- Ensure uv is installed (see Option A).
- Install the package from GitHub and start as shown above.
Prerequisites for ODM integration depend on your deployment environment (Cloud Pak for Business Automation, Kubernetes/OpenShift, or Docker/Local). Review the Configuration section for authentication methods appropriate to your environment.
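To make the integration concrete, the following sketch shows what a Basic Auth call against the Decision Server Runtime looks like from Python. It is an illustration, not part of this project: the /DecisionService/rest/v1/<rulesetPath> endpoint shape and the odmAdmin credentials are assumptions based on common ODM defaults, so adjust both to your deployment.

```python
import base64
import json
from urllib import request


def basic_auth_header(user: str, password: str) -> str:
    """Build an HTTP Basic Auth header value for the RES console or runtime."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return f"Basic {token}"


def invoke_ruleset(host: str, ruleset_path: str, payload: dict,
                   user: str, password: str) -> dict:
    """POST an execution request to the Decision Server Runtime REST API.

    The URL layout below follows the usual ODM hosted transparent decision
    service convention (an assumption here); check your own server's
    execution endpoint before relying on it.
    """
    url = f"{host}/DecisionService/rest/v1/{ruleset_path}"
    req = request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": basic_auth_header(user, password),
        },
        method="POST",
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())
```

The same Basic Auth header construction applies whether you call the runtime directly, as above, or let the MCP server do it on your behalf.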
Additional notes
Tips and common considerations:
- Authentication options vary by deployment: Zen API Key for Cloud Pak, Basic Auth or OpenID Connect (with Client Secret or PKJWT) for Kubernetes/Cloud deployments. You can configure both RES Console and Runtime authentication types if needed, but both must be supported by the chosen credentials.
- When using OpenID Connect, you may provide CLIENT_ID, CLIENT_SECRET, TOKEN_URL (and optionally SCOPE) via CLI or environment variables (see the Configuration section in the README).
- The server exposes ODM decisions as tools; you can pass parameters as part of the tool invocation in your automation workflow.
- If you run in a docker/local environment, ensure network access between the MCP server, ODM RES Console, and the Decision Server Runtime.
- Ensure the ODM resources (decisions/rulesets) you intend to expose are accessible to the MCP server's configured credentials.
- The example URL http://localhost:9060/res is a common default; adjust to your actual deployment URL if different.
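As a concrete illustration of the OpenID Connect tip above, the credentials can be supplied as environment variables before launching the server. This is a config sketch: the variable names follow the CLIENT_ID / CLIENT_SECRET / TOKEN_URL / SCOPE naming mentioned in the README, and all values are placeholders; check the Configuration section for the exact names and flags in your version.

```shell
# OpenID Connect credentials for the MCP server (placeholder values)
export CLIENT_ID="my-odm-client"
export CLIENT_SECRET="change-me"
export TOKEN_URL="https://idp.example.com/oauth2/token"
export SCOPE="openid"   # optional

# Then start the server as in the install instructions
uvx --from git+https://github.com/DecisionsDev/ibm-odm-decision-mcp-server \
  start --url http://localhost:9060/res
```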