
mcts

A Bayesian MCTS Model Context Protocol (MCP) server that lets Claude drive Ollama-hosted local models for advanced Monte Carlo Tree Search and analysis.

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio angrysky56-mcts-mcp-server uv run --directory /path/to/mcts-mcp-server/src/mcts_mcp_server server.py \
  --env PYTHONPATH="/path/to/mcts-mcp-server"
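If you use Claude Desktop rather than Claude Code, the equivalent stdio server entry in claude_desktop_config.json would look roughly like this. This is a sketch: adjust both paths to your local checkout and verify the exact shape against the repository README.

```json
{
  "mcpServers": {
    "mcts-mcp-server": {
      "command": "uv",
      "args": [
        "run",
        "--directory",
        "/path/to/mcts-mcp-server/src/mcts_mcp_server",
        "server.py"
      ],
      "env": {
        "PYTHONPATH": "/path/to/mcts-mcp-server"
      }
    }
  }
}
```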

How to use

This MCP server exposes an advanced Bayesian Monte Carlo Tree Search (MCTS) engine for AI-assisted analysis. Designed to work with Claude and similar LLMs, it supports multi-iteration analysis, Bayesian evaluation, Thompson sampling or UCT for node selection, and multiple LLM providers (Ollama, OpenAI, Anthropic, Google Gemini). The server persists state across turns, classifies thoughts into philosophical approaches, and surfaces surprising directions to help you explore topics deeply.

To use it, deploy the MCP server and connect via the MCP protocol. The system initializes automatically and runs MCTS across multiple iterations, with several simulations per iteration, returning the best analysis found during the search. When you request deep analysis, Claude drives the MCTS process, iterates through the simulations, and synthesizes the final insights using the provided tooling.
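The two node-selection strategies mentioned above can be sketched in a few lines of Python. This is an illustration of the general technique, not the server's actual implementation: the `Node` class and helper names are hypothetical, and the Beta-posterior update is one common way to make MCTS Bayesian.

```python
import math
import random


class Node:
    """A search-tree node holding a Beta(alpha, beta) posterior over its value."""

    def __init__(self):
        self.alpha = 1.0   # prior pseudo-successes
        self.beta = 1.0    # prior pseudo-failures
        self.visits = 0
        self.children = []

    def update(self, reward):
        """Bayesian update: treat a reward in [0, 1] as a soft Bernoulli outcome."""
        self.alpha += reward
        self.beta += 1.0 - reward
        self.visits += 1


def select_child_thompson(node):
    """Thompson sampling: pick the child with the highest posterior draw."""
    return max(node.children, key=lambda c: random.betavariate(c.alpha, c.beta))


def select_child_uct(node, exploration_weight=1.41):
    """UCT: mean reward plus an exploration bonus for rarely visited children."""
    def score(child):
        if child.visits == 0:
            return float("inf")  # explore unvisited children first
        mean = child.alpha / (child.alpha + child.beta)
        bonus = exploration_weight * math.sqrt(math.log(node.visits) / child.visits)
        return mean + bonus

    return max(node.children, key=score)
```

Thompson sampling naturally favors children whose posteriors are both promising and uncertain, while UCT makes the exploration/exploitation trade-off explicit through the exploration weight.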

How to install

Prerequisites:

  • Python 3.10+ installed
  • Access to a shell/terminal
  • UV (Astral UV) recommended for dependency resolution and speed

Installation steps:

  1. Clone the repository (replace with the actual repo URL):
git clone https://github.com/angrysky56/mcts-mcp-server.git
cd mcts-mcp-server
  2. Install UV if not already installed (preferred method):
# Install UV globally
curl -fsSL https://astral.sh/uv/install.sh | bash
  3. Set up and install dependencies using UV:
# Create and activate a virtual environment using UV
uv venv .venv
source .venv/bin/activate

# Install required Python packages
uv pip install -r requirements.txt
  4. Run the setup script (if a setup.sh is provided):
./setup.sh
  5. Alternatively, run the server via UV directly (example):
uv run --directory /path/to/mcts-mcp-server/src/mcts_mcp_server server.py

Notes:

  • Ensure your environment variables (e.g., API keys) are set in a .env file at the repository root as described in the README.
  • If you prefer a manual setup without UV, you can install dependencies with your system Python and pip, but UV is recommended for dependency resolution.
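For reference, a minimal .env at the repository root might look like the following. The key names come from the configuration notes in this document; all values are placeholders, and the provider default shown is purely illustrative.

```
# LLM provider credentials (set only the ones you use)
OPENAI_API_KEY=...
ANTHROPIC_API_KEY=...
GEMINI_API_KEY=...

# Optional defaults (names per the configuration notes; values illustrative)
DEFAULT_LLM_PROVIDER=ollama
DEFAULT_MODEL_NAME=...
```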

Additional notes

Environment and configuration tips:

  • API keys for LLM providers should reside in a .env file at the repository root. Example keys include OPENAI_API_KEY, ANTHROPIC_API_KEY, and GEMINI_API_KEY.
  • You can customize the MCTS run via the provided tooling (initialize_mcts, run_mcts, set_active_llm, etc.). Parameters like max_iterations, simulations_per_iteration, and exploration_weight govern performance and depth of search; adjust them to balance speed and thoroughness.
  • If Claude Desktop integration is used, update the paths in claude_desktop_config.json to reflect your local server location.
  • For persistence across turns, ensure the state directory specified by the server is writable and backed up if needed.
  • If you encounter issues with model providers, verify the .env defaults (DEFAULT_LLM_PROVIDER, DEFAULT_MODEL_NAME) and the provider-specific configuration in your session.
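To sanity-check that a .env file is being picked up at all, a minimal stdlib-only loader can be sketched as follows. This is illustrative only; the server likely has its own loading mechanism, and `load_env` is a hypothetical helper.

```python
import os


def load_env(path=".env"):
    """Minimal .env parser: KEY=VALUE lines; '#' comments and blanks ignored."""
    values = {}
    try:
        with open(path) as fh:
            for raw in fh:
                line = raw.strip()
                if not line or line.startswith("#") or "=" not in line:
                    continue
                key, _, value = line.partition("=")
                values[key.strip()] = value.strip().strip('"')
    except FileNotFoundError:
        pass  # no .env present; real environment variables may still be set
    return values


# Real environment variables take precedence over the .env file.
settings = {**load_env(), **os.environ}
provider = settings.get("DEFAULT_LLM_PROVIDER", "ollama")  # fallback is illustrative
```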
