
comfyui_LLM_party

LLM agent framework in ComfyUI. Includes MCP server, Omost, GPT-SoVITS, ChatTTS, GOT-OCR2.0, and FLUX prompt nodes; provides access to Feishu and Discord; and adapts to all LLMs with OpenAI/aisuite-style interfaces, such as o1, Ollama, Gemini, Grok, Qwen, GLM, DeepSeek, Kimi, and Doubao. Also adapted to local LLMs, VLMs, and GGUF models such as Llama 3.3 and Janus-Pro, plus Linkage graphRAG.

Installation
Run this command in your terminal to add the MCP server to Claude Code:
Command
claude mcp add --transport stdio heshengtao-comfyui_llm_party uvx heshengtao-comfyui_llm_party \
  --env BASE_URL="Optional: API base URL for MCP API endpoints (ends with /v1/)"
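
The `claude mcp add` command above registers the server in Claude's MCP settings. If you register it by hand instead, the equivalent entry would look roughly like the following sketch (the server name matches the command above; the BASE_URL value is a placeholder you replace with your own endpoint):

```json
{
  "mcpServers": {
    "heshengtao-comfyui_llm_party": {
      "command": "uvx",
      "args": ["heshengtao-comfyui_llm_party"],
      "env": {
        "BASE_URL": "https://api.openai.com/v1/"
      }
    }
  }
}
```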

How to use

comfyui_LLM_party is a complete MCP-enabled server that provides a collection of ComfyUI-based LLM workflow tools, enabling you to assemble and manage complex LLM-powered pipelines. The project exposes a suite of nodes and workflows that let you call local or API-driven LLMs, integrate with Ollama, manage local models, and connect to external services (SD prompts, image hosting, and more). Use the included workflows to quickly build multi-tool agents, vector-based RAG setups, and graph-style interactions between agents, all within the ComfyUI frontend.

To get started, install the MCP server via the recommended runtime (uvx in this setup), then run the server to register the comfyui_llm_party package. Once running, load the provided workflows into your ComfyUI instance and use the party manager/plugin to install or enable missing nodes as needed.

The server also introduces streaming API outputs and reasoning-content separation for LLM API calls, enhancing observability during long-running requests and enabling more transparent reasoning traces.

How to install

Prerequisites:

  • Python 3.8+ (or the environment required by your selected MCP runtime)
  • Ability to install Python-based MCP runtimes (e.g., uvx, which is bundled with uv)

Installation steps:

  1. Install uv, which provides the uvx runner:
    • If you use pipx: pipx install uv
    • If you prefer pip: pip install uv
  2. Install the comfyui_llm_party MCP package via the MCP runtime:
    • uvx heshengtao-comfyui_llm_party
  3. Start the MCP server (the runtime will resolve and run the specified package and expose the mcp_server endpoints).
  4. Open your MCP control panel or the corresponding UI in ComfyUI to verify the party nodes are registered and install any missing dependencies via the included workflows:
    • Load: workflow/start_with_LLM_api.json
    • Load: workflow/start_with_aisuite.json
    • Load: workflow/start_with_Ollama.json
    • Load: workflow/start_with_llm_local.json
    • Load: workflow/start_with_GGUF.json
    • Load: workflow/start_with_VLM_local.json
    • Load: workflow/start_with_VLM_GGUF.json
    • Load: workflow/start_with_VLM_API_for_SD.json
    • Load: workflow/start_with_ollama_minicpm_for_SD.json
    • Load: workflow/start_with_qwen_vl_local_for_SD.json
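
Before loading the workflows above, you can sanity-check that they are present in your checkout. A minimal stdlib sketch (the workflow/ path relative to the comfyui_LLM_party repository root is an assumption here):

```python
import pathlib

def list_start_workflows(workflow_dir):
    """Return the bundled start_with_*.json workflow files, sorted by name."""
    return sorted(p.name for p in pathlib.Path(workflow_dir).glob("start_with_*.json"))

# Assumed location: the workflow/ folder of a comfyui_LLM_party checkout.
for name in list_start_workflows("workflow"):
    print(name)
```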

Notes:

  • If you are behind a proxy or firewall, configure network settings accordingly so that the runtime can fetch models and download required dependencies.
  • If you are using Ollama or local models, ensure the corresponding services are running and accessible from the MCP runtime.
  • For API-based usage, fill in base_url (ending with /v1/) and api_key in the LLM loader node within ComfyUI.
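
Since the loader expects base_url to end with /v1/, a small helper (hypothetical, not part of the project) shows the normalization you would apply before pasting an endpoint into the node:

```python
def normalize_base_url(url):
    """Coerce an OpenAI-style endpoint to the trailing /v1/ form the loader expects.
    Illustrative helper only; not a comfyui_LLM_party API."""
    url = url.rstrip("/")
    if not url.endswith("/v1"):
        url += "/v1"
    return url + "/"

print(normalize_base_url("https://api.openai.com"))      # https://api.openai.com/v1/
print(normalize_base_url("https://api.openai.com/v1/"))  # unchanged
```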

Additional notes

Tips and common issues:

  • Ensure your environment variables for API access and local model paths are set in the node configurations inside ComfyUI.
  • If you encounter missing-node errors after loading workflows, install the missing nodes via ComfyUI-Manager as indicated in the Quick Start steps.
  • The MCP tool configuration file (mcp_config.json) can be edited to point to different MCP servers or package names if you host forks or alternate builds.
  • The streaming output feature helps monitor long-running API calls in real time; enable it in the API LLM loader node settings.
  • If you switch between local and API-based LLMs, remember to toggle is_ollama accordingly and adjust base_url and api_key fields as needed.
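
The local-vs-API toggle described above can be summarized in a sketch. The field names (is_ollama, base_url, api_key) come from the loader node, but the helper itself is illustrative, and Ollama's default local endpoint (port 11434) is assumed:

```python
def llm_loader_settings(use_ollama, base_url="", api_key=""):
    """Return the field values to set on the API LLM loader node.
    Field names mirror the node; this helper is illustrative only."""
    if use_ollama:
        # Ollama exposes an OpenAI-compatible endpoint locally; the key is
        # not checked by Ollama but the field should not be left empty.
        return {"is_ollama": True,
                "base_url": "http://127.0.0.1:11434/v1/",
                "api_key": "ollama"}
    return {"is_ollama": False, "base_url": base_url, "api_key": api_key}

print(llm_loader_settings(True))
print(llm_loader_settings(False, "https://api.openai.com/v1/", "sk-..."))
```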
