Hegelion
Dialectical reasoning architecture for LLMs (Thesis → Antithesis → Synthesis)
claude mcp add --transport stdio hmbown-hegelion python -m hegelion.server
How to use
Hegelion is a prompt-driven MCP server that applies dialectical reasoning to LLMs, forcing the model to argue with itself before reaching a conclusion. It supports two modes: Dialectical Reasoning (thesis → antithesis → synthesis) for deep analysis, and Autocoding (Player → Coach → Iterate) for structured, verified code generation.
In MCP-enabled editors such as Claude Desktop, Cursor, or VS Code, you can trigger Hegelion prompts directly through MCP commands, or run the included server via the CLI. The server exposes a prompt-driven Python API and a CLI entry point (hegelion-server) for quick self-tests and interactions. From an editor, you can invoke autocode turns or dialectical prompts to iteratively refine outputs; in the Autocoding workflow, the Coach role provides independent verification, which improves reliability and reduces context pollution.
To use it, install the Python package and connect it to your MCP-enabled editor. The README describes the available MCP tools:
- dialectic — unified dialectical reasoning (modes: single_shot, workflow, thesis, antithesis, synthesis)
- autocode — autocoding entry point (modes: init, workflow, single_shot)
- autocode_turn — one-turn execution for the player, coach, and advance roles
- autocode_session — save and load sessions
You can also run a health check with hegelion-server --self-test to confirm the server and its tooling are functional. The project additionally includes a Codex skill that mirrors the /hegelion command and leverages MCP tools when available.
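As an illustration of how an MCP client might call the dialectic tool, here is a hedged sketch of a tool-call payload. The tool name and mode values come from the tool list above; the argument name ("question") and overall schema are assumptions, since the README does not spell out the tool's input shape:

```json
{
  "name": "dialectic",
  "arguments": {
    "mode": "single_shot",
    "question": "Should our team adopt trunk-based development?"
  }
}
```

In practice your MCP client constructs this payload for you when you invoke the tool from the editor.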
If you’re integrating into an editor, use the provided MCP setup commands (e.g., hegelion-setup-mcp --host claude-desktop or --host vscode) to connect the MCP client to the Hegelion server. You can also run the Python API directly to generate prompts or explore the dialectical prompt builder utilities, such as create_single_shot_dialectic_prompt, for custom workflows.
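If your client has no setup helper, you can register the server manually. Below is a sketch of a Claude Desktop claude_desktop_config.json entry, assuming the stdio launch command shown above (python -m hegelion.server); the exact file location and accepted fields depend on your client, so treat this as illustrative rather than definitive:

```json
{
  "mcpServers": {
    "hegelion": {
      "command": "python",
      "args": ["-m", "hegelion.server"]
    }
  }
}
```

The hegelion-setup-mcp helper is intended to write an equivalent entry for you.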
How to install
Prerequisites:
- Python 3.10+ installed on your system
- Internet access to install Python packages from PyPI
Steps:
1. Create and activate a Python virtual environment (optional but recommended):
   python -m venv venv
   source venv/bin/activate   # Unix or macOS
   venv\Scripts\activate      # Windows
2. Install Hegelion from PyPI:
   pip install hegelion
3. (Optional) Set up MCP integration for your editor:
   hegelion-setup-mcp --host claude-desktop
   hegelion-setup-mcp --host cursor
   hegelion-setup-mcp --host vscode
   hegelion-setup-mcp --host windsurf
4. Run the Hegelion server for local testing:
   hegelion-server --self-test
5. Start using the server via your MCP-enabled editor or via the Python API, as described in the repository documentation.
Additional notes
- The server supports two primary modes: Dialectical Reasoning and Autocoding. Dialectical Reasoning performs three calls (thesis, antithesis, synthesis) to produce a synthesized answer. Autocoding uses a Player–Coach–Advance loop with explicit verification to ensure requirements are met.
- Use the health check (hegelion-server --self-test) to validate that the server and MCP tools are wired correctly before integrating with an editor.
- Editor integration requires enabling MCP support in your editor and selecting Hegelion as the server target via hegelion-setup-mcp.
- There is a Codex skill available at skills/hegelion/SKILL.md that mirrors the /hegelion command and leverages MCP tools when available.
- If you encounter environment issues, ensure the Python environment where Hegelion is installed is active and that your editor’s MCP client can launch the server command successfully. Hegelion runs over stdio by default, so no network port is involved unless your setup configures one.
- The Python API example shown in the README demonstrates how to generate a dialectic prompt programmatically using create_single_shot_dialectic_prompt.
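The note above can be sketched in code. This is a minimal, hedged example: the function name create_single_shot_dialectic_prompt appears in the README, but the import path and argument shown here are assumptions, so check the repository documentation for the actual signature.

```python
# Sketch: generating a dialectic prompt programmatically.
# Import path and argument are assumptions based on the README's
# mention of create_single_shot_dialectic_prompt.
try:
    from hegelion import create_single_shot_dialectic_prompt

    prompt = create_single_shot_dialectic_prompt(
        "Should our service adopt event sourcing?"
    )
    print(prompt)
except ImportError:
    # hegelion is not installed in this environment
    prompt = None
    print("hegelion not installed; run: pip install hegelion")
```

The resulting prompt can then be sent to any LLM, or used inside an MCP workflow where the server drives the thesis → antithesis → synthesis loop for you.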
Related MCP Servers
SearChat
Search + Chat = SearChat (AI chat with search). Supports OpenAI/Anthropic/VertexAI/Gemini, DeepResearch, the SearXNG metasearch engine, and one-command Docker deployment.
ollama
An MCP Server for Ollama
mcp-llm
An MCP server that provides LLMs access to other LLMs
mem0-selfhosted
Self-hosted mem0 MCP server for Claude Code. Run a complete memory server against self-hosted Qdrant + Neo4j + Ollama while using Claude as the main LLM.
claude-vigil
🏺 An MCP server for checkpointing and file recovery in Claude Code
mcp-client-gen
Turn any MCP server into a type-safe TypeScript SDK in seconds - with OAuth 2.1 and multi-provider support