
mcp-chain-of-draft-prompt-tool

MCP prompt tool applying Chain-of-Draft (CoD) reasoning - BYOLLM

Installation
Run this command in your terminal to add the MCP server to Claude Code:

  claude mcp add --transport stdio brendancopley-mcp-chain-of-draft-prompt-tool npx -y mcp-chain-of-draft-prompt-tool
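To confirm the server was registered, you can list the MCP servers Claude Code knows about (assumes the Claude Code CLI is installed and on your PATH):

```shell
# List registered MCP servers; the new entry should appear in the output
claude mcp list
```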

How to use

The MCP Chain of Draft (CoD) Prompt Tool transforms a regular prompt into a Chain of Draft (CoD) or Chain of Thought (CoT) format before sending it to your chosen language model. It supports Bring Your Own LLM (BYOLLM) setups, enabling integration with cloud providers such as Anthropic Claude and OpenAI GPT models, as well as local models served by Ollama and other compatible local APIs. After the LLM processes the structured prompt, the tool post-processes the response to present a clear, concise result. This approach aims to improve reasoning quality, reduce token usage, and maintain accuracy across a range of tasks.

You choose the LLM provider by setting environment variables: for example, ANTHROPIC_API_KEY for Claude, OPENAI_API_KEY for OpenAI, or MISTRAL_API_KEY for Mistral. For local models via Ollama, select the provider and model (e.g., MCP_LLM_PROVIDER=ollama and MCP_OLLAMA_MODEL=llama2). For other custom local models, point the tool at a local model API (MCP_PROVIDER=custom and MCP_CUSTOM_LLM_ENDPOINT). These integrations enable flexible deployment across cloud and edge environments while preserving the CoD/CoT reasoning workflow.

In practical use, you compose prompts as you normally would; the tool restructures them into efficient CoD/CoT steps, sends them to the LLM, and then formats the output into a usable result. This is particularly beneficial for tasks that require explicit reasoning traces or domain-specific reasoning, or when you want to optimize token usage without sacrificing answer quality.

How to install

Prerequisites

  • Node.js 14 or later (verify the required version in the repo docs; modern MCP tooling often assumes a newer runtime)
  • npm or yarn
  • Access to the internet to install dependencies from npm

Installation steps

  1. Clone the repository (or install via npm if published):

     git clone https://github.com/brendancopley/mcp-chain-of-draft-prompt-tool.git
     cd mcp-chain-of-draft-prompt-tool

  2. Install dependencies:

     npm install

  3. Configure API keys and environment for your LLM provider (examples):

    Cloud provider keys

    export ANTHROPIC_API_KEY=your_key_here
    export OPENAI_API_KEY=your_key_here
    export MISTRAL_API_KEY=your_key_here

    Local models via Ollama (if using Ollama):

    export MCP_LLM_PROVIDER=ollama
    export MCP_OLLAMA_MODEL=llama2

    Custom local models (if applicable):

    export MCP_PROVIDER=custom
    export MCP_CUSTOM_LLM_ENDPOINT=http://localhost:your_port

  4. Run the server (example for a typical Node.js setup):

     npm start

  5. Optional: build Single Executable Applications (SEA) if you need standalone binaries:

     npm run build:sea

     or, for platform-specific builds:

     npm run build:macos
     npm run build:linux
     npm run build:windows
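Before starting the server, a quick sanity check can confirm the prerequisites are in place. This is a minimal sketch for a POSIX shell; the Ollama check applies only if you run a local Ollama daemon on its default port (11434):

```shell
# Confirm Node.js and npm are installed and on PATH
node --version   # expect v14 or later
npm --version

# If using Ollama, confirm the daemon is reachable on its default port:
# curl -s http://localhost:11434/api/tags
```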

Notes:

  • The project uses Nx as its build system and supports SEA packaging via the nx-node-sea plugin.
  • During active development, you can also run with auto-reload using npm run dev.

Additional notes

Tips and common issues:

  • Ensure your LLM provider keys are kept secure and not committed to version control.
  • If using Ollama, make sure the Ollama daemon is running and the model you pull is compatible with your workflow.
  • For custom local models, verify the endpoint exposes a chat-completion style API compatible with the tool.
  • SEA builds produce standalone executables; test them in the target OS environment to verify dependencies are bundled correctly.
  • Check environment variable naming and casing, as typos can prevent the tool from selecting the intended LLM provider.
  • Refer to the README for the latest supported models and setup steps, as integrations evolve with new releases.
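For the custom-endpoint case, a quick smoke test can confirm the API shape before wiring the endpoint into the tool. This is a hypothetical sketch: the host, port, and model name are placeholders, and an OpenAI-compatible chat-completions route is assumed; adjust everything to match your MCP_CUSTOM_LLM_ENDPOINT deployment:

```shell
# Hypothetical smoke test against a custom local endpoint; a JSON response
# with a chat completion suggests the API is compatible with the tool.
curl -s http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "local-model", "messages": [{"role": "user", "content": "Say hello"}]}'
```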
