mcp-chain-of-draft-prompt-tool
MCP prompt tool applying Chain-of-Draft (CoD) reasoning - BYOLLM
claude mcp add --transport stdio brendancopley-mcp-chain-of-draft-prompt-tool npx -y mcp-chain-of-draft-prompt-tool
How to use
The MCP Chain of Draft (CoD) Prompt Tool transforms a regular prompt into Chain of Draft (CoD) or Chain of Thought (CoT) format before sending it to your chosen language model. It supports Bring Your Own LLM (BYOLLM) setups, enabling integration with cloud providers such as Anthropic Claude and OpenAI GPT models, as well as local runtimes such as Ollama and other compatible local APIs. After the LLM processes the structured prompt, the tool post-processes the response into a clear, concise result. This approach aims to improve reasoning quality, reduce token usage, and maintain accuracy across a range of tasks.
You configure the LLM provider through environment variables (for example, ANTHROPIC_API_KEY for Claude, OPENAI_API_KEY for OpenAI, or MISTRAL_API_KEY for Mistral). For local models served by Ollama, select the provider and model (e.g., MCP_LLM_PROVIDER=ollama and MCP_OLLAMA_MODEL=llama2). Custom local models are supported by pointing the tool at a local model API (MCP_PROVIDER=custom and MCP_CUSTOM_LLM_ENDPOINT). These integrations enable flexible deployment across cloud and edge environments while keeping the CoD/CoT reasoning workflow.
In practical use, you compose prompts as you normally would; the tool restructures them into concise CoD/CoT steps, sends them to the LLM, and then formats the output into a usable result. This is particularly beneficial for tasks requiring explicit reasoning traces or domain-specific reasoning, or when you want to optimize token usage without sacrificing answer quality.
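As a rough sketch of the restructuring step, the snippet below wraps a plain question in a CoD-style instruction that asks for short intermediate drafts. The wrapper function and its template wording are hypothetical and only illustrate the idea; the tool's actual prompt format may differ.

```shell
#!/bin/sh
# Hypothetical CoD-style prompt wrapper; the instruction text is illustrative,
# not the tool's real template.
cod_wrap() {
  question="$1"
  printf '%s\n' \
    "Think step by step, but keep each step to a short draft of five words or fewer." \
    "Give the final answer after '####'." \
    "Question: $question"
}

cod_wrap "What is 17 * 24?"
```

The key difference from plain Chain of Thought is the cap on draft length, which is where the token savings come from.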
How to install
Prerequisites
- Node.js 14 or newer (verify the exact required version in the repo docs)
- npm or yarn
- Access to the internet to install dependencies from npm
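Before installing, it can help to confirm the Node.js prerequisite. The sketch below assumes the usual vMAJOR.MINOR.PATCH output of node --version; the helper function is a convenience for illustration, not part of the tool.

```shell
#!/bin/sh
# Check that the installed Node.js major version meets the "14+" prerequisite.
check_node_major() {
  version="$1"            # e.g. "v18.19.0"
  major=${version#v}      # strip the leading "v"
  major=${major%%.*}      # keep only the major component
  [ "$major" -ge 14 ]
}

if check_node_major "$(node --version 2>/dev/null || echo v0.0.0)"; then
  echo "Node.js prerequisite satisfied"
else
  echo "Node.js 14+ required" >&2
fi
```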
Installation steps
- Clone the repository (or install via npm if published):
  git clone https://github.com/brendancopley/mcp-chain-of-draft-prompt-tool.git
  cd mcp-chain-of-draft-prompt-tool
- Install dependencies:
  npm install
- Configure API keys and environment for your LLM provider (examples):
  Cloud provider keys:
  export ANTHROPIC_API_KEY=your_key_here
  export OPENAI_API_KEY=your_key_here
  export MISTRAL_API_KEY=your_key_here
  Local models via Ollama (if using Ollama):
  export MCP_LLM_PROVIDER=ollama
  export MCP_OLLAMA_MODEL=llama2
  Custom local models (if applicable):
  export MCP_PROVIDER=custom
  export MCP_CUSTOM_LLM_ENDPOINT=http://localhost:your_port
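For the custom-provider case, a quick sanity check of the environment before starting the server can save a round of debugging. The variable names below follow the examples above; the check itself is a hypothetical convenience, not part of the tool, and the port is only an example.

```shell
#!/bin/sh
# Hypothetical sanity check for the custom-provider configuration.
validate_custom_config() {
  if [ "${MCP_PROVIDER-}" != "custom" ]; then
    echo "MCP_PROVIDER is not set to 'custom'" >&2
    return 1
  fi
  case "${MCP_CUSTOM_LLM_ENDPOINT-}" in
    http://*|https://*) echo "Endpoint looks valid: $MCP_CUSTOM_LLM_ENDPOINT" ;;
    *) echo "MCP_CUSTOM_LLM_ENDPOINT must be an http(s) URL" >&2; return 1 ;;
  esac
}

MCP_PROVIDER=custom
MCP_CUSTOM_LLM_ENDPOINT=http://localhost:8080   # example port
export MCP_PROVIDER MCP_CUSTOM_LLM_ENDPOINT
validate_custom_config
```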
- Run the server (example for a typical Node.js setup):
  npm start
- Optional: build Single Executable Applications (SEA) if you need standalone binaries:
  npm run build:sea
  Or for platform-specific builds:
  npm run build:macos
  npm run build:linux
  npm run build:windows
Notes:
- The project uses Nx as its build system and supports SEA packaging via the nx-node-sea plugin.
- You can also run development builds with auto-reload using npm run dev during active development.
Additional notes
Tips and common issues:
- Ensure your LLM provider keys are kept secure and not committed to version control.
- If using Ollama, make sure the Ollama daemon is running and the model you pull is compatible with your workflow.
- For custom local models, verify the endpoint exposes a chat-completion style API compatible with the tool.
- SEA builds produce standalone executables; test them in the target OS environment to verify dependencies are bundled correctly.
- Check environment variable naming and casing, as typos can prevent the tool from selecting the intended LLM provider.
- Refer to the README for the latest supported models and setup steps, as integrations evolve with new releases.
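To catch the naming and casing typos mentioned above, a small helper like the following can report which provider-related variables are currently set without printing their values (so no secrets end up in logs). The variable names are taken from the configuration examples; the helper itself is hypothetical.

```shell
#!/bin/sh
# Hypothetical helper: report which provider-related variables are set,
# without echoing their (possibly secret) values.
show_provider_env() {
  for var in ANTHROPIC_API_KEY OPENAI_API_KEY MISTRAL_API_KEY \
             MCP_LLM_PROVIDER MCP_OLLAMA_MODEL MCP_PROVIDER MCP_CUSTOM_LLM_ENDPOINT; do
    eval "value=\${$var-}"
    if [ -n "$value" ]; then
      echo "$var is set"
    else
      echo "$var is unset"
    fi
  done
}

show_provider_env
```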