chain-of-thought
An MCP server to inject raw chain-of-thought tokens from a reasoning model.
claude mcp add --transport stdio beverm2391-chain-of-thought-mcp-server \
  --env GROQ_API_KEY="your-groq-api-key" \
  -- uv --directory path/to/cot-mcp-server run src/server.py
How to use
This MCP server provides a chain-of-thought interface by routing requests through Groq's API to the qwq model and returning its raw chain-of-thought tokens. It is designed to be queried by an agent that can use the generated reasoning stream to improve decision-making and response quality.

The server is launched via uv (Python) and expects the local path to the cot-mcp-server repository plus a valid Groq API key. The chain-of-thought material is generated as part of the model's internal scratchpad and can be consumed by clients that support streaming or tokenized output. Use cases include complex multi-step reasoning, plan formation, and transparent reasoning traces for auditing responses.

To enable this in your agent, configure tool usage to call the chain_of_thought MCP tool for each request, so the agent can reason through steps before producing a final answer.
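As a sketch, the request an MCP client sends to invoke the chain_of_thought tool is a JSON-RPC 2.0 tools/call message over the stdio transport. The argument name ("question") below is a hypothetical placeholder; check the server's actual tool schema via tools/list before relying on it.

```python
import json

# Sketch of the JSON-RPC 2.0 message an MCP client sends to invoke the
# chain_of_thought tool. The argument key "question" is a hypothetical
# placeholder -- consult the server's tools/list response for the real schema.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "chain_of_thought",
        "arguments": {"question": "How should I structure this refactor?"},
    },
}

# The MCP stdio transport frames messages as newline-delimited JSON.
wire_message = json.dumps(request) + "\n"
print(wire_message.strip())
```

The agent's MCP client normally handles this framing for you; the snippet just makes explicit what travels over the wire.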
How to install
Prerequisites:
- Python 3.8+ installed on your system
- uv (Astral's Python package manager, which also provides uvx) available to run Python-based MCP servers
- Access to a Groq API key (Groq Console) with permission to use the qwq model
- The repository for the Chain of Thought MCP Server checked out on your machine
Installation steps:
1. Clone the repository locally:
   git clone https://github.com/your-org/beverm2391-chain-of-thought-mcp-server.git
   cd beverm2391-chain-of-thought-mcp-server
2. Install dependencies via uv sync (as per the README):
   uv sync
   Alternatively, ensure your Python environment has the required packages installed (e.g., via requirements.txt if provided).
3. Obtain a Groq API key:
   - Go to the Groq Console and create/manage an API key: https://console.groq.com/keys
4. Configure the MCP server for your environment:
   - Update the MCP configuration in your environment to include the chain_of_thought server details and your GROQ_API_KEY.
   - Example configuration (adjust the path to your local repo as needed):

     {
       "mcpServers": {
         "chain_of_thought": {
           "command": "uv",
           "args": [
             "--directory",
             "path/to/cot-mcp-server",
             "run",
             "src/server.py"
           ],
           "env": {
             "GROQ_API_KEY": "your-groq-api-key"
           }
         }
       }
     }
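Rather than committing the key in plain text, you can generate the configuration programmatically so GROQ_API_KEY is read from the environment. A minimal sketch, assuming the same config shape as above (the repo path is a placeholder to adjust):

```python
import json
import os

# Sketch: build the MCP config with the Groq key pulled from the environment
# instead of hard-coded. "path/to/cot-mcp-server" is a placeholder -- point it
# at your local clone of the repository.
config = {
    "mcpServers": {
        "chain_of_thought": {
            "command": "uv",
            "args": ["--directory", "path/to/cot-mcp-server", "run", "src/server.py"],
            "env": {"GROQ_API_KEY": os.environ.get("GROQ_API_KEY", "")},
        }
    }
}

print(json.dumps(config, indent=2))
```

Write the printed JSON to wherever your MCP client expects its configuration file.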
5. Run the MCP server:
   - From the repository root, start the server using your configured MCP tool (as described in step 4). The server should then be listening for MCP requests via the specified command.
6. Verify connectivity:
   - Send a test MCP request to the chain_of_thought endpoint and confirm that a chain-of-thought stream is returned through Groq's API.
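The verification step above can be sketched as a stdio smoke test: launch the server and send the JSON-RPC initialize handshake over stdin. This assumes the standard MCP stdio framing and the 2024-11-05 protocol version; the repo path is a placeholder.

```python
import json
import subprocess


def make_initialize_request() -> str:
    """Build the JSON-RPC initialize message that opens an MCP session."""
    msg = {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "initialize",
        "params": {
            "protocolVersion": "2024-11-05",
            "capabilities": {},
            "clientInfo": {"name": "smoke-test", "version": "0.0.1"},
        },
    }
    return json.dumps(msg) + "\n"


def smoke_test(repo_path: str) -> bool:
    """Launch the server over stdio and check it answers the handshake.

    Sketch only: a long-running server may not exit on its own, so
    communicate() can block until the timeout fires.
    """
    proc = subprocess.Popen(
        ["uv", "--directory", repo_path, "run", "src/server.py"],
        stdin=subprocess.PIPE,
        stdout=subprocess.PIPE,
        text=True,
    )
    try:
        out, _ = proc.communicate(make_initialize_request(), timeout=30)
        return '"result"' in out  # a JSON-RPC result means the server answered
    finally:
        proc.kill()
```

Call smoke_test("path/to/cot-mcp-server") once the server is configured; a True return indicates the handshake succeeded.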
Additional notes
Tips and considerations:
- The GROQ_API_KEY must be kept secure; avoid hard-coding it in public configurations. Use environment secret management where possible.
- The path in the command arguments should point to the local clone of this repository (or the appropriate path where the cot-mcp-server lives).
- If you encounter authentication errors with Groq, re-generate the API key and ensure it has permission to call the qwq model endpoints.
- This MCP server exposes chain-of-thought data; ensure your usage complies with safety and policy considerations for chain-of-thought leakage and model rationales.
- If you need to pass additional environment variables (e.g., model endpoints, proxy settings), extend the env map in the mcp_config accordingly.
Related MCP Servers
mcp-vegalite
MCP server from isaacwasserman/mcp-vegalite-server
github-chat
A Model Context Protocol (MCP) for analyzing and querying GitHub repositories using the GitHub Chat API.
nautex
MCP server for guiding coding agents via an end-to-end requirements-to-implementation-plan pipeline
pagerduty
PagerDuty's official local MCP (Model Context Protocol) server which provides tools to interact with your PagerDuty account directly from your MCP-enabled client.
futu-stock
MCP server for Futu NiuNiu stock
mcp-boilerplate
Boilerplate using one of the 'better' ways to build MCP Servers. Written using FastMCP