# ReasoningBank

Implementation based on the paper "ReasoningBank: Scaling Agent Self-Evolving with Reasoning Memory".
Quick install for Claude Code (note the `--` separator so that `python -m src.server` is treated as the server command rather than CLI options):

```shell
claude mcp add --transport stdio hanw39-reasoningbank-mcp \
  --env DASHSCOPE_API_KEY="your-dashscope-api-key-or-placeholder" \
  -- python -m src.server
```
## How to use

ReasoningBank MCP Server provides memory-augmented reasoning capabilities for AI agents. It exposes a memory retrieval and extraction workflow via the MCP protocol, allowing agents to fetch past experiences to guide current tasks and to save new experiences after task completion. The server supports multiple transport modes (STDIO by default for desktop integrations, SSE for web-like streaming interactions) and multiple LLM/embedding backends through a pluggable architecture.

Two MCP tools are at the core of this workflow:

- `retrieve_memory` — called at task start or at direction changes, it fetches relevant past memories to inform decisions.
- `extract_memory` — called at task end or after failures, it saves the current task trajectory as memory entries for future reuse.

The system emphasizes deduplication, merging, and archiving to optimize storage and enable auditability across agents and sub-agents. It supports multi-tenant isolation via `agent_id` and can work with various memory backends and LLM providers through its modular design. To get started, run the server and configure your MCP client against it, using either a minimal STDIO setup or an SSE-based setup for real-time streaming responses.
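The retrieve-at-start / extract-at-end loop described above can be sketched with a toy in-memory simulation. This is an illustrative sketch only: the `ToyReasoningBank` class and its naive keyword retrieval are hypothetical stand-ins, not the server's actual implementation, which exposes this flow through the `retrieve_memory` and `extract_memory` MCP tools.

```python
# Illustrative sketch of the ReasoningBank workflow: retrieve relevant
# memories at task start, then extract the finished trajectory as new
# memories. All names here are hypothetical; the real server exposes
# this flow via the retrieve_memory / extract_memory MCP tools.

class ToyReasoningBank:
    def __init__(self):
        self.memories = []  # list of (agent_id, text) entries

    def retrieve_memory(self, agent_id, query, top_k=3):
        """Naive keyword-overlap retrieval, scoped to one agent."""
        words = set(query.lower().split())
        scored = [
            (len(words & set(text.lower().split())), text)
            for owner, text in self.memories
            if owner == agent_id  # multi-tenant isolation via agent_id
        ]
        scored.sort(reverse=True)
        return [text for score, text in scored[:top_k] if score > 0]

    def extract_memory(self, agent_id, trajectory):
        """Save a completed trajectory, skipping exact duplicates."""
        entry = (agent_id, trajectory)
        if entry not in self.memories:  # crude deduplication
            self.memories.append(entry)

bank = ToyReasoningBank()
# Task end: save the experience.
bank.extract_memory("agent-1", "login flow: check CSRF token before submit")
# Next task start: fetch relevant past experience first.
hits = bank.retrieve_memory("agent-1", "fix login submit bug")
print(hits)
```

In a real deployment, an MCP client would issue these two tool calls over STDIO or SSE rather than calling a local object.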
## How to install

Prerequisites:

- Python 3.8+ and pip
- Git

Install the MCP server in editable mode from the repository:

```shell
git clone https://github.com/hanw39/ReasoningBank-MCP.git
cd ReasoningBank-MCP

# Install Python dependencies in editable mode
pip install -e .
```
Option A: STDIO workflow (local development or desktop clients)

- No extra server arguments are required; just set the appropriate environment variables if your backends need them.

Option B: SSE workflow (web-like or desktop clients that connect via HTTP streaming)

- Start the server in SSE mode (examples below) and point the client at the SSE endpoint.

```shell
# Start SSE transport (defaults: 127.0.0.1:8000)
python3 -m src.server --transport sse

# Or specify host/port
python3 -m src.server --transport sse --host 0.0.0.0 --port 8080
```
Client configuration examples:

- STDIO mode (inline in the client configuration):

```json
{
  "mcpServers": {
    "reasoning-bank": {
      "command": "reasoning-bank-mcp",
      "env": {
        "DASHSCOPE_API_KEY": "your DashScope API key"
      }
    }
  }
}
```
- SSE mode: configure the client to connect to the SSE endpoint:

```json
{
  "mcpServers": {
    "reasoning-bank": {
      "url": "http://127.0.0.1:8000/sse"
    }
  }
}
```
Notes:
- If you need DashScope or other LLM providers, set up the llm and embedding backends in your config as described in the repository docs.
- You may install additional dependencies or adjust environment variables (e.g., API keys, memory store paths) as needed for your environment.
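The pluggable LLM/embedding backend design mentioned in the notes above might look roughly like the following Protocol-based sketch. The `EmbeddingBackend` interface, the `FakeHashEmbedding` stand-in, and the `build_backend` factory are assumptions for illustration, not the repository's actual API.

```python
from typing import List, Protocol

class EmbeddingBackend(Protocol):
    """Hypothetical interface an embedding provider would satisfy."""
    def embed(self, texts: List[str]) -> List[List[float]]: ...

class FakeHashEmbedding:
    """Toy deterministic backend standing in for a real provider
    (e.g. DashScope) so the sketch runs without API keys."""
    def embed(self, texts: List[str]) -> List[List[float]]:
        # Map each text to a tiny 4-dim vector from character sums.
        return [
            [sum(ord(c) for c in t) % n for n in (7, 11, 13, 17)]
            for t in texts
        ]

def build_backend(provider: str) -> EmbeddingBackend:
    # A config-driven factory; real code would read env vars / config
    # (such as DASHSCOPE_API_KEY) to pick and authenticate a provider.
    if provider == "fake":
        return FakeHashEmbedding()
    raise ValueError(f"unknown provider: {provider}")

backend = build_backend("fake")
vectors = backend.embed(["hello", "world"])
print(len(vectors), len(vectors[0]))
```

Because any object with a matching `embed` method satisfies the Protocol, swapping providers is a configuration change rather than a code change.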
## Additional notes
- Always provide a consistent `agent_id` for multi-tenant isolation (recommended for sub-agent scenarios).
- The system supports memory deduplication, merging, and archiving (v0.2.0+). Configure these features in your memory_manager settings if you want automated deduplication and merging behavior.
- For SSE transport, ensure firewall and network rules allow HTTP traffic to the configured host/port.
- When configuring retrieval thresholds, tune min_score_threshold and hybrid strategy weights to balance recall and precision for your data domain.
- If memory extraction runs asynchronously, monitor task_id/status to confirm completion and consider synchronous extraction for critical or auditing tasks.
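As a concrete illustration of the threshold and weight tuning mentioned above, a hybrid retrieval score might blend a dense vector signal with a sparse keyword signal. The parameter name `min_score_threshold` comes from the notes above, but the scoring formula and weight names here are assumptions for illustration, not the repository's actual strategy.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def keyword_score(query, text):
    """Fraction of query words that appear in the candidate text."""
    q, t = set(query.lower().split()), set(text.lower().split())
    return len(q & t) / len(q) if q else 0.0

def hybrid_score(query, query_vec, text, text_vec,
                 vector_weight=0.7, keyword_weight=0.3):
    # Weighted blend of dense (vector) and sparse (keyword) signals;
    # shifting the weights trades semantic recall against exact matches.
    return (vector_weight * cosine(query_vec, text_vec)
            + keyword_weight * keyword_score(query, text))

def filter_hits(scored_hits, min_score_threshold=0.4):
    # Raising the threshold favors precision; lowering it favors recall.
    return [hit for hit in scored_hits if hit[0] >= min_score_threshold]

s = hybrid_score("retry failed login", [1.0, 0.0],
                 "retry login with backoff", [0.9, 0.1])
print(round(s, 3))
```

Tuning then amounts to sweeping `vector_weight`, `keyword_weight`, and `min_score_threshold` against a labeled sample of your own queries.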