
mcp-chatbot

MCP chatbot powered by Anthropic Claude, delivering on-demand literature search and summarisation for academics and engineers.

Installation
Run this command in your terminal to add the MCP server to Claude Code.
claude mcp add --transport stdio mctrinh-mcp-chatbot python research_server.py \
  --env PAPER_DIR="./papers" \
  --env RESEARCH_PORT="8001" \
  --env ANTHROPIC_MODEL="claude-3-opus-20240229"

How to use

mcp-chatbot is a modular MCP server that bridges Claude 3 with the Model Context Protocol to perform on-demand literature search and summarisation. It exposes a research server that can fetch, summarize, and reason over academic papers, with the chatbot interface orchestrating Claude prompts and MCP tool calls. Use it to run literature searches, extract key findings, and generate concise summaries or deeper analyses. The server is intended to be run locally (via Python) or in Docker, with port 8001 for the MCP research services and port 8000 for the CLI interface when running in Docker.

You can interact with the server in two ways:

  • REPL/CLI: Start the research server, then launch the chatbot CLI to issue commands or queries. Inside the REPL you can list Claude prompts, view downloaded paper topics, and issue queries that Claude will route to tools as needed. Example commands include listing prompts and folders, and performing AI-assisted queries.
  • One-shot: Use a single command to perform a targeted query, for example asking for the latest trends in a topic. The system will search papers, extract relevant information, and return a structured response.

Tools and capabilities include:

  • Paper search and topic-based retrieval from the papers directory.
  • Summarisation and extraction via Claude 3, orchestrated through MCP prompts.
  • Vector search over stored papers (intended future support with Faiss/Chroma).
  • CLI-driven interaction with a prompt-based workflow for quick questions or deeper investigations.
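As an illustration of the first capability, the sketch below shows how topic-based retrieval from the papers directory might look. The on-disk layout (one subfolder per topic under PAPER_DIR, one JSON metadata file per paper) is an assumption for this example, not the project's documented format:

```python
import json
from pathlib import Path


def search_papers(topic, paper_dir="./papers"):
    """Return metadata for cached papers under `topic`.

    Assumes each topic has its own subfolder of PAPER_DIR containing one
    JSON metadata file per paper (an illustrative layout, not necessarily
    the one mcp-chatbot uses).
    """
    topic_dir = Path(paper_dir) / topic.lower().replace(" ", "_")
    if not topic_dir.is_dir():
        return []
    papers = []
    for meta_file in sorted(topic_dir.glob("*.json")):
        with open(meta_file) as f:
            papers.append(json.load(f))
    return papers
```

In the real server such a helper would be exposed as an MCP tool so that Claude can call it during a query.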

Configuration is controlled via environment variables and server_config.json. Typical usage uses ANTHROPIC_MODEL to select Claude 3, RESEARCH_PORT to expose the MCP research server, and PAPER_DIR to specify the papers cache location.
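Reading those three variables with stdlib fallbacks might look like the sketch below; the defaults shown are simply the example values used elsewhere in this README, not authoritative project defaults:

```python
import os


def load_config():
    """Resolve server settings from environment variables, falling back to
    the example values used in this README."""
    return {
        "model": os.environ.get("ANTHROPIC_MODEL", "claude-3-opus-20240229"),
        "port": int(os.environ.get("RESEARCH_PORT", "8001")),
        "paper_dir": os.environ.get("PAPER_DIR", "./papers"),
    }
```

Values set in server_config.json would take effect alongside or instead of these, depending on how the server merges its sources.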

How to install

Prerequisites:

  • Python 3.8+ installed
  • Access to pip
  • Optional: Docker (for containerized runs)

Step-by-step installation:

  1. Clone the repository:

     git clone https://github.com/mctrinh/mcp-chatbot.git
     cd mcp-chatbot

  2. Create and activate a Python environment (recommended):

     python -m venv venv
     source venv/bin/activate    # macOS/Linux
     venv\Scripts\activate       # Windows

  3. Install the package in editable mode (development mode):

     pip install -e .

  4. Install development extras (optional, for testing):

     pip install -e .[dev]

  5. Run the server (non-Docker):

     uv pip install -e .[dev]    # ensure dev extras are installed
     python research_server.py

  6. Optional Docker-based run:

     docker build -t mcp-chatbot:0.1 .
     docker run --rm -it -p 8001:8001 -p 8000:8000 mcp-chatbot:0.1

  7. Environment configuration (examples):

     export ANTHROPIC_MODEL="claude-3-opus-20240229"
     export RESEARCH_PORT=8001
     export PAPER_DIR=./papers

Notes:

  • If you update code, reinstall in editable mode to reflect changes.
  • The CLI can be launched after starting the research server: mcp-chatbot run

Additional notes

Tips and common issues:

  • Ensure ANTHROPIC_MODEL is set to a valid Claude 3 variant supported by your API key.
  • RESEARCH_PORT must match the port exposed by the MCP research server; default commonly used is 8001.
  • PAPER_DIR should point to a writable directory for cached paper metadata; ensure permissions are correct.
  • If you see errors like Could not connect to server 'fetch' or 'filesystem': Method not found, verify that the MCP research server is running and reachable on the specified RESEARCH_PORT and that Docker/CLI networking is not blocking the connection.
  • For Docker users, map both required ports (e.g., 8001 and 8000) and ensure volumes for papers are mounted if you rely on local caches.
  • The project structure includes a Python-based CLI (mcp_chatbot.cli) and a core chatbot engine (mcp_chatbot.core); unit tests reside under tests/test_core.py and can be executed with pytest.
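When debugging the "Could not connect" errors above, it can help to rule out networking first with a plain TCP probe of RESEARCH_PORT. This is a generic diagnostic sketch, not part of the project:

```python
import socket


def port_reachable(host="127.0.0.1", port=8001, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds within timeout.

    A positive result only means something is listening on RESEARCH_PORT;
    it does not verify that the MCP protocol itself is working.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

If this returns False while the server is supposedly running, check Docker port mappings (-p 8001:8001) or firewall rules before digging into MCP configuration.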
