mcp-multi-server-anthropic-chatbot
MCP server from Armiyants/mcp-multi-server-anthropic-chatbot
claude mcp add --transport stdio --env ANTHROPIC_API_KEY="your_api_key_here" \
  armiyants-mcp-multi-server-anthropic-chatbot -- uvx run mcp_chatbot.py
How to use
This MCP server is a multi-server chatbot designed to help you discover and analyze arXiv research papers using natural language queries. It connects to multiple MCP servers simultaneously, exposing a range of tools such as paper search, content extraction, and topic-based browsing. You can issue natural language prompts to search for papers, fetch and summarize contents, and leverage pre-built prompts to structure complex analyses. The system integrates a research-oriented server for arXiv interactions, a fetch server for web content retrieval, and a filesystem server for managing files and resources. Use the provided commands to interact with these servers and to access organized paper collections by topic.
How to install
Prerequisites:
- Python 3.8+ installed on your system
- A virtual environment tool (optional but recommended)
- Access to an Anthropic API key
- Create and activate a virtual environment (optional but recommended):
  python3 -m venv .venv
  source .venv/bin/activate
- Install dependencies listed in requirements.txt:
  pip install -r requirements.txt
- Set up your API key for Anthropic in an environment variable or a .env file:
  export ANTHROPIC_API_KEY=your_api_key_here
- Run the MCP chatbot server using the uv runner (as shown in the README):
  uvx run mcp_chatbot.py
- Optional: if using a script-managed environment, ensure your environment variables are loaded before starting the server.
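Before launching, it can help to fail fast if the key never made it into the environment. A minimal sketch (the helper name is illustrative, not part of the repo):

```python
import os

def check_api_key(env=None):
    """Return the Anthropic API key from the environment, or raise if it is
    missing or still set to the README placeholder value."""
    env = os.environ if env is None else env
    key = env.get("ANTHROPIC_API_KEY", "")
    if not key or key == "your_api_key_here":
        raise RuntimeError(
            "ANTHROPIC_API_KEY is not set; export it or add it to .env"
        )
    return key
```

Running a check like this before `uvx run mcp_chatbot.py` surfaces a missing or placeholder key immediately instead of at the first Anthropic API call.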
Notes:
- The server relies on multiple components described in the README (Research Server, Fetch Server, Filesystem Server). Ensure those resources are accessible when testing.
- If you encounter permission or network issues, verify your API keys and network access to arXiv and any external services.
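The multi-server wiring described above is typically declared in server_config.json. The exact entries depend on the repo, but a configuration of roughly this shape — pairing a local research server (the research_server.py filename here is an assumption) with the reference mcp-server-fetch and @modelcontextprotocol/server-filesystem packages — is a reasonable sketch:

```json
{
  "mcpServers": {
    "research": {
      "command": "uv",
      "args": ["run", "research_server.py"]
    },
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    },
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "."]
    }
  }
}
```

Each entry names the command the chatbot spawns for that server over stdio; adjust the commands and paths to match the actual files in the repository.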
Additional notes
Tips and considerations:
- The Anthropic API key is required for the Anthropic-based prompts and analyses.
- The system uses a multi-server setup; ensure all referenced servers are available and properly configured in server_config.json if you adapt or extend the configuration.
- Use the provided prompts and commands to search papers, browse topics, and generate analyses. For example, you can request a search prompt with topic=neural networks and a specific number of papers, then fetch and summarize the results.
- If you modify the codebase, keep an eye on the mcp_chatbot.py entrypoint to ensure compatibility with the uvx runner.
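As a concrete illustration of the search-and-summarize tip above, a session might look like the following (the phrasing is illustrative; the actual prompt names and syntax are defined by the repo's prompt templates):

```
Search arXiv for 5 papers on neural networks and list their titles.
Fetch the abstract of the second result and summarize it in two sentences.
Save the summaries to a file in the papers folder.
```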
Related MCP Servers
mcp-vegalite
MCP server from isaacwasserman/mcp-vegalite-server
github-chat
A Model Context Protocol (MCP) for analyzing and querying GitHub repositories using the GitHub Chat API.
nautex
MCP server for guiding Coding Agents via end-to-end requirements to implementation plan pipeline
pagerduty
PagerDuty's official local MCP (Model Context Protocol) server which provides tools to interact with your PagerDuty account directly from your MCP-enabled client.
futu-stock
MCP server for Futu NiuNiu stock
mcp-boilerplate
Boilerplate using one of the 'better' ways to build MCP servers, written using FastMCP.