semantic-scholar-fastmcp
A FastMCP server implementation for the Semantic Scholar API, providing comprehensive access to academic paper data, author information, and citation networks.
claude mcp add --transport stdio zongmin-yu-semantic-scholar-fastmcp-mcp-server \
  --env SEMANTIC_SCHOLAR_API_KEY="optional" \
  -- python run.py
How to use
This MCP server implements a FastMCP-based bridge to the Semantic Scholar API. It exposes a modular, scalable set of endpoints for papers, authors, and recommendations, along with batch and bulk operations, rate-limiting, and support for both authenticated and unauthenticated access. The server is organized around a central FastMCP instance and a public HTTP bridge that provides REST-like access to common workflows, enabling Claude Desktop (or other MCP consumers) to query papers, authors, and recommendations efficiently.
To use it, start the server with Python (running the provided run.py entry point). Once the server is up, you can access the built-in HTTP bridge (default port 8000) or run MCP tools directly through your FastMCP client. The bridge reuses the same HTTP utilities as the MCP tools, ensuring consistent rate limiting, API key handling, and connection pooling. If you have a Semantic Scholar API key, you can add it to the environment so higher rate limits apply; otherwise, the server will operate in unauthenticated mode with lower quotas.
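As a sketch of talking to the HTTP bridge from Python, the snippet below builds the paper-search URL using only the standard library. The endpoint path and query parameters (`q`, `fields`, `offset`, `limit`) follow the bridge routes documented later in this README; the default field selection here is an arbitrary example, and actually calling `search_papers` assumes the bridge is running on port 8000.

```python
# Minimal client sketch for the bridge's paper-search endpoint.
# Assumes the server is running locally on the default port 8000.
import json
import urllib.parse
import urllib.request

BRIDGE = "http://localhost:8000"

def build_search_url(query, fields="title,year", offset=0, limit=10):
    """Build the GET URL for /v1/paper/search."""
    params = urllib.parse.urlencode(
        {"q": query, "fields": fields, "offset": offset, "limit": limit}
    )
    return f"{BRIDGE}/v1/paper/search?{params}"

def search_papers(query, **kwargs):
    """Perform the request; requires the bridge to be up."""
    with urllib.request.urlopen(build_search_url(query, **kwargs)) as resp:
        return json.load(resp)
```

The same URL-building pattern applies to the author-search route, which takes identical pagination parameters.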
How to install
Prerequisites
- Python 3.8+
- FastMCP framework installed in your environment
- Access to the repository (clone or download)
Manual installation steps
- Clone the repository:
git clone https://github.com/YUZongmin/semantic-scholar-fastmcp-mcp-server.git
cd semantic-scholar-fastmcp-mcp-server
- Install dependencies (FastMCP and any Python requirements, as documented by the project). If a requirements file exists, use:
python -m pip install -r requirements.txt
- Configure environment (optional Semantic Scholar API key):
export SEMANTIC_SCHOLAR_API_KEY=your-api-key-here
- Run the server (entry point is run.py):
python run.py
Smithery installation (optional)
If you want to install via Smithery for automatic client handling, use:
npx -y @smithery/cli install semantic-scholar-fastmcp-mcp-server --client claude
Then configure Claude Desktop to point at the server path shown in your Smithery output.
Additional notes
Environment variables:
- SEMANTIC_SCHOLAR_API_KEY: Optional API key for higher rate limits. If omitted, the server uses unauthenticated access with lower quotas.
HTTP Bridge:
- Default listening port: 8000
- You can adjust via SEMANTIC_SCHOLAR_ENABLE_HTTP_BRIDGE, SEMANTIC_SCHOLAR_HTTP_BRIDGE_HOST, and SEMANTIC_SCHOLAR_HTTP_BRIDGE_PORT in the environment.
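The variable names above suggest a configuration shape like the following sketch; the exact parsing in the server may differ, and the defaults shown (enabled, host `127.0.0.1`, port 8000) are assumptions based on this README.

```python
# Sketch of reading the bridge settings from the environment.
# Defaults are assumptions inferred from the notes above.
import os

def bridge_config():
    """Collect HTTP-bridge settings into a plain dict."""
    return {
        "enabled": os.getenv("SEMANTIC_SCHOLAR_ENABLE_HTTP_BRIDGE", "true")
                     .lower() in ("1", "true", "yes"),
        "host": os.getenv("SEMANTIC_SCHOLAR_HTTP_BRIDGE_HOST", "127.0.0.1"),
        "port": int(os.getenv("SEMANTIC_SCHOLAR_HTTP_BRIDGE_PORT", "8000")),
    }
```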
Endpoints exposed by the built-in HTTP bridge (accessible at http://localhost:8000):
- GET /v1/paper/search?q=... (fields, offset, limit)
- GET /v1/paper/{paper_id} (fields)
- POST /v1/paper/batch (JSON: {"ids": [...]})
- GET /v1/author/search?q=... (fields, offset, limit)
- GET /v1/author/{author_id} (fields)
- POST /v1/author/batch (JSON: {"ids": [...]})
- GET /v1/recommendations?paper_id=...
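Both batch routes accept a JSON object with an "ids" list, per the endpoint summaries above. The snippet below sketches how such a POST request could be assembled with the standard library; the IDs are placeholders, and sending the request requires the bridge to be running.

```python
# Sketch of the batch endpoints' request shape: POST with {"ids": [...]}.
import json
import urllib.request

def batch_request(url, ids):
    """Build a POST request carrying {"ids": [...]} as a JSON body."""
    body = json.dumps({"ids": ids}).encode()
    return urllib.request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Placeholder IDs for illustration; pass real Semantic Scholar paper IDs.
req = batch_request(
    "http://localhost:8000/v1/paper/batch",
    ["paper-id-1", "paper-id-2"],
)
```

The author batch route (`/v1/author/batch`) takes the same body shape with author IDs.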
Common issues and tips:
- Ensure your Python environment has access to the required network resources (Semantic Scholar API) and that any firewalls permit outbound HTTP/HTTPS.
- If you encounter rate-limit errors, supply SEMANTIC_SCHOLAR_API_KEY (see Environment variables above) to raise your quota, or slow down your request rate.
- When upgrading, review the modular structure (semantic_scholar package) to understand how papers, authors, and recommendations endpoints are wired to the MCP core.
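When rate-limit errors do occur, a common client-side mitigation is exponential backoff. The helper below is an illustrative pattern, not the server's actual rate-limiting logic; `RateLimitError` is a hypothetical exception standing in for whatever your client raises on HTTP 429.

```python
# Illustrative retry-with-exponential-backoff helper for rate-limit errors.
import time

class RateLimitError(Exception):
    """Hypothetical error raised when the API answers with HTTP 429."""

def with_backoff(call, retries=3, base_delay=1.0):
    """Retry `call` on RateLimitError, doubling the delay each attempt."""
    for attempt in range(retries):
        try:
            return call()
        except RateLimitError:
            if attempt == retries - 1:
                raise  # out of retries; surface the error
            time.sleep(base_delay * (2 ** attempt))
```

Usage: wrap any zero-argument callable, e.g. `with_backoff(lambda: search_papers("attention"))`.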
Related MCP Servers
PPTAgent
An Agentic Framework for Reflective PowerPoint Generation
open-ptc-agent
An open source implementation of code execution with MCP (Programmatic Tool Calling)
TradingAgents-MCPmode
TradingAgents-MCPmode is an innovative multi-agent trading analysis system that integrates Model Context Protocol (MCP) tools to enable intelligent stock analysis and trading decision workflows. Through the collaboration of multiple specialized agents, the system provides comprehensive market analysis, investment recommendations, and risk management.
ultimate_mcp_server
Comprehensive MCP server exposing dozens of capabilities to AI agents: multi-provider LLM delegation, browser automation, document processing, vector ops, and cognitive memory systems
lc2mcp
Convert LangChain tools to FastMCP tools
modelscope
ModelScope's official MCP Server (in active development).