mcp-zenodo
Tool-based LLM integration with Zenodo via the Model Context Protocol (MCP)
To add the server to Claude Code over stdio:

```bash
claude mcp add --transport stdio mskazemi-mcp-zenodo uvx mcp_api
```
How to use
Zenodo MCP provides two complementary implementations to interact with Zenodo records via the Model Context Protocol. The MCP API implementation exposes a FastAPI-based service that can be integrated with LangChain, LangGraph, and OpenAI-compatible clients, enabling search, retrieval, metadata access, citations, and file downloads as MCP tools. The MCP SDK Core offers a Python-based MCP server designed for direct integration with development environments like Cursor IDE, providing MCP-compliant access to Zenodo data without the need for extra middleware. Together, they let you either embed Zenodo capabilities into AI workflows (via the API) or run a lightweight, environment-embedded MCP service (via the SDK Core).
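For example, once the MCP API is running (see the install steps below), its tools can be pulled into a LangGraph agent. The snippet below is only a sketch: it assumes the langchain-mcp-adapters and langgraph packages, an OpenAI API key in the environment, and an MCP endpoint served at http://localhost:8000/mcp over streamable HTTP; the actual endpoint path and transport are defined by the mcp_api implementation, so adjust accordingly.

```python
# Sketch: expose the Zenodo MCP tools to a LangGraph agent.
# Assumptions (not defined by this repo): the MCP API is reachable at
# http://localhost:8000/mcp over streamable HTTP, and OPENAI_API_KEY is set.
import asyncio

from langchain_mcp_adapters.client import MultiServerMCPClient
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent


async def main() -> None:
    client = MultiServerMCPClient(
        {
            "zenodo": {
                "url": "http://localhost:8000/mcp",  # assumed endpoint path
                "transport": "streamable_http",
            }
        }
    )
    tools = await client.get_tools()  # MCP tools become LangChain tools
    agent = create_react_agent(ChatOpenAI(model="gpt-4o-mini"), tools)
    result = await agent.ainvoke(
        {"messages": [{"role": "user", "content": "Find recent Zenodo records about coral reef datasets."}]}
    )
    print(result["messages"][-1].content)


asyncio.run(main())
```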
How to install
Prerequisites:
- Python 3.9+ and pip
- Git
- (Optional) Node.js, if you plan to use additional tooling; it is not required for either Python implementation
Install and set up both implementations:
- Clone the repository
- Install and run the MCP SDK Core (Cursor IDE integration)
```bash
cd zenodo-mcp/mcp_sdk_core
python -m venv venv
source venv/bin/activate  # on Windows use venv\Scripts\activate
pip install -r requirements.txt
```
Follow the SDK Core README for Cursor-specific configuration (mcp.json) and integration steps.
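As an illustration only, a Cursor mcp.json entry for the SDK Core could look roughly like the sketch below. The interpreter path, entry-point script, and environment variable name are placeholders, not values defined by this project; use whatever mcp_sdk_core/README.md specifies.

```json
{
  "mcpServers": {
    "zenodo": {
      "command": "/path/to/zenodo-mcp/mcp_sdk_core/venv/bin/python",
      "args": ["/path/to/zenodo-mcp/mcp_sdk_core/main.py"],
      "env": {
        "ZENODO_API_TOKEN": "your-zenodo-token"
      }
    }
  }
}
```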
- Install and run the MCP API (LangChain/LangGraph/OpenAI-compatible API)
```bash
cd zenodo-mcp/mcp_api
python -m venv venv
source venv/bin/activate
pip install -r requirements.txt
```
Set up environment variables (create .env from .env.example and provide your Zenodo API token)
```bash
cp .env.example .env
nano .env  # or your preferred editor
```
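The exact contents of .env.example are project-specific; at a minimum it will need your Zenodo API token under whatever variable name the example file defines. The line below uses a hypothetical name purely for illustration.

```
# Hypothetical variable name; use the one given in .env.example
ZENODO_API_TOKEN=your-zenodo-token
```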
Run the API server
```bash
uvicorn server.main:app --host 0.0.0.0 --port 8000
```
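To confirm the server is up and see which MCP tools it exposes, a quick check along the following lines may help. It assumes the official mcp Python SDK (pip install mcp) and an MCP endpoint at /mcp served over streamable HTTP; the real path and transport depend on how mcp_api mounts its MCP routes, so adjust the URL if needed.

```python
# Hedged smoke test: connect to the running MCP API and list its tools.
# The endpoint http://localhost:8000/mcp is an assumption, not a documented path.
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client


async def main() -> None:
    async with streamablehttp_client("http://localhost:8000/mcp") as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Available tools:", [tool.name for tool in tools.tools])


asyncio.run(main())
```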
Notes:
- If you intend to expose the API to other services, you may run it behind a reverse proxy or containerize the application (a starting-point Dockerfile is sketched after these notes).
- Ensure your Zenodo API token has appropriate scopes for the operations you plan to perform.
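If you do containerize the API, a minimal Dockerfile could look like the sketch below. It assumes requirements.txt covers the runtime dependencies and that server.main:app is the ASGI entry point, as in the uvicorn command above; treat it as a starting point rather than a supported configuration.

```dockerfile
# Hedged starting point for containerizing mcp_api
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
EXPOSE 8000
CMD ["uvicorn", "server.main:app", "--host", "0.0.0.0", "--port", "8000"]
```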
Additional notes
Tips and common issues:
- Ensure Python virtual environments are activated when installing dependencies for each component.
- If you encounter port conflicts, change the port in the uvicorn command (e.g., --port 8001).
- For the API, store sensitive tokens in a dedicated environment file (.env) and do not commit it to version control.
- The SDK Core is intended for Cursor IDE integration; refer to the mcp_sdk_core/README.md for Cursor-specific configuration nuances.
- If you must run in Docker, package the Python-based components as container images and adjust the mcp_config accordingly to use a docker command (see the sketch below).
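As a rough illustration of that docker command format, an MCP client configuration entry could point at a containerized SDK Core along these lines. The image name and environment variable are placeholders, not artifacts published by this project.

```json
{
  "mcpServers": {
    "zenodo": {
      "command": "docker",
      "args": ["run", "--rm", "-i", "-e", "ZENODO_API_TOKEN", "zenodo-mcp-sdk-core:latest"],
      "env": {
        "ZENODO_API_TOKEN": "your-zenodo-token"
      }
    }
  }
}
```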
Related MCP Servers
CoexistAI
CoexistAI is a modular, developer-friendly research assistant framework. It enables you to build, search, summarize, and automate research workflows using LLMs, web search, Reddit, YouTube, and mapping tools, all with simple MCP tool calls, API calls, or Python functions.
fullstack-langgraph-nextjs-agent
Production-ready Next.js template for building AI agents with LangGraph.js. Features MCP integration for dynamic tool loading, human-in-the-loop tool approval, persistent conversation memory with PostgreSQL, and real-time streaming responses. Built with TypeScript, React, Prisma, and Tailwind CSS.
MCP-MultiServer-Interoperable-Agent2Agent-LangGraph-AI-System
This project demonstrates a decoupled real-time agent architecture that connects LangGraph agents to remote tools served by custom MCP (Model Context Protocol) servers. The architecture enables a flexible and scalable multi-agent system where each tool can be hosted independently (via SSE or STDIO), offering modularity and cloud-deployable execution.
ToolRAG
Unlimited LLM tools, zero context penalties — ToolRAG serves exactly the LLM tools your user-query demands.
ai-learning
AI Learning: A comprehensive repository for Artificial Intelligence and Machine Learning resources, primarily using Jupyter Notebooks and Python. Explore tutorials, projects, and guides covering foundational to advanced concepts in AI, ML, DL, and generative/agentic AI.
mcp-templates
A flexible platform that provides Docker & Kubernetes backends, a lightweight CLI (mcpt), and client utilities for seamless MCP integration. Spin up servers from templates, route requests through a single endpoint with load balancing, and support both deployed (HTTP) and local (stdio) transports — all with sensible defaults and YAML-based configs.