
mcp-zenodo

Tool-based LLM integration with Zenodo via the Model Context Protocol (MCP)

Installation
Run this command in your terminal to add the MCP server to Claude Code:

    claude mcp add --transport stdio mskazemi-mcp-zenodo uvx mcp_api

How to use

Zenodo MCP provides two complementary implementations to interact with Zenodo records via the Model Context Protocol. The MCP API implementation exposes a FastAPI-based service that can be integrated with LangChain, LangGraph, and OpenAI-compatible clients, enabling search, retrieval, metadata access, citations, and file downloads as MCP tools. The MCP SDK Core offers a Python-based MCP server designed for direct integration with development environments like Cursor IDE, providing MCP-compliant access to Zenodo data without the need for extra middleware. Together, they let you either embed Zenodo capabilities into AI workflows (via the API) or run a lightweight, environment-embedded MCP service (via the SDK Core).
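As a rough sketch of the API-side usage, the snippet below composes a search request URL for the FastAPI service. The `/search` path and its `q`/`size` parameters are illustrative assumptions, not the project's documented API; consult the mcp_api README for the real endpoints.

```python
from urllib.parse import urlencode

# Hypothetical example: the endpoint path and parameter names are
# assumptions for illustration only.
BASE_URL = "http://localhost:8000"

def build_search_url(query: str, size: int = 5) -> str:
    """Compose a URL for a hypothetical Zenodo record-search endpoint."""
    return f"{BASE_URL}/search?{urlencode({'q': query, 'size': size})}"

print(build_search_url("climate data", size=3))
# → http://localhost:8000/search?q=climate+data&size=3
```

A LangChain or OpenAI-compatible client would call the same endpoints through its MCP tool wrappers rather than raw HTTP.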

How to install

Prerequisites:

  • Python 3.9+ and pip
  • Git
  • (Optional) Node.js if you plan to use additional tooling, but not required for the Python implementations
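Before creating the virtual environments, you can confirm that the interpreter meets the 3.9+ requirement with a one-liner:

```shell
# Fails with an AssertionError if the interpreter is older than 3.9.
python3 -c 'import sys; assert sys.version_info >= (3, 9), sys.version'
```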

Install and set up both implementations:

  1. Clone the repository

    git clone https://github.com/yourusername/zenodo-mcp.git

  2. Install and run the MCP SDK Core (Cursor IDE integration)

    cd zenodo-mcp/mcp_sdk_core
    python -m venv venv
    source venv/bin/activate   # on Windows use venv\Scripts\activate
    pip install -r requirements.txt

    Follow the SDK Core README for Cursor-specific configuration (mcp.json) and integration steps.

  3. Install and run the MCP API (LangChain/LangGraph/OpenAI-compatible API)

    cd zenodo-mcp/mcp_api
    python -m venv venv
    source venv/bin/activate   # on Windows use venv\Scripts\activate
    pip install -r requirements.txt

    Set up environment variables (create .env from .env.example and provide your Zenodo API token)

    cp .env.example .env
    nano .env   # or your preferred editor

    Run the API server

    uvicorn server.main:app --host 0.0.0.0 --port 8000
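The server reads its configuration from the environment. A minimal sketch of how a Zenodo token from .env would be turned into a request header; the variable name `ZENODO_API_TOKEN` is an assumption here, so use whatever name `.env.example` actually defines:

```python
import os

# Assumption: the token variable is named ZENODO_API_TOKEN; check
# .env.example for the actual name the project expects.
os.environ.setdefault("ZENODO_API_TOKEN", "example-token")

token = os.environ["ZENODO_API_TOKEN"]
# The Zenodo REST API accepts the token as a Bearer authorization header.
headers = {"Authorization": f"Bearer {token}"}
print(headers["Authorization"])
```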

Notes:

  • If you intend to expose the API to other services, consider running it behind a reverse proxy or containerizing the application.
  • Ensure your Zenodo API token has appropriate scopes for the operations you plan to perform.

Additional notes

Tips and common issues:

  • Ensure Python virtual environments are activated when installing dependencies for each component.
  • If you encounter port conflicts, change the port in the uvicorn command (e.g., --port 8001).
  • For the API, store sensitive tokens in a dedicated environment file (.env) and do not commit it to version control.
  • The SDK Core is intended for Cursor IDE integration; refer to the mcp_sdk_core/README.md for Cursor-specific configuration nuances.
  • If you must run in Docker, build container images for the Python-based components and adjust your MCP configuration to use the docker command format.
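For reference, a Cursor-style mcp.json entry for the SDK Core might look like the following. The server name, interpreter path, entry module, and token variable are placeholders; the authoritative values are in mcp_sdk_core/README.md. A Docker-based variant would swap the command and args for a docker run invocation.

```json
{
  "mcpServers": {
    "zenodo": {
      "command": "/path/to/zenodo-mcp/mcp_sdk_core/venv/bin/python",
      "args": ["-m", "server"],
      "env": { "ZENODO_API_TOKEN": "your-token-here" }
    }
  }
}
```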

Related MCP Servers

CoexistAI

440

CoexistAI is a modular, developer-friendly research assistant framework. It enables you to build, search, summarize, and automate research workflows using LLMs, web search, Reddit, YouTube, and mapping tools, all with simple MCP tool calls, API calls, or Python functions.

fullstack-langgraph-nextjs-agent

85

Production-ready Next.js template for building AI agents with LangGraph.js. Features MCP integration for dynamic tool loading, human-in-the-loop tool approval, persistent conversation memory with PostgreSQL, and real-time streaming responses. Built with TypeScript, React, Prisma, and Tailwind CSS.

MCP-MultiServer-Interoperable-Agent2Agent-LangGraph-AI-System

24

This project demonstrates a decoupled real-time agent architecture that connects LangGraph agents to remote tools served by custom MCP (Modular Command Protocol) servers. The architecture enables a flexible and scalable multi-agent system where each tool can be hosted independently (via SSE or STDIO), offering modularity and cloud-deployable execution.

ToolRAG

22

Unlimited LLM tools, zero context penalties — ToolRAG serves exactly the LLM tools your user-query demands.

ai-learning

17

AI Learning: A comprehensive repository for Artificial Intelligence and Machine Learning resources, primarily using Jupyter Notebooks and Python. Explore tutorials, projects, and guides covering foundational to advanced concepts in AI, ML, DL, and Generative/Agentic AI.

mcp-templates

15

A flexible platform that provides Docker & Kubernetes backends, a lightweight CLI (mcpt), and client utilities for seamless MCP integration. Spin up servers from templates, route requests through a single endpoint with load balancing, and support both deployed (HTTP) and local (stdio) transports — all with sensible defaults and YAML-based configs.
