papersgpt-for-zotero
A powerful Zotero AI and MCP plugin with ChatGPT, Gemini 3.1, Claude, Grok, DeepSeek, OpenRouter, Kimi 2.5, GLM 5, SiliconFlow, GPT-oss, Gemma 3, Qwen 3.5
```shell
claude mcp add --transport stdio papersgpt-papersgpt-for-zotero \
  docker run -i \
    --env MCP_SSE_URL="http://localhost:9080/sse" \
    --env ZOTERO_HOST="http://localhost:23120" \
    --env ZOTERO_PLUGIN_PATH="/path/to/papersgpt-for-zotero/plugin" \
    papersgpt-for-zotero
```
How to use
PapersGPT for Zotero exposes an MCP server that bridges your Zotero library with a range of AI models and prompts through the PapersGPT plugin ecosystem. The MCP server enables chat and analysis directly over your Zotero content, supporting fast full-text search, summarization, extraction of key insights, and insertion of structured notes back into Zotero. Once connected via MCP, you can use compatible chatbot clients to query Zotero items, run built-in prompts (e.g., summaries, literature reviews, theoretical frameworks), and leverage local or remote LLMs configured in your client to process PDFs and metadata. The integration is designed to be lightweight and fast, emphasizing seamless interaction with Zotero notes and collections while minimizing data transfer to remote services.
To use it, start the MCP server (for example via Docker as described in the installation steps) and point your MCP-enabled client to the server host. In Zotero, enable the PapersGPT plugin and ensure the SSE endpoint at http://localhost:9080/sse is reachable by the MCP host. Then, in your chatbot client, connect to the MCP server using the server name papersgpt-for-zotero and begin issuing queries such as “summarize key findings from [paper], focusing on methodology” or “extract data points on sample size and outcomes.”
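Before issuing queries, it can help to confirm that the SSE endpoint is actually reachable from the MCP host. A minimal probe sketch, assuming curl is installed and the server uses the default port 9080 (the `check_sse` helper is illustrative, not part of the plugin):

```shell
# check_sse: probe the SSE endpoint before connecting a client.
# Prints "reachable" when the endpoint answers within the timeout,
# otherwise "not reachable". Assumes curl is installed.
check_sse() {
  url="${1:-http://localhost:9080/sse}"
  if curl -s --max-time 3 -o /dev/null "$url"; then
    echo "reachable"
  else
    echo "not reachable - check that the MCP server is running"
  fi
}

check_sse http://localhost:9080/sse
```

If the probe reports the endpoint as unreachable, verify the server process and any port mappings before debugging the client side.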
How to install
Prerequisites:
- Docker installed on your system (Windows, macOS, or Linux)
- Zotero installed with the PapersGPT plugin (as described in the PapersGPT quickstart)
- Internet access for pulling the MCP Docker image (if using the Docker approach)

Step-by-step:
1) Install Docker: follow the official instructions for your OS (https://docs.docker.com/get-docker/).
2) Pull or build the PapersGPT for Zotero MCP image:
   - If an official image is published: docker pull papersgpt/papersgpt-for-zotero
   - If you have a local image: docker build -t papersgpt-for-zotero .
3) Run the MCP server:
   docker run -i papersgpt/papersgpt-for-zotero
   Note: if your environment requires ports, consider exposing or mapping the SSE endpoint (for example, -p 9080:9080) and ensure the Zotero plugin can reach http://localhost:9080/sse as described in the server docs.
4) Configure the Zotero/PapersGPT plugin to communicate with the MCP server, and ensure the SSE URL is reachable by your client.
5) Verify connectivity by sending a basic query from your MCP-enabled client (e.g., check server status or perform a simple search).

If you prefer a non-Docker installation, follow the vendor-specific instructions for building or running the native C++ MCP server binary and configure the same environment variables as described in the mcp_config section.
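The run step above can be sketched as a single script. The variable values below are the example values from the mcp_config section and must be adjusted to your local setup; the docker command is echoed as a dry run rather than executed, so you can inspect it before launching the container:

```shell
#!/bin/sh
# Example values from mcp_config - adjust these to your local setup.
MCP_SSE_URL="http://localhost:9080/sse"
ZOTERO_HOST="http://localhost:23120"
ZOTERO_PLUGIN_PATH="/path/to/papersgpt-for-zotero/plugin"

# Dry run: print the docker command that would start the MCP server,
# mapping the SSE port so local clients can reach http://localhost:9080/sse.
echo docker run -i -p 9080:9080 \
  --env MCP_SSE_URL="$MCP_SSE_URL" \
  --env ZOTERO_HOST="$ZOTERO_HOST" \
  --env ZOTERO_PLUGIN_PATH="$ZOTERO_PLUGIN_PATH" \
  papersgpt/papersgpt-for-zotero
```

Removing the leading `echo` runs the container for real; keep the `-p 9080:9080` mapping whenever the client connects from outside the container.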
Additional notes
Tips and known considerations:
- The MCP server is designed to be lightweight and fast, focusing on Zotero integration and local processing. Make sure your Zotero notes and attachment data are within the scope of the plugin's permissions.
- Ensure the SSE endpoint (http://localhost:9080/sse) is accessible from your MCP host and that firewall rules allow local traffic on the port.
- If you encounter authentication prompts from Zotero or model providers, configure API keys or OAuth tokens in the PapersGPT plugin as directed in the Quickstart guides.
- When using Docker, you may need to adjust memory and CPU allocations to optimize performance for large document sets.
- The server supports a range of LLMs via the client; choose models appropriate to your data privacy and cost constraints.
- For troubleshooting, verify that the environment variables in mcp_config (ZOTERO_HOST, ZOTERO_PLUGIN_PATH, MCP_SSE_URL) are correctly set to reflect your local setup.
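For the last troubleshooting point, a quick sanity check that the three mcp_config variables are set before starting the server can save a debugging round trip. A sketch, where the fallback defaults are only the example values from this page:

```shell
# Fall back to the example defaults when a variable is unset; real
# deployments should export values matching the local Zotero setup.
: "${MCP_SSE_URL:=http://localhost:9080/sse}"
: "${ZOTERO_HOST:=http://localhost:23120}"
: "${ZOTERO_PLUGIN_PATH:=/path/to/papersgpt-for-zotero/plugin}"

# Report each variable so a typo or empty value is visible at a glance.
for var in MCP_SSE_URL ZOTERO_HOST ZOTERO_PLUGIN_PATH; do
  eval "val=\$$var"
  if [ -z "$val" ]; then
    echo "ERROR: $var is not set" >&2
  else
    echo "$var=$val"
  fi
done
```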
Related MCP Servers
lobehub
The ultimate space for work and life: find, build, and collaborate with agent teammates that grow with you. We are taking the agent harness to the next level, enabling multi-agent collaboration, effortless agent team design, and introducing agents as the unit of work interaction.
chatgpt-on-wechat
CowAgent is a super AI assistant built on large language models: it can proactively think and plan tasks, access the operating system and external resources, create and execute Skills, and keep growing with long-term memory. It supports integration with Feishu, DingTalk, WeCom apps, WeChat Official Accounts, and the web, with a choice of OpenAI/Claude/Gemini/DeepSeek/Qwen/GLM/Kimi/LinkAI. It handles text, voice, images, and files, letting you quickly build a personal AI assistant or an enterprise digital employee.
LibreChat
Enhanced ChatGPT Clone: Features Agents, MCP, DeepSeek, Anthropic, AWS, OpenAI, Responses API, Azure, Groq, o1, GPT-5, Mistral, OpenRouter, Vertex AI, Gemini, Artifacts, AI model switching, message search, Code Interpreter, langchain, DALL-E-3, OpenAPI Actions, Functions, Secure Multi-User Auth, Presets, open-source for self-hosting. Active.
repomix
📦 Repomix is a powerful tool that packs your entire repository into a single, AI-friendly file. Perfect for when you need to feed your codebase to Large Language Models (LLMs) or other AI tools like Claude, ChatGPT, DeepSeek, Perplexity, Gemini, Gemma, Llama, Grok, and more.
ai-guide
Programmer Yupi's AI resource collection plus a zero-to-one Vibe Coding tutorial, covering a large-model selection guide (DeepSeek / GPT / Gemini / Claude), the latest AI news, a prompt library, an AI knowledge encyclopedia (RAG / MCP / A2A), AI programming tutorials, AI tool guides (Cursor / Claude Code / OpenClaw / TRAE / Lovable / Agent Skills), AI development framework tutorials (Spring AI / LangChain), and a guide to monetizing AI products, helping you quickly master AI technology and stay ahead of the curve. This project is the open-source documentation version and has been upgraded into Yupi's AI navigation website.
Everywhere
Context-aware AI assistant for your desktop. Ready to respond intelligently, seamlessly integrating multiple LLMs and MCP tools.