
fullscope

A no-RAG MCP server for content summarization, topic aggregation, and knowledge extraction

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio yzfly-fullscope-mcp-server \
  --env OPENAI_MODEL="MiniMax-M1" \
  --env OPENAI_API_KEY="your-minimax-api-key" \
  --env OPENAI_BASE_URL="https://api.minimaxi.com/v1" \
  --env MAX_INPUT_TOKENS="120000" \
  --env MAX_OUTPUT_TOKENS="8000" \
  -- uvx fullscope-mcp-server

How to use

FullScope-MCP is a content summarization and operations-oriented MCP server. It combines web scraping, file reading, and multiple summarization capabilities (including page-level, content-level, and topic-based summaries) with direct model calls. You can invoke model reasoning directly, fetch and save webpage content, summarize long texts, summarize PDFs and TXT files, and perform topic-based searches to surface the most relevant excerpts. The server is designed to work with the MiniMax API by default but can be pointed at other OpenAI-compatible endpoints via environment configuration.

To use the server, configure the environment with your API key and model settings, then run the server using the uvx-based installation. Once running, you can issue calls through the MCP interface to tools like call_model, scrape_webpage, summarize_content, summarize_webpage, read_and_summarize_text_file, read_and_summarize_pdf_file, and topic_based_summary. These tools enable end-to-end workflows such as extracting content from a URL, producing concise summaries at a chosen compression ratio, reading local documents for summarization, and performing topic-driven, RAG-like summaries on provided material.
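Under the hood, an MCP client invokes these tools with JSON-RPC 2.0 `tools/call` requests (this framing is defined by the MCP specification; a real client would normally use the official MCP SDK rather than building messages by hand). The sketch below shows the shape of such a request; the argument names `content` and `ratio` are illustrative guesses, so check the server's `tools/list` response for the actual parameter schema:

```python
import json


def make_tool_call(request_id: int, tool: str, arguments: dict) -> dict:
    """Build an MCP JSON-RPC 2.0 tools/call request envelope."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    }


# Hypothetical call asking the server to summarize text at a 10% ratio;
# parameter names here are assumptions, not the server's documented schema.
req = make_tool_call(1, "summarize_content", {"content": "...long text...", "ratio": 0.1})
print(json.dumps(req, indent=2))
```

Over the stdio transport, the client writes one such JSON object per line to the server process's stdin (after the standard `initialize` handshake) and reads responses from its stdout.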

How to install

Prerequisites:
- Python 3.8+ and pip
- Internet access to fetch dependencies

Installation options:

1) Install via uvx (recommended):

```bash
uvx fullscope-mcp-server
```

2) Install via pip (from PyPI):

```bash
pip install fullscope-mcp-server
```

3) From source (editable):

```bash
git clone https://github.com/yzfly/fullscope-mcp
cd fullscope-mcp
pip install -e .
```

After installation, ensure your environment is prepared with the required API configuration (see the environment variables in the README).

Additional notes

Tips and common issues:
- Ensure OPENAI_API_KEY is set; without it, requests to the model will fail.
- If using a non-MiniMax model, set OPENAI_BASE_URL to the corresponding API endpoint and choose the appropriate OPENAI_MODEL.
- Content sizing limits: text files and PDFs are summarized within ~120k characters, page content may be truncated to fit the model context, and topic-based summaries are limited to ~2k characters.
- When using the Claude Desktop integration, the provided JSON config examples show how to map environment variables and command arguments for seamless usage.
- For long documents, consider pre-splitting content or increasing MAX_INPUT_TOKENS where supported by the chosen model.
- If you upgrade dependencies, re-run the tests (pytest) and validate that all tools respond as expected.
- Optional: run the server in a virtual environment to isolate dependencies.
- This MCP server uses uv-based execution for Python; you can also run it via module invocation (e.g., `python -m fullscope_mcp_server`) if installed from source.
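For the Claude Desktop integration mentioned above, the mapping typically lives in `claude_desktop_config.json` under the `mcpServers` key. A sketch reusing the env values from the install command (the `"fullscope"` entry name is arbitrary):

```json
{
  "mcpServers": {
    "fullscope": {
      "command": "uvx",
      "args": ["fullscope-mcp-server"],
      "env": {
        "OPENAI_MODEL": "MiniMax-M1",
        "OPENAI_API_KEY": "your-minimax-api-key",
        "OPENAI_BASE_URL": "https://api.minimaxi.com/v1",
        "MAX_INPUT_TOKENS": "120000",
        "MAX_OUTPUT_TOKENS": "8000"
      }
    }
  }
}
```

Restart Claude Desktop after editing the file so the server entry is picked up.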
