mcp-oi-wiki
🌟 Wiki of OI / ICPC for LLMs. (An online strategy guide to a certain large-scale game, written for large language models; dazzling arithmetic magic included.)
claude mcp add --transport stdio shwstone-mcp-oi-wiki uv --directory <path of MCP servers>/mcp-oi-wiki run python main.py
How to use
This MCP server integrates OI-Wiki content by summarizing all 462 pages with Deepseek-V3, embedding the summaries as vectors, and storing them in a vector database. When you query, the system searches for the closest vector in the database and returns the corresponding wiki markdown. To use it, ensure you have the uv tool installed and configure the server in your MCP setup as shown in the README. Once running, you can query oi-wiki to obtain concise, wiki-formatted explanations derived from OI-Wiki content.
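The lookup described above is a nearest-neighbor search over the summary embeddings. A minimal sketch of that step, using cosine similarity (the function name, the page list, and the toy 2-D vectors are illustrative, not the server's actual code):

```python
import numpy as np

def nearest_page(query_vec, doc_vecs, pages):
    """Return the page whose summary embedding is closest to the query (cosine similarity)."""
    q = query_vec / np.linalg.norm(query_vec)
    d = doc_vecs / np.linalg.norm(doc_vecs, axis=1, keepdims=True)
    scores = d @ q  # cosine similarity of each page against the query
    return pages[int(np.argmax(scores))]

# Toy example: three fake page embeddings in 2-D
pages = ["graph/shortest-path.md", "dp/knapsack.md", "string/kmp.md"]
doc_vecs = np.array([[1.0, 0.0], [0.0, 1.0], [0.7, 0.7]])
print(nearest_page(np.array([0.9, 0.1]), doc_vecs, pages))  # graph/shortest-path.md
```

The real server uses high-dimensional embeddings and returns the full wiki markdown for the best match, but the ranking logic is the same shape.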
From the repository, the workflow is: the code ingests OI-Wiki pages, creates embeddings, and updates a local vector store (db/oi-wiki.db). You can generate or refresh this database by running the update flow: generate a new db file with the embeddings and ensure the Silicon Flow API key is available if you rely on Silicon Flow for embeddings. Queries return the relevant markdown content from OI-Wiki, allowing you to cite methods, definitions, and strategies directly in your chat or documentation.
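The ingest/update flow can be sketched as follows. This is a hedged illustration, not the repository's code: the table layout, column names, and the `fake_embed` stand-in (which a real run would replace with a call to the Silicon Flow embedding API) are all assumptions.

```python
import json
import sqlite3

def fake_embed(text):
    # Stand-in for a real embedding API call (e.g. Silicon Flow); hypothetical.
    return [float(len(text)), float(text.count(" "))]

def build_db(con, pages):
    """Store one row per wiki page: path, embedding (as JSON), and raw markdown."""
    con.execute(
        "CREATE TABLE IF NOT EXISTS wiki (page TEXT PRIMARY KEY, embedding TEXT, markdown TEXT)"
    )
    for page, md in pages.items():
        con.execute(
            "INSERT OR REPLACE INTO wiki VALUES (?, ?, ?)",
            (page, json.dumps(fake_embed(md)), md),
        )
    con.commit()

con = sqlite3.connect(":memory:")  # the real store is db/oi-wiki.db on disk
build_db(con, {"dp/knapsack.md": "# Knapsack\nClassic 0/1 knapsack DP."})
row = con.execute("SELECT page, markdown FROM wiki").fetchone()
print(row[0])  # dp/knapsack.md
```

Refreshing the database is then just re-running this flow over the current set of OI-Wiki pages.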
How to install
Prerequisites:
- Python 3.8+ and a functioning Python environment
- The uv tool, as described in the repository (this server is launched via uv)
- Git
Installation steps:
- Install the uv tool if you don’t have it: follow the installation instructions from the official source, and ensure uv is available in your PATH.
- Clone the MCP server repository with submodules:
git clone --recurse-submodules https://github.com/ShwStone/mcp-oi-wiki.git
cd mcp-oi-wiki
- Ensure dependencies are available:
- Confirm Python is installed and accessible (python --version).
- If the repository provides a requirements.txt or similar, install dependencies:
python -m pip install -r requirements.txt
- Prepare the configuration file as shown in the README, replacing placeholders with your paths:
- Create or edit your MCP configuration to include the oi-wiki server (see mcp_config example).
- Run the server via uv as configured in your mcp_config:
uv --directory <path of MCP servers>/mcp-oi-wiki run python main.py
- Optional: place your Silicon Flow API key in api.key if you intend to generate or update the db via Silicon Flow.
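Putting the steps above together, an MCP configuration entry for this server might look like the following. The `mcpServers` shape shown here is the common Claude Desktop convention, adjust it to your client's format, and replace the placeholder path with your own:

```json
{
  "mcpServers": {
    "oi-wiki": {
      "command": "uv",
      "args": [
        "--directory",
        "<path of MCP servers>/mcp-oi-wiki",
        "run", "python", "main.py"
      ]
    }
  }
}
```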
Additional notes
Tips and notes:
- The update flow relies on the Silicon Flow API key placed in api.key to generate embeddings and may require network access to the Silicon Flow service.
- The system stores embeddings in db/oi-wiki.db; keep this file in a persistent location if you plan to reuse or refresh the database.
- If you modify the OI-Wiki source or add new pages, re-run the update steps (requesting summaries and regenerating the db) to keep results current.
- Ensure the path provided in the --directory argument points to the mcp-oi-wiki folder where main.py resides.
- If you encounter issues with dependencies, verify that the correct Python environment is active and that main.py runs under it.
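As a quick sanity check of the layout described in these notes, a hypothetical helper like the one below can confirm the expected files are in place before you start the server. The function itself is not part of the repository; the file names are taken from the notes above.

```python
from pathlib import Path

def check_layout(root):
    """Report anything missing from the expected mcp-oi-wiki layout (illustrative only)."""
    root = Path(root)
    problems = []
    if not (root / "main.py").is_file():
        problems.append("main.py missing: --directory must point at the repo root")
    if not (root / "db" / "oi-wiki.db").is_file():
        problems.append("db/oi-wiki.db missing: run the update flow first")
    if not (root / "api.key").is_file():
        problems.append("api.key missing: only needed to regenerate the db")
    return problems

print(check_layout("."))
```

An empty list means the directory looks ready; otherwise each entry names the missing file and the step that produces it.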