
mcp-aurai

MCP Aurai Server - AI Advisor Server

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio lzmw-mcp-aurai-server python -m mcp_aurai.server \
  --env AURAI_MODEL="glm-4.7" \
  --env AURAI_API_KEY="your-api-key" \
  --env AURAI_BASE_URL="https://open.bigmodel.cn/api/paas/v4/" \
  --env AURAI_MAX_TOKENS="32000" \
  --env AURAI_MAX_HISTORY="50" \
  --env AURAI_TEMPERATURE="0.7" \
  --env AURAI_CONTEXT_WINDOW="200000" \
  --env AURAI_MAX_MESSAGE_TOKENS="150000"

The last five variables (AURAI_MAX_TOKENS, AURAI_MAX_HISTORY, AURAI_TEMPERATURE, AURAI_CONTEXT_WINDOW, and AURAI_MAX_MESSAGE_TOKENS) are optional; the values shown are the defaults tuned for GLM-4.7, whose default context window is 200000 tokens.
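The server is configured entirely through these environment variables. A minimal sketch of how the required and optional settings might be resolved (the variable names come from the command above; the defaults mirror the optional values shown there and are not confirmed from the server source):

```python
import os

REQUIRED = ["AURAI_API_KEY", "AURAI_BASE_URL", "AURAI_MODEL"]

# Defaults mirror the optional values in the install command above;
# the actual server may use different internal defaults.
OPTIONAL_DEFAULTS = {
    "AURAI_MAX_TOKENS": 32000,
    "AURAI_MAX_HISTORY": 50,
    "AURAI_TEMPERATURE": 0.7,
    "AURAI_CONTEXT_WINDOW": 200000,
    "AURAI_MAX_MESSAGE_TOKENS": 150000,
}

def load_config(env=os.environ):
    """Collect required settings, fail fast if any are missing,
    and fill in optional settings with their defaults."""
    missing = [k for k in REQUIRED if not env.get(k)]
    if missing:
        raise RuntimeError(f"Missing required settings: {', '.join(missing)}")
    cfg = {k: env[k] for k in REQUIRED}
    for key, default in OPTIONAL_DEFAULTS.items():
        raw = env.get(key)
        # Coerce the string from the environment to the default's type.
        cfg[key] = type(default)(raw) if raw is not None else default
    return cfg
```

Failing fast on a missing key surfaces configuration mistakes at startup rather than as authentication errors mid-conversation.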

How to use

mcp-aurai is an MCP server that provides an AI advisor workflow backed by a custom OpenAI-compatible provider. It exposes four tools: consult_aurai for guidance on programming questions, sync_context for uploading code or documentation so the upper AI has richer context, report_progress for communicating advancement steps, and get_status for inspecting the current configuration and state.

The server is optimized for GLM-4.7 and its large context window, and supports file uploads with automatic chunking for large documents.

To start using it, run the server with the proper environment variables set (API key, base URL, and model), then configure Claude Code (or your MCP client) to point at the server. Once active, describe your programming issue in the chat; the system will automatically decide when to invoke consult_aurai, and you can attach or reference code and documents via sync_context as needed.
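The chunking behavior mentioned above can be sketched as follows. This is an illustrative splitter, not the server's actual implementation; it approximates token counts by characters (assuming roughly 4 characters per token), whereas the real server presumably uses a proper tokenizer:

```python
def chunk_document(text: str, max_message_tokens: int = 150_000,
                   chars_per_token: int = 4) -> list[str]:
    """Split a large document into pieces that each fit the per-message
    token budget (AURAI_MAX_MESSAGE_TOKENS). Character-based approximation."""
    max_chars = max_message_tokens * chars_per_token
    # Slice the text into consecutive windows of at most max_chars characters.
    return [text[start:start + max_chars]
            for start in range(0, len(text), max_chars)]
```

Each chunk can then be sent as a separate sync_context message, which is why AURAI_MAX_MESSAGE_TOKENS must be large enough for the chunk size (see the notes below).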

How to install

Prerequisites:

  • Python 3.11+ installed
  • Git installed
  • Basic familiarity with Python virtual environments
  1. Clone the repository and navigate into it:

     git clone https://github.com/your-org/mcp-aurai-server.git
     cd mcp-aurai-server

  2. (Optional but recommended) Create and activate a virtual environment:

     python -m venv venv
     venv\Scripts\activate       # Windows
     source venv/bin/activate    # macOS/Linux

  3. Install development dependencies in editable mode:

     pip install -e ".[all-dev]"

  4. Create or copy the environment example and fill in your credentials:

     cp .env.example .env

    Edit .env to include AURAI_API_KEY, AURAI_BASE_URL, AURAI_MODEL, etc.

  5. Run the MCP server module (as configured in mcp_config):

     python -m mcp_aurai.server

  6. Verify installation by running a quick test, for example using the included test utility:

     python .ai_temp/test_file_upload_fix.py

    Expected: all tests pass

  7. If using Claude Code, add the MCP server configuration to Claude mcp settings (see README for exact commands).
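As a sanity check after step 4, a small script like the following can confirm that your .env file contains the required keys before you start the server. This is a hypothetical helper, not shipped with the repository, and it handles only simple KEY=VALUE lines:

```python
from pathlib import Path

def parse_env_file(path: str) -> dict[str, str]:
    """Very small .env reader: KEY=VALUE lines; blank lines and
    '#' comments are ignored. Not a full dotenv implementation."""
    settings = {}
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        settings[key.strip()] = value.strip().strip('"')
    return settings

def check_required(settings: dict[str, str]) -> list[str]:
    """Return the names of required AURAI settings that are missing or empty."""
    required = ["AURAI_API_KEY", "AURAI_BASE_URL", "AURAI_MODEL"]
    return [k for k in required if not settings.get(k)]
```

Running check_required(parse_env_file(".env")) should return an empty list before you proceed to step 5.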

Additional notes

Tips and common issues:

  • Ensure AURAI_BASE_URL, AURAI_API_KEY, and AURAI_MODEL are set and valid; incorrect keys or endpoints will cause authentication or model-not-found errors.
  • The default token settings are tuned for GLM-4.7 (context window ~200k tokens, max_message_tokens ~150k, max_tokens ~32k). You can override via AURAI_CONTEXT_WINDOW, AURAI_MAX_MESSAGE_TOKENS, and AURAI_MAX_TOKENS if you use another model.
  • For large file uploads, the sync_context feature will split files into chunks automatically. If you notice missing content in the upper AI, verify that AURAI_MAX_MESSAGE_TOKENS is sufficient for the inbound chunk size.
  • When upgrading from older versions, the v2.2.0 migration guidance in the README explains how to switch providers to the custom OpenAI-compatible API.
  • Use --scope user in Claude Code to ensure the MCP configuration persists across sessions.
  • If you encounter 401 or 404 errors, re-check the API key and model name, and ensure the base URL matches the provider's API (OpenAI-compatible endpoint recommended).
  • The tools (consult_aurai, sync_context, report_progress, get_status) are designed to minimize history growth and automatically clear history when the upper AI resolves the query.
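The relationship between the three token limits in the notes above can be expressed as a simple budget check. This is illustrative only; the server's actual accounting may differ, and the defaults here are the GLM-4.7 values quoted above:

```python
def fits_budget(prompt_tokens: int, history_tokens: int,
                context_window: int = 200_000,
                max_tokens: int = 32_000,
                max_message_tokens: int = 150_000) -> bool:
    """A request fits if the inbound message stays within the per-message
    cap, and prompt + history + reserved completion space (max_tokens)
    all fit inside the model's context window."""
    if prompt_tokens > max_message_tokens:
        return False
    return prompt_tokens + history_tokens + max_tokens <= context_window
```

This is why a chunk that exceeds AURAI_MAX_MESSAGE_TOKENS is rejected even when the context window still has room, and why trimming history (AURAI_MAX_HISTORY) frees budget for larger prompts.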
