
lingti-bot

🐕⚡ An AI bot built on the motto "minimalism above all, efficiency is king, compile once, run anywhere, instant integration"

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio ruilisi-lingti-bot lingti-bot relay --provider deepseek --api-key sk-xxx

How to use

lingti-bot is an MCP server that exposes AI-assistant capabilities locally via a single, self-contained Go binary. It supports multiple AI providers (e.g., deepseek, claude, kimi, minimax, gemini, openai) and can act as a gateway for various chat platforms through its relay/multi-channel workflow. Use the relay mode to connect to an AI provider and generate responses in-browser or through supported MCP clients. The system includes built-in support for a web chat UI, health diagnostics, and per-channel model configurations, enabling you to run a lightweight, zero-dependency AI assistant across platforms without needing a full Node.js or Python stack.
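The registration step above can be repeated for any of the listed providers. As a minimal sketch, the helper below assembles the same `claude mcp add` invocation shown in the Installation section for each provider; the `sk-xxx` key is a placeholder, and `mcp_add_cmd` is a hypothetical helper name, not part of the project.

```shell
#!/bin/sh
# Sketch: compose the `claude mcp add` registration command from the
# Installation section for a given provider. `mcp_add_cmd` is a
# hypothetical helper; the API key argument is a placeholder.
mcp_add_cmd() {
  provider="$1"
  api_key="$2"
  printf 'claude mcp add --transport stdio ruilisi-lingti-bot lingti-bot relay --provider %s --api-key %s\n' \
    "$provider" "$api_key"
}

# Print one registration command per provider listed on this page.
for p in deepseek claude kimi minimax gemini openai; do
  mcp_add_cmd "$p" "sk-xxx"
done
```

Each printed line can be pasted into a terminal once the placeholder key is replaced with a real one for that provider.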

How to install

Prerequisites:

  • A machine with a compatible OS (Linux/macOS/Windows) and a modern CPU
  • Network access to download the binary or build tools

Option A: Quick install (prebuilt binary)

  1. Run the installation script provided by the project (example shown): curl -fsSL https://files.lingti.com/install-bot.sh | bash
  2. After installation, start the bot in relay mode with your AI provider: lingti-bot relay --provider deepseek --api-key YOUR_API_KEY

Option B: Build from source (Go environment)

  1. Clone the repository: git clone https://github.com/ruilisi/lingti-bot.git
  2. Build the binary: cd lingti-bot && make
  3. Run the built binary: ./dist/lingti-bot relay --provider deepseek --api-key YOUR_API_KEY
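The run step above can be wrapped in a small launcher so the key stays out of your shell history. This is only a sketch: the `DEEPSEEK_API_KEY` and `DRY_RUN` environment-variable names are assumptions, not something the project defines; the binary path and relay flags match steps 2–3.

```shell
#!/bin/sh
# Sketch of a launcher for the binary built in step 2. The env var
# names (DEEPSEEK_API_KEY, DRY_RUN) are assumptions for illustration;
# the relay flags mirror the examples in this document.
set -eu

BOT=${BOT:-./dist/lingti-bot}       # output path from `make`
PROVIDER=${PROVIDER:-deepseek}
KEY=${DEEPSEEK_API_KEY:-YOUR_API_KEY}

cmd="$BOT relay --provider $PROVIDER --api-key $KEY"

if [ "${DRY_RUN:-1}" = "1" ]; then
  echo "$cmd"                       # show what would run
else
  exec $cmd                         # replace the shell with the bot
fi
```

Running with `DRY_RUN=0 DEEPSEEK_API_KEY=... sh launch.sh` would start the relay for real.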

Notes:

  • The relay command is used to connect to external AI providers. Replace YOUR_API_KEY with your actual API key for the chosen provider.
  • The default port for the built-in web chat UI can be overridden if needed via command options.

Additional notes

Tips and common issues:

  • Health: Use lingti-bot doctor to verify configuration, connections, and dependencies.
  • Per-channel models: You can configure different models per Channel/Agent to tailor responses per platform.
  • Local-first: All data stays on your machine unless you enable cloud-relay features. This mitigates data exposure risk.
  • Docker deployment is supported: you can containerize lingti-bot for easy redeployment.
  • If you run into issues, check that your API keys and provider settings are valid, and ensure network access to provider endpoints.
  • For cloud-relay setups (WeCom/Feishu/WeChat/Slack), follow the docs to configure platform credentials and platform-specific parameters.
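Since Docker deployment is supported, a containerized relay might look like the sketch below. Note the assumptions: the image name/tag (`lingti-bot:latest`), the restart policy, and the `API_KEY` env var name are all illustrative choices, not documented values; only the relay flags come from this page.

```shell
#!/bin/sh
# Sketch: compose a `docker run` command for a containerized relay.
# The image tag, container name, and restart policy are assumptions;
# the docs only state that Docker deployment is supported.
set -eu

IMAGE=${IMAGE:-lingti-bot:latest}
KEY=${API_KEY:-YOUR_API_KEY}

# -d runs detached; --restart=always keeps the bot up across reboots.
run_cmd="docker run -d --restart=always --name lingti-bot $IMAGE relay --provider deepseek --api-key $KEY"
echo "$run_cmd"
```

After building an image from the repo, the printed command can be run directly; `lingti-bot doctor` inside the container is one way to confirm the configuration took effect.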
