lingti-bot
🐕⚡ An AI Bot built on the motto "minimalism above all, efficiency is king, compile once, run everywhere, connect in seconds"
claude mcp add --transport stdio ruilisi-lingti-bot lingti-bot relay --provider deepseek --api-key sk-xxx
How to use
lingti-bot is an MCP server that exposes AI-assistant capabilities locally via a single, self-contained Go binary. It supports multiple AI providers (e.g., deepseek, claude, kimi, minimax, gemini, openai) and can act as a gateway to various chat platforms through its relay/multi-channel workflow. Use relay mode to connect to an AI provider and generate responses in the browser or through supported MCP clients. The binary includes built-in support for a web chat UI, health diagnostics, and per-channel model configuration, so you can run a lightweight, zero-dependency AI assistant across platforms without needing a full Node.js or Python stack.
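Clients that register MCP servers through a JSON configuration file (for example, Claude Desktop's mcpServers section) can express the same stdio command shown above declaratively. The entry below is a sketch: the file name and location depend on your client, and the "lingti-bot" key is an arbitrary label.

```json
{
  "mcpServers": {
    "lingti-bot": {
      "command": "lingti-bot",
      "args": ["relay", "--provider", "deepseek", "--api-key", "sk-xxx"]
    }
  }
}
```

Replace sk-xxx with your real provider API key before use.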
How to install
Prerequisites:
- A machine with a compatible OS (Linux/macOS/Windows) and a modern CPU
- Network access to download the binary or build tools
Option A: Quick install (prebuilt binary)
- Run the installation script provided by the project (example shown): curl -fsSL https://files.lingti.com/install-bot.sh | bash
- After installation, start the bot in relay mode with your AI provider: lingti-bot relay --provider deepseek --api-key YOUR_API_KEY
Option B: Build from source (Go environment)
- Clone the repository: git clone https://github.com/ruilisi/lingti-bot.git
- Build the binary: cd lingti-bot && make
- Run the built binary: ./dist/lingti-bot relay --provider deepseek --api-key YOUR_API_KEY
Notes:
- The relay command is used to connect to external AI providers. Replace YOUR_API_KEY with your actual API key for the chosen provider.
- The default port for the built-in web chat UI can be overridden if needed via command-line options.
Additional notes
Tips and common issues:
- Health: Use lingti-bot doctor to verify configuration, connections, and dependencies.
- Per-channel models: You can configure different models per Channel/Agent to tailor responses per platform.
- Local-first: All data stays on your machine unless you enable cloud-relay features. This mitigates data exposure risk.
- Docker deployment is supported: you can containerize lingti-bot for easy redeployment.
- If you run into issues, check that your API keys and provider settings are valid, and ensure network access to provider endpoints.
- For cloud-relay setups (WeCom/Feishu/WeChat/Slack), follow the docs to configure platform credentials and platform-specific parameters.
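The Docker deployment mentioned above can be sketched with a multi-stage build. Everything below is an assumption based on the make-based build in Option B, not taken from the project docs: the Go base image version, the dist/ output path, and the default command may all differ in practice.

```dockerfile
# Hypothetical multi-stage build: compile with the Go toolchain, then copy
# the single static binary into a minimal runtime image.
FROM golang:1.22 AS build
WORKDIR /src
COPY . .
RUN make

FROM gcr.io/distroless/base-debian12
COPY --from=build /src/dist/lingti-bot /usr/local/bin/lingti-bot
ENTRYPOINT ["lingti-bot"]
CMD ["relay", "--provider", "deepseek"]
```

A typical run would then look like: docker build -t lingti-bot . followed by docker run --rm lingti-bot relay --provider deepseek --api-key YOUR_API_KEY.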