
DeepCo

A Chat Client for LLMs, written in Compose Multiplatform.

Installation
Run the following command in your terminal to add the MCP server to Claude Code:

claude mcp add --transport stdio succlz123-deepco uvx succlz123/deepco-mcp-server

How to use

DeepCo is a cross-platform chat client that supports configuring MCP servers to talk to various LLM providers. This MCP server acts as a bridge between DeepCo and your chosen language model backends, such as OpenAI, Grok, Google Gemini, DeepSeek, and other OpenAI-compatible APIs, as well as local/private models via LM Studio or Ollama.

In DeepCo, open the MCP/Server area in the app and add a new server entry named 'deepco'. Configure the endpoint (by default, the local endpoint the server runs on) and provide any required API keys or access tokens for the providers you plan to use.

The server exposes standard MCP endpoints for chat, prompt management, and character adaptation (via SillyTavern assets), so you can route conversations to the appropriate LLM backend, manage prompts, and use predefined characters during chats.
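
Under the hood, an MCP server added with --transport stdio exchanges JSON-RPC 2.0 messages with the client over stdin/stdout. A minimal sketch of the messages DeepCo (or any MCP client) would send — the initialize handshake is standard MCP, but the "chat" tool name and its arguments are assumptions for illustration only:

```python
import json

def make_request(req_id, method, params):
    """Build one JSON-RPC 2.0 request line for the MCP stdio transport."""
    return json.dumps({"jsonrpc": "2.0", "id": req_id, "method": method, "params": params})

# The handshake every MCP client performs first.
init = make_request(1, "initialize", {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": {"name": "deepco", "version": "0.0.1"},
})

# Invoking a tool the server advertises; "chat" is a hypothetical tool name here.
call = make_request(2, "tools/call", {
    "name": "chat",
    "arguments": {"prompt": "Hello"},
})

print(init)
print(call)
```

Each line is written to the server's stdin, and responses come back on stdout, which is why the transport needs no open network port on its own.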

How to install

Prerequisites:

  • macOS: brew install uv
  • macOS: brew install node (if you plan to use Node-based tooling alongside it)
  • Windows: winget install --id=astral-sh.uv -e and install Node.js LTS
  • Linux: ensure uv and Node.js are available via your distro's package manager

Install and run DeepCo MCP server:

  1. Install prerequisites (uv and Node.js) according to your OS:
    • macOS: brew install uv && brew install node
    • Windows: follow MCP ENV instructions to install uv and Node.js
    • Linux: use your distro's package manager to install uv and Node.js
  2. (Optional) Install the MCP server package persistently: uv tool install succlz123/deepco-mcp-server
  3. Run the MCP server: uvx succlz123/deepco-mcp-server (uvx fetches and runs the package in an ephemeral environment, so step 2 can be skipped)
  4. In the DeepCo desktop app, go to the MCP settings and add a server named 'deepco' pointing to the local endpoint (default port shown by the server). Provide any necessary environment variables (API keys, endpoints) in the server configuration.
  5. If needed, configure provider keys (OPENAI_API_KEY, GROK_API_KEY, GEMINI_API_KEY, etc.) as environment variables for the MCP server.
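
Before starting the server, you can sanity-check that the provider keys from step 5 are actually set. A minimal sketch — the variable names follow the list in this document, but which keys the server really requires depends on the providers you enable:

```python
import os

REQUIRED_KEYS = ["OPENAI_API_KEY", "GROK_API_KEY", "GEMINI_API_KEY"]

def missing_keys(keys, env=os.environ):
    """Return the subset of keys that are unset or empty in the environment."""
    return [k for k in keys if not env.get(k)]

# Example: only OPENAI_API_KEY is set.
env = {"OPENAI_API_KEY": "sk-example"}
print(missing_keys(REQUIRED_KEYS, env))  # → ['GROK_API_KEY', 'GEMINI_API_KEY']
```

Running a check like this before launch surfaces missing credentials immediately, instead of as an opaque startup failure later.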

Notes:

  • The actual package name and run command may vary; follow the exact commands shown by your package manager when installing the 'deepco-mcp-server' package.

Additional notes

Environment variables you may need:

  • OPENAI_API_KEY (for OpenAI/OpenAI-compatible providers)
  • GROK_API_KEY
  • GEMINI_API_KEY
  • OPENROUTER_API_KEY or other provider keys as applicable
  • Any provider-specific BASE_URL or MODEL_NAME configurations

Common issues:

  • Server failing to start due to missing API keys—provide keys in the MCP server config or environment before starting.
  • Port conflicts—ensure the MCP server port is not in use by another service and that the DeepCo app points to the correct local endpoint.
  • Firewall or network restrictions may block outbound requests to provider APIs.
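
To rule out the port-conflict case above, you can check whether the local endpoint's port is already in use before starting the server. A small sketch — the port number is a placeholder; use whatever port the server prints at startup:

```python
import socket

def port_in_use(port, host="127.0.0.1"):
    """Return True if something is already listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(0.5)
        return s.connect_ex((host, port)) == 0

if port_in_use(8080):  # placeholder port
    print("Port 8080 is taken; free it or point DeepCo at the correct endpoint.")
else:
    print("Port 8080 is free.")
```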

Related MCP Servers

LibreChat

34.2k

Enhanced ChatGPT Clone: Features Agents, MCP, DeepSeek, Anthropic, AWS, OpenAI, Responses API, Azure, Groq, o1, GPT-5, Mistral, OpenRouter, Vertex AI, Gemini, Artifacts, AI model switching, message search, Code Interpreter, langchain, DALL-E-3, OpenAPI Actions, Functions, Secure Multi-User Auth, Presets, open-source for self-hosting. Active.

repomix

22.2k

📦 Repomix is a powerful tool that packs your entire repository into a single, AI-friendly file. Perfect for when you need to feed your codebase to Large Language Models (LLMs) or other AI tools like Claude, ChatGPT, DeepSeek, Perplexity, Gemini, Gemma, Llama, Grok, and more.

ai-guide

8.5k

Programmer Yupi's comprehensive AI resource collection + zero-to-hero Vibe Coding tutorial: an LLM selection guide (DeepSeek / GPT / Gemini / Claude), the latest AI news, a prompt collection, an AI knowledge encyclopedia (RAG / MCP / A2A), AI programming tutorials, AI tool guides (Cursor / Claude Code / OpenClaw / TRAE / Lovable / Agent Skills), AI development framework tutorials (Spring AI / LangChain), and a guide to monetizing AI products, helping you master AI quickly and stay at the cutting edge. This project is the open-source documentation version, since upgraded into the Yupi AI navigation website.

unity

6.4k

Unity MCP acts as a bridge, allowing AI assistants (like Claude, Cursor) to interact directly with your Unity Editor via a local MCP (Model Context Protocol) Client. Give your LLM tools to manage assets, control scenes, edit scripts, and automate tasks within Unity.

Everywhere

5.6k

Context-aware AI assistant for your desktop. Ready to respond intelligently, seamlessly integrating multiple LLMs and MCP tools.

dbhub

2.2k

Zero-dependency, token-efficient database MCP server for Postgres, MySQL, SQL Server, MariaDB, SQLite.
