deepchat
🐬DeepChat - A smart assistant that connects powerful AI to your personal world
claude mcp add --transport stdio thinkinaixyz-deepchat node server.js \
  --env DEEPCHAT_PORT="8080" \
  --env DEEPCHAT_API_KEY="your-api-key-if-needed" \
  --env DEEPCHAT_LOG_LEVEL="info"
How to use
DeepChat is an open-source AI agent platform that unifies models, tools, and agents within a single interface and provides MCP (Model Context Protocol) support for tool calling. This MCP server enables coordination between the chat model and external tools or services, allowing your agents to perform actions such as code execution, web access, data retrieval, and other tasks through standardized tool calls. By running the server, you expose an MCP-compatible endpoint that clients (including the DeepChat desktop/web app) can query to discover Resources, Prompts, and Tools, and to execute tool calls in a structured, debuggable way. The platform also integrates ACP (Agent Client Protocol), so you can connect external agents as first-class models within workflows. Use this server to let your AI agents reason over tools, orchestrate actions, and return structured results to end users.
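As a rough sketch of the wire format behind those structured tool calls: an MCP client speaks JSON-RPC 2.0 to the server, discovering capabilities with methods such as `tools/list` and invoking a tool with `tools/call`. The tool name `web_fetch` and its arguments below are hypothetical placeholders, not tools DeepChat is known to expose.

```python
import json

# Discovery: ask the server which tools it exposes (MCP uses JSON-RPC 2.0).
list_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
    "params": {},
}

# Invocation: call one tool by name with structured arguments.
# "web_fetch" and its arguments are hypothetical examples.
call_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "web_fetch",
        "arguments": {"url": "https://example.com"},
    },
}

for msg in (list_request, call_request):
    print(json.dumps(msg))
```

Because every call is a plain JSON-RPC message like this, any MCP-aware client can inspect, log, and replay tool calls, which is what makes the debugging workflow described below possible.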
How to install
Prerequisites:
- Node.js (18.x or newer) and npm/yarn installed
- Basic Git and shell access

1. Clone the repository:
   - git clone https://github.com/ThinkInAIXYZ/deepchat.git
   - cd deepchat
2. Install dependencies:
   - npm install (or yarn install)
3. Build (if required by the project setup):
   - npm run build (or the project's specific build script)
4. Run the MCP server:
   - npm run start (or node server.js)
Notes:
- If the project uses a different start script, check package.json for the exact command. The server will typically bind to a port (default 8080) unless overridden by environment variables.
- Configure environment variables as described in the mcp_config section before starting the server.
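The same environment variables can also be supplied through an MCP client's JSON configuration instead of the CLI. The `mcpServers` key follows the convention used by common MCP clients (e.g. Claude Desktop); the entry name and values below are illustrative.

```json
{
  "mcpServers": {
    "thinkinaixyz-deepchat": {
      "command": "node",
      "args": ["server.js"],
      "env": {
        "DEEPCHAT_PORT": "8080",
        "DEEPCHAT_API_KEY": "your-api-key-if-needed",
        "DEEPCHAT_LOG_LEVEL": "info"
      }
    }
  }
}
```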
Additional notes
Tips and common considerations:
- Security: If you expose the MCP endpoint publicly, secure it with authentication and restrict tool access as needed.
- Environment variables: You may need to set API keys, model endpoints, or port numbers via DEEPCHAT_PORT, DEEPCHAT_API_KEY, etc., depending on your setup.
- Tool debugging: DeepChat’s MCP tooling supports a dedicated debugging window for tool calls. Use this to inspect params and returns during development.
- ACP integration: If you plan to connect external ACP agents, ensure compatibility and provide any required workspace context or credentials.
- Logging: Increase log levels (e.g., DEEPCHAT_LOG_LEVEL=debug) during development to diagnose MCP interactions and tool calls.
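When debugging tool calls, it helps to know the shape of a `tools/call` result: MCP returns a list of content blocks plus an `isError` flag. The sketch below parses a hand-written response of that shape; the payload is illustrative, not captured from DeepChat.

```python
import json

# An illustrative tools/call result in the MCP shape:
# a list of content blocks and an isError flag.
raw = json.dumps({
    "jsonrpc": "2.0",
    "id": 2,
    "result": {
        "content": [{"type": "text", "text": "fetched 1024 bytes"}],
        "isError": False,
    },
})

response = json.loads(raw)
result = response["result"]
if result["isError"]:
    print("tool call failed")
else:
    # Collect the text blocks the tool returned.
    texts = [b["text"] for b in result["content"] if b["type"] == "text"]
    print("\n".join(texts))  # prints "fetched 1024 bytes"
```

Checking `isError` before reading `content` mirrors what a tool-debugging window shows you: the raw params going out and the structured result coming back.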
Related MCP Servers
AstrBot
Agentic IM chatbot infrastructure that integrates many IM platforms, LLMs, plugins, and AI features, and can be your openclaw alternative. ✨
cursor-talk-to-figma
TalkToFigma: MCP integration between AI Agent (Cursor, Claude Code) and Figma, allowing Agentic AI to communicate with Figma for reading designs and modifying them programmatically.
archestra
Secure cloud-native MCP registry, gateway & orchestrator
better-chatbot
Just a Better Chatbot. Powered by Agent & MCP & Workflows.
argo
ARGO is an open-source AI Agent platform that brings Local Manus to your desktop. With one-click model downloads, seamless closed LLM integration, and offline-first RAG knowledge bases, ARGO becomes a DeepResearch powerhouse for autonomous thinking and task planning, while 100% of your data stays local. Supports Win/Mac/Docker.
building-an-agentic-system
An in-depth book and reference on building agentic systems like Claude Code