Witsy: desktop AI assistant / universal MCP client
claude mcp add --transport stdio nbonamy-witsy node server.js
How to use
Witsy is a universal MCP client and desktop AI assistant designed to run MCP servers with a wide range of large language models and providers. It acts as a mediator between local or remote model endpoints and MCP servers, letting you use multiple providers (OpenAI, Anthropic, Google, Ollama, Mistral, and more) through a single desktop application. With Witsy you can manage prompts, run chat completions, and perform image/video generation, transcription (speech-to-text), and various RAG workflows, all within the MCP framework. To get started, install the Witsy client, provide API keys for the providers you intend to use, and start the MCP server integration from the app. Features such as AI Commands, Prompt Anywhere, and document-based RAG extend these capabilities across applications.
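As an MCP client, Witsy registers external MCP servers through its settings. The exact schema is app-specific, but most MCP clients accept a configuration along these lines (the server name, command, and environment variable below are hypothetical placeholders, not part of Witsy's documented format):

```json
{
  "mcpServers": {
    "my-example-server": {
      "command": "node",
      "args": ["server.js"],
      "env": { "API_KEY": "..." }
    }
  }
}
```

Consult Witsy's own settings UI for the exact fields it expects; this sketch only illustrates the common `mcpServers` convention.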
How to install
Prerequisites:
- Node.js (LTS) and npm installed on your machine
- A supported MCP server configuration (as described in the README of the server)
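The prerequisites above can be verified from a terminal before you begin. A minimal sketch, assuming a POSIX shell (the `check_tool` helper is illustrative, not part of Witsy):

```shell
# Illustrative helper: report whether a required tool is on PATH
check_tool() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "found: $1"
  else
    echo "missing: $1"
  fi
}

# Check the tools the installation steps below rely on
check_tool git
check_tool node
check_tool npm
```

If any tool reports `missing`, install it before proceeding.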
Installation steps:
- Clone the repository and enter it: git clone https://github.com/nbonamy/witsy.git && cd witsy
- Install dependencies: npm ci
- Create or configure your environment with necessary API keys and settings for the providers you plan to use (OpenAI, Anthropic, Google, Ollama, etc.).
- Start the application (the MCP server integration is typically started via npm start): npm start
- Access the Witsy UI and connect to the MCP server configuration named "witsy" if prompted, or rely on the embedded MCP server setup as documented in the app.
Additional notes
Tips and common considerations:
- Ensure you have valid API keys for the providers you enable (OpenAI, Anthropic, Google, etc.). Witsy often requires these keys to be set in the application settings or environment variables.
- If you use local models via Ollama or similar, ensure those services are installed and accessible from your machine.
- For RAG and document-based chats, configure your document repositories within Witsy to enable fast retrieval.
- If you encounter port or network issues, verify that the MCP server configuration name (e.g., witsy) is correctly referenced and that the environment allows outbound API calls.
- Regularly update dependencies and keep an eye on OpenAI and other provider rate limits to avoid rejected requests or throttling.
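Since missing API keys are a common source of setup failures, it helps to confirm a key is actually visible before enabling its provider. A minimal sketch, assuming keys are passed via environment variables (the `check_key` helper and placeholder value are illustrative):

```shell
# Illustrative helper: report whether a provider API key is set in the environment
check_key() {
  if [ -n "$(printenv "$1")" ]; then
    echo "ok: $1"
  else
    echo "missing: $1"
  fi
}

# Example: set a placeholder key, then check a few common provider variables
export OPENAI_API_KEY="sk-placeholder"
check_key OPENAI_API_KEY
check_key ANTHROPIC_API_KEY
```

Any variable reported as `missing` must be set (or configured in Witsy's settings) before that provider will work.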
Related MCP Servers
repomix
📦 Repomix is a powerful tool that packs your entire repository into a single, AI-friendly file. Perfect for when you need to feed your codebase to Large Language Models (LLMs) or other AI tools like Claude, ChatGPT, DeepSeek, Perplexity, Gemini, Gemma, Llama, Grok, and more.
SearChat
Search + Chat = SearChat (AI chat with search). Supports OpenAI/Anthropic/VertexAI/Gemini APIs, DeepResearch, the SearXNG meta-search engine, and one-click Docker deployment.
better-chatbot
Just a Better Chatbot. Powered by Agent & MCP & Workflows.
mcp-gateway
A plugin-based gateway that orchestrates other MCPs and lets developers build enterprise-grade agents on top of it.
mcp-toolbox-sdk-python
Python SDK for interacting with the MCP Toolbox for Databases.
openapi
OpenAPI definitions, converters and LLM function calling schema composer.