

Witsy: desktop AI assistant / universal MCP client

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio nbonamy-witsy node server.js

How to use

Witsy is a universal MCP client and desktop AI assistant designed to run MCP servers against a wide range of large language models and providers. It acts as a mediator between local or remote model endpoints and MCP servers, letting you work with multiple providers (OpenAI, Anthropic, Google, Ollama, Mistral, and more) through a single desktop application. With Witsy you can manage prompts, run chat completions, generate images and video, transcribe speech to text, and run document-based RAG workflows, all within the MCP framework. To use it, install the Witsy client, provide API keys for the providers you intend to use, and start the MCP server integration from the app. You can also use features like AI Commands and Prompt Anywhere to extend these capabilities across applications.
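For reference, the stdio registration shown under Installation corresponds roughly to a config entry like the following. This is a sketch in the Claude Desktop-style "mcpServers" format; the server name "witsy" and the node server.js command are taken from the install command above, and the exact file layout may differ per client:

```json
{
  "mcpServers": {
    "witsy": {
      "command": "node",
      "args": ["server.js"]
    }
  }
}
```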

How to install

Prerequisites:

  • Node.js (LTS) and npm installed on your machine
  • A supported MCP server configuration (as described in the README of the server)
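You can confirm the prerequisites from a terminal before proceeding; both commands should print a version string:

```shell
# Sanity-check the prerequisites above: each command should print a
# version (e.g. v20.x for a current Node.js LTS release).
node --version
npm --version
```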

Installation steps:

  1. Clone the repository and enter it: git clone https://github.com/nbonamy/witsy.git && cd witsy
  2. Install dependencies: npm ci
  3. Create or configure your environment with necessary API keys and settings for the providers you plan to use (OpenAI, Anthropic, Google, Ollama, etc.).
  4. Start the application, which also brings up the MCP server integration: npm start
  5. Open the Witsy UI and, if prompted, connect to the MCP server configuration named "witsy"; otherwise rely on the embedded MCP server setup as documented in the app.

Additional notes

Tips and common considerations:

  • Ensure you have valid API keys for the providers you enable (OpenAI, Anthropic, Google, etc.). Witsy often requires these keys to be set in the application settings or environment variables.
  • If you use local models via Ollama or similar, ensure those services are installed and accessible from your machine.
  • For RAG and document-based chats, configure your document repositories within Witsy to enable fast retrieval.
  • If you encounter port or network issues, verify that the MCP server configuration name (e.g., witsy) is correctly referenced and that the environment allows outbound API calls.
  • Regularly update dependencies and keep an eye on OpenAI and other provider rate limits to avoid failed requests or throttling.
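As a sketch of the API-key setup mentioned above, keys can be exported as environment variables before launching the app. The variable names below are the providers' conventional ones and may not match what Witsy actually reads, so prefer the in-app settings when in doubt:

```shell
# Hypothetical key setup; replace the placeholder values with real keys.
# Witsy may expect keys via its own settings UI rather than the environment.
export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="sk-ant-..."
```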
