nexus-ai
Nexus AI - The central hub for AI conversations. Native desktop chat app with multi-provider support (Gemini, OpenAI, Groq), MCP tool integration, and multimodal capabilities. Built with Tauri v2, React 19, and TypeScript.
claude mcp add --transport stdio navjotdhanawat-nexus-ai npx -y @modelcontextprotocol/server-filesystem /path/to/directory
How to use
Nexus is a native desktop AI chat application with built-in multi-provider LLM support and integrated MCP (Model Context Protocol) tooling. It connects to MCP servers over stdio transports and lets you execute their tools directly from chat conversations. You can switch between providers (e.g., Google Gemini, OpenAI, Groq) and use multimodal capabilities such as vision analysis, audio transcription, and image generation within a single conversation. Open the Command Palette (Cmd+K) to access actions, and configure MCP servers under Preferences → MCP Servers to point the app at the server or transport that suits your workflow.
How to install
Prerequisites:
- Node.js v18+ (for development and MCP server tooling)
- Rust (latest stable) for building the Tauri-powered desktop app
- Xcode Command Line Tools (macOS)
Installation steps:
- Clone the repository:
  git clone https://github.com/navjotdhanawat/nexus.git
  cd nexus
- Install dependencies:
  npm install
- Run the development server (for the desktop app):
  npm run tauri:dev
- Build for production (produces .app and .dmg on macOS):
  npm run tauri:build
  The output will be under src-tauri/target/release/bundle/
- Configure MCP servers in the app:
  - Open Preferences → MCP Servers
  - Add a server with the MCP transport you intend to use (e.g., stdio via the provided example)
  - Example settings can mirror: { "name": "My MCP Server", "transport": "stdio", "command": "npx", "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/directory"] }
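The example server entry above can be expressed as a typed object with a small validator. This is a sketch only: the interface is an assumption that mirrors the fields shown in the Preferences → MCP Servers example, not a published schema from the Nexus repository.

```typescript
// Assumed shape of an MCP server entry, mirroring the example settings above.
interface McpServerConfig {
  name: string;
  transport: "stdio";
  command: string;
  args: string[];
}

// Runtime check that an untyped value (e.g., loaded from user settings)
// matches the assumed shape before the app tries to spawn the server.
function isMcpServerConfig(value: unknown): value is McpServerConfig {
  if (typeof value !== "object" || value === null) return false;
  const v = value as Record<string, unknown>;
  return (
    typeof v.name === "string" &&
    v.transport === "stdio" &&
    typeof v.command === "string" &&
    Array.isArray(v.args) &&
    v.args.every((a) => typeof a === "string")
  );
}

// The example entry from the installation steps above.
const example = {
  name: "My MCP Server",
  transport: "stdio",
  command: "npx",
  args: ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/directory"],
};

console.log(isMcpServerConfig(example)); // true
```

Validating the entry up front gives a clearer error than letting a malformed command or args array fail at process-spawn time.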
Additional notes
Tips and caveats:
- The example uses npx to launch the official filesystem MCP server (@modelcontextprotocol/server-filesystem). Make sure the package name and the directory path are correct for your environment.
- For development, set API keys via environment variables as shown in the README (e.g., VITE_GOOGLE_API_KEY, VITE_OPENAI_API_KEY, VITE_GROQ_API_KEY).
- The app supports multiple providers; you'll need an API key for each provider you enable, configured in Preferences → API Keys.
- If you run into transport issues, verify that the MCP server process is reachable via the chosen transport (stdio or HTTP/SSE) and that the server binary or script has the appropriate permissions on your OS.
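When debugging a stdio transport, it can help to know what a healthy exchange looks like: MCP uses JSON-RPC 2.0, and over stdio each message is written as a single line of JSON, starting with an initialize request from the client. The sketch below builds such a request; the protocol version and clientInfo values are illustrative assumptions, not taken from the Nexus source.

```typescript
// Minimal JSON-RPC 2.0 request shape, as used by MCP.
interface JsonRpcRequest {
  jsonrpc: "2.0";
  id: number;
  method: string;
  params?: Record<string, unknown>;
}

// First message an MCP client writes to the server's stdin.
function buildInitializeRequest(id: number): JsonRpcRequest {
  return {
    jsonrpc: "2.0",
    id,
    method: "initialize",
    params: {
      protocolVersion: "2024-11-05", // a published MCP protocol revision
      capabilities: {},
      clientInfo: { name: "nexus-ai", version: "0.1.0" }, // illustrative values
    },
  };
}

// Stdio transport frames each message as one newline-terminated JSON line.
const wire = JSON.stringify(buildInitializeRequest(1)) + "\n";
console.log(wire.includes('"method":"initialize"')); // true
```

If the server process starts but never answers a well-formed initialize line, the problem is likely in the server itself (permissions, wrong path argument) rather than in the app's transport configuration.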