MCP-SuperAssistant
Brings MCP to ChatGPT, DeepSeek, Perplexity, Grok, Gemini, Google AI Studio, OpenRouter, T3 Chat and more...
claude mcp add --transport stdio srbhptl39-mcp-superassistant npx -y @srbhptl39/mcp-superassistant-proxy@latest --config ./config.json --outputTransport sse
How to use
MCP SuperAssistant provides a Chrome extension that integrates MCP tools with a wide range of AI platforms (ChatGPT, Google Gemini, Perplexity, Grok, Google AI Studio, OpenRouter, Kimi, GitHub Copilot, Mistral, Qwen, Z Chat, and more). To use it locally, run the MCP SuperAssistant proxy alongside a config.json that describes your local or remote MCP servers. The proxy connects your AI platform to MCP tools over a configured transport (SSE in the examples below). Once the proxy is running and connected to a server, you can invoke MCP tools from supported platforms and have results returned into your chat interface. The extension handles tool detection, execution, and result insertion, while the proxy forwards tool calls to the appropriate MCP server and returns responses.
Typical workflow: write or load a config.json describing your MCP servers, start the proxy, then connect your AI platform’s MCP sidebar to http://localhost:3006/sse (or the transport endpoint you choose). From there you can select and run MCP tools directly within the chat interface, see results rendered in the conversation, and leverage features like auto-execute and result insertion as configured.
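The workflow above can be sketched as a shell session. The package names, config shape, and port are taken from the install steps below; the working-directory name is just an example:

```shell
# 1. Create a working directory and a minimal config.json
#    (uses the desktop-commander example server from Step 2 below).
mkdir -p mcp-setup && cd mcp-setup
cat > config.json <<'EOF'
{
  "mcpServers": {
    "desktop-commander": {
      "command": "npx",
      "args": ["-y", "@wonderwhy-er/desktop-commander"]
    }
  }
}
EOF
echo "wrote $(pwd)/config.json"

# 2. Start the proxy (runs in the foreground; leave this terminal open):
#      npx -y @srbhptl39/mcp-superassistant-proxy@latest \
#        --config ./config.json --outputTransport sse
# 3. In the extension sidebar, connect to http://localhost:3006/sse
```

The proxy command is left commented out here because it is a long-running process; run it once your config.json is in place.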
How to install
Prerequisites:
- Node.js (14.x or newer) and npm installed on your system
- Basic familiarity with running commands in a terminal
Step 1: Create a working directory for your MCP setup
- mkdir mcp-setup && cd mcp-setup
Step 2: Prepare a config.json describing your MCP servers
- Create a file named config.json with your MCP server definitions. Example:
  {
    "mcpServers": {
      "desktop-commander": {
        "command": "npx",
        "args": ["-y", "@wonderwhy-er/desktop-commander"]
      }
    }
  }
Step 3: Install and run the MCP proxy via npx
- This uses the MCP SuperAssistant proxy package
- Run the proxy (example using the SSE transport): npx -y @srbhptl39/mcp-superassistant-proxy@latest --config ./config.json --outputTransport sse
Optional transports you can use instead:
- npx -y @srbhptl39/mcp-superassistant-proxy --config ./config.json --outputTransport streamableHttp
- npx -y @srbhptl39/mcp-superassistant-proxy --config ./config.json --outputTransport ws
View all available options:
- npx -y @srbhptl39/mcp-superassistant-proxy@latest --help
Step 4: Connect your AI platform
- Open the MCP SuperAssistant sidebar in your chosen platform and connect to the local proxy endpoint, typically http://localhost:3006/sse for SSE.
- Ensure the server status shows Connected before trying to run tools.
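Before connecting the sidebar, you can sanity-check that the proxy is reachable from a terminal. This is a generic sketch that assumes only what is stated above, namely that the proxy serves SSE at http://localhost:3006/sse:

```shell
# Probe the SSE endpoint. SSE responses stream indefinitely, so cap the
# request time; prints the HTTP status code if reachable, or a note if not.
curl -s --max-time 3 -o /dev/null -w '%{http_code}\n' http://localhost:3006/sse \
  || echo "proxy not reachable on port 3006"
```

A 200 status suggests the proxy is up; "proxy not reachable" means it is not running or is listening on a different port.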
Additional notes
Tips and common issues:
- If you change config.json, restart the proxy to pick up changes.
- Use a local MCP server for best performance and fewer CORS issues; remote servers can be proxied via the same config.
- Ensure your transport endpoint matches what you configured (SSE, Streamable HTTP, or WebSocket).
- The npm package used here is @srbhptl39/mcp-superassistant-proxy. If you’re testing locally, you can run it with npx as shown in the steps above.
- For debugging, check the proxy’s health endpoints and logs to verify connectivity to the MCP servers.
- You can add multiple MCP servers under mcpServers to multiplex tool calls across sources.
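For example, a config.json with two stdio servers might look like the following. The second server name and package are hypothetical placeholders; substitute the MCP servers you actually use:

```json
{
  "mcpServers": {
    "desktop-commander": {
      "command": "npx",
      "args": ["-y", "@wonderwhy-er/desktop-commander"]
    },
    "my-other-server": {
      "command": "npx",
      "args": ["-y", "some-other-mcp-server-package"]
    }
  }
}
```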
- If your environment blocks the default ports, configure the proxy to listen on an alternate port as supported by the proxy tool.
Related MCP Servers
lobehub
The ultimate space for work and life — to find, build, and collaborate with agent teammates that grow with you. We are taking the agent harness to the next level — enabling multi-agent collaboration, effortless agent team design, and introducing agents as the unit of work interaction.
AstrBot
Agentic IM chatbot infrastructure that integrates many IM platforms, LLMs, plugins, and AI features, and can be your openclaw alternative. ✨
rikkahub
RikkaHub is an Android app that supports multiple LLM providers.
UnrealGenAISupport
An Unreal Engine plugin for LLM/GenAI models and an MCP UE5 server. Includes APIs for OpenAI's GPT 5.1, DeepSeek V3.1, Claude Sonnet 4.5, Gemini 3, Alibaba Qwen, Kimi, and Grok 4.1, with plans to add audio TTS, ElevenLabs, OpenRouter, Groq, Dashscope & realtime APIs soon. UnrealMCP is also here, with automatic scene generation from AI!
daan
✨Lightweight LLM Client with MCP 🔌 & Characters 👤
chatgpt-copilot
ChatGPT Copilot Extension for Visual Studio Code