mcp-chat-studio
A powerful MCP testing tool with multi-provider LLM support (Ollama, OpenAI, Claude, Gemini). Test, debug, and develop MCP servers with a modern UI.
claude mcp add --transport stdio joecastrom-mcp-chat-studio node server.js \
  --env PORT="3082" \
  --env Ollama="optional - used for local no-api-key workflows"
How to use
MCP Chat Studio is a comprehensive testing and development environment for MCP servers. It provides a glassmorphism UI with a Workspace mode, a Studio Assistant for guided workflows, and the ability to record, replay, and inspect MCP interactions. The platform supports workflow export to the Python and Node SDKs, mock server generation, and a docs generator that builds run reports and documentation from your tool definitions. You can switch between the Classic and Workspace layouts, choose among multiple LLM providers from the header, and manage OAuth and model settings through the built-in Settings panel. Use the Studio Assistant to learn features, generate workflows, import OpenAPI specs, and navigate the UI while building and testing end-to-end MCP scenarios.
To start testing locally, run the server and open the UI in your browser. The tool is designed for offline/local use (thanks to Ollama support in the Studio), so you don’t need external API keys for basic testing. You can create and run scenarios, record interactions, and export workflows for CI or docs generation. The platform also includes advanced debugging features such as a workflow debugger with breakpoints, tool usage analytics, and a comprehensive inspector for tracing JSON-RPC messages and performance.
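To make the tracing concrete: the JSON-RPC traffic the inspector captures starts with the MCP initialize handshake. A minimal sketch of that first message, sent by hand to a stdio MCP server (assuming server.js is an MCP stdio server; the version string is the 2024-11-05 MCP protocol revision):

```shell
# Hand-roll the opening message of an MCP handshake over stdio.
# This is the kind of JSON-RPC message the inspector traces.
echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"smoke-test","version":"0.0.0"}}}' \
  | node server.js
```

A conforming server replies with its own capabilities and server info on stdout, which is exactly what shows up in the inspector's message trace.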
How to install
Prerequisites:
- Node.js 18+ and npm installed on your system
- Git
- Optional: Ollama for local model serving (no API keys required)
Installation steps:
1. Clone the repository:
   git clone https://github.com/JoeCastrom/mcp-chat-studio.git
   cd mcp-chat-studio
2. Install dependencies:
   npm install
3. Start the development server:
   npm run dev
4. Open the application in your browser:
   http://localhost:3082
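Once the dev server is running, a quick smoke test from a second terminal confirms the UI is reachable (assuming the default port):

```shell
# Exit non-zero (and print nothing) if the UI is not serving on port 3082
curl -sSf http://localhost:3082/ > /dev/null && echo "MCP Chat Studio UI is up"
```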
Notes:
- The default port for the UI is 3082; ensure that port is available or adjust the PORT environment variable if needed.
- If you’re using Ollama for local model serving, ensure Ollama is installed and running on your machine. The Studio is designed to work without API keys for local testing.
- If you plan to build Python/Node exports or run docs generation, ensure you have Python and Node SDKs available in your environment as needed by your workflow.
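The port and Ollama notes above can be exercised from a shell — a sketch assuming the dev script passes PORT through to the server and that Ollama is already installed (see ollama.com for install instructions; llama3.2 is just an example model):

```shell
# Run the dev server on an alternate port if 3082 is taken
# (assumes the npm "dev" script reads the PORT environment variable)
PORT=4000 npm run dev

# In another terminal: serve a local model with Ollama, no API key needed
ollama serve &           # starts the Ollama API on localhost:11434
ollama pull llama3.2     # fetch a small model to use from the Studio
```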
Upgrade tip: pull the latest v2 branch or release as described in the repository, since major updates can introduce changes to workflows and UI features.
Additional notes
Tips and common issues:
- Ensure PORT 3082 is free or override with an environment variable to avoid port conflicts.
- If the UI fails to load, check that npm install completed successfully and that node server.js exists at the expected path. Adjust the entry script path if your setup uses a different file (e.g., index.js).
- For OAuth configurations, use the UI to configure providers without touching config.yaml or .env, as documented in the OAuth Settings UI feature.
- When exporting workflows to Python/Node SDKs, review the generated code for correct tool calls and variable substitutions, and adapt to your local environment if needed.
- If you encounter issues with local testing not exposing endpoints to the internet, rely on the local-only testing paradigm and use Ollama for in-machine model serving to stay offline.
- The Studio Assistant is context-aware; using it for generating workflows can accelerate common tasks like importing OpenAPI specs and building tool chains.
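If you are unsure which entry script your checkout actually uses (the server.js vs index.js note above), the npm metadata will tell you — a quick check, assuming a standard package.json:

```shell
npm run                                    # lists the available scripts (e.g. "dev")
node -p "require('./package.json').main"   # prints the declared entry script, if set
```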
Related MCP Servers
Everywhere
Context-aware AI assistant for your desktop. Ready to respond intelligently, seamlessly integrating multiple LLMs and MCP tools.
Risuai
Make your own story. User-friendly software for LLM roleplaying
chatgpt-copilot
ChatGPT Copilot Extension for Visual Studio Code
openapi
OpenAPI definitions, converters and LLM function calling schema composer.
local-gateway
Aggregate multiple MCP servers into a single endpoint with web UI, OAuth 2.1, and profile-based tool management
qarinai
Create unlimited AI chatbot agents for your website — powered by OpenAI-compatible LLMs, RAG, and MCP.