chatgpt-copilot
ChatGPT Copilot Extension for Visual Studio Code
claude mcp add --transport stdio feiskyer-chatgpt-copilot node server.js
How to use
ChatGPT Copilot is a Visual Studio Code extension that brings AI-assisted coding directly into your editor and exposes an MCP (Model Context Protocol) compatible interface for tool integration. Through MCP, the AI's responses can call tools, access local files, and perform actions beyond plain chat. The extension supports a variety of providers and models. To use the MCP-enabled features, install the ChatGPT Copilot extension in VS Code, run the MCP server as described below, and configure the extension to communicate with that server. Once connected, the AI can call your tools, fetch local context, and offer code suggestions that leverage your custom toolset.
The capabilities enabled via MCP include: tool calling via prompts for models that don't natively support tool calls, chat with files (adding and referencing local files and images in conversations), streaming responses in the VS Code sidebar, and a prompt manager for organizing and reusing prompts. You can also customize prompts, export conversations, and manage model and provider configurations. By wiring your own tools through MCP, you can tailor the AI's behavior to your project, enabling actions such as running local scripts, querying project metadata, or invoking domain-specific utilities directly from the chat flow.
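For models without native tool-call support, the extension describes tools in the prompt and parses structured tool calls out of the model's reply. A minimal sketch of that dispatch pattern follows; the tool names and the JSON shape here are illustrative assumptions, not the extension's actual wire format:

```javascript
// Sketch of prompt-based tool calling: the model is instructed to reply with a
// JSON object like {"tool": "...", "args": {...}}, which we parse and dispatch.
// Tool names and the JSON shape are hypothetical examples.

const tools = {
  // Hypothetical tool: returns project metadata the AI can reference.
  project_info: () => ({ name: "chatgpt-copilot", language: "TypeScript" }),
  // Hypothetical tool: echoes text back, useful for testing the loop.
  echo: (args) => ({ text: args.text }),
};

function dispatchToolCall(modelOutput) {
  let call;
  try {
    call = JSON.parse(modelOutput);
  } catch {
    return { error: "model output was not valid JSON" };
  }
  const tool = tools[call.tool];
  if (!tool) return { error: `unknown tool: ${call.tool}` };
  return tool(call.args || {});
}
```

The key design point is that parse failures and unknown tool names are returned as structured errors rather than thrown, so they can be fed back to the model as a normal tool result.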
How to install
Prerequisites:
- Node.js (LTS) and npm installed on your machine
- Visual Studio Code installed
- Basic familiarity with running Node.js applications
- Access to the ChatGPT Copilot MCP server (this repository)
- Install prerequisites
  - Install Node.js from https://nodejs.org/ (LTS version recommended)
  - Install Visual Studio Code from https://code.visualstudio.com/
- Install the MCP server locally
  - Clone the repository (or download the release package):
    git clone https://github.com/feiskyer/chatgpt-copilot.git
    cd chatgpt-copilot
  - Install dependencies (if applicable):
    npm install
- Run the MCP server
  - Start the server (example using the provided server.js entry point):
    npm start
  - Or, if a direct node entry point is used:
    node server.js
- Configure the MCP server in your environment
- Ensure the server is reachable; check the server code or its configuration for the port it listens on (3000 and 8080 are common defaults)
- If you use a configuration file, set the appropriate MCP server entry to point to your running instance
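If your MCP client uses a JSON configuration file, the entry typically names the server and the command that launches it. A hedged example for the stdio transport, mirroring the `claude mcp add` command at the top of this page (the `mcpServers` key follows a common client convention; adjust to your client's schema):

```json
{
  "mcpServers": {
    "feiskyer-chatgpt-copilot": {
      "command": "node",
      "args": ["server.js"]
    }
  }
}
```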
- Install and configure the VS Code extension
- Open VS Code and install the ChatGPT Copilot extension from the Visual Studio Marketplace
- In the extension settings, point the MCP client to your running server (use the host/port defined by your server)
- Reload VS Code to establish the MCP connection
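In VS Code, extension settings live in settings.json. The exact keys for this extension may differ from the ones below; these names are hypothetical placeholders for the host/port values mentioned in the step above, so check the extension's settings UI for the real ones:

```json
{
  // Hypothetical keys; verify the actual names in the extension settings.
  "chatgpt-copilot.mcp.host": "localhost",
  "chatgpt-copilot.mcp.port": 3000
}
```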
Additional notes
Tips and common issues:
- Ensure the MCP server starts successfully before launching VS Code; the extension will attempt to connect on load.
- If you run behind a proxy or firewall, configure network access for the MCP server and VS Code.
- If you see authentication or API key errors in the extension, verify your API provider configuration in the extension settings.
- When using MCP, you can implement custom tool calls by exposing endpoints or scripts in your server that the AI can trigger via tool invocation prompts.
- Monitor server logs for tool invocation errors and ensure any referenced tools or scripts have proper permissions and are executable on your platform.
Related MCP Servers
k8m
A lightweight, cross-platform Mini Kubernetes AI Dashboard supporting LLMs + agents + MCP (with configurable operation permissions). It integrates multi-cluster management, intelligent analysis, and real-time anomaly detection, supports multiple architectures, and can be deployed as a single binary, streamlining cluster management and operations.
openapi
OpenAPI definitions, converters and LLM function calling schema composer.
ncp
Natural Context Provider (NCP). Your MCPs, supercharged. Find any tool instantly, load on demand, run on schedule, ready for any client. Smart loading saves tokens and energy.
autoteam
Orchestrate AI agents with YAML-driven workflows via universal Model Context Protocol (MCP)
mcp-chat-studio
A powerful MCP testing tool with multi-provider LLM support (Ollama, OpenAI, Claude, Gemini). Test, debug, and develop MCP servers with a modern UI.
gemini-webapi
MCP server for Google Gemini — free image generation, editing & chat via browser cookies. No API keys needed.