OpenAI Code Assistant Model Context Protocol (MCP) Server
claude mcp add --transport stdio arthurcolle-openai-mcp python cli.py serve \
  --env OPENAI_ORG="your_openai_organization_id_here" \
  --env OPENAI_MODEL="gpt-4o" \
  --env OPENAI_API_KEY="your_openai_api_key_here"
How to use
This MCP server provides a Model Context Protocol (MCP) implementation that enables multi-provider access (OpenAI and other LLM providers) via a single MCP server interface. It supports a rich tool suite for software development tasks, including file operations, command execution, and web-style tool interactions, all while offering real-time visualization, cost tracking, and memory/context management. You can operate the server from the command line in MCP server mode and connect clients like Claude Desktop or other MCP-enabled tools to coordinate multi-agent workflows, load-balance requests, and share context across clients.
To use the server, first ensure your environment contains valid API keys for the providers you intend to use (e.g., OpenAI). Start the MCP server with the provided entry point, then connect MCP clients to the running server. From there, you can issue queries, manage tool executions, and leverage the multi-agent coordination features to break down complex tasks across specialized agents. The built-in tools (View, Edit, Replace, GlobTool, GrepTool, LS, Bash) allow you to manipulate files and run commands within the same session, while the MCP layer handles context sharing and synchronization across clients.
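The built-in tools are invoked by name over MCP. As a rough illustration of how a name-to-handler registry for tools like View, LS, and Bash could be organized (the handler bodies below are a hypothetical sketch, not the repo's actual implementation):

```python
import subprocess
from pathlib import Path

# Hypothetical registry mapping MCP tool names to handlers.
# The real server's dispatch logic may differ.
TOOLS = {}

def tool(name):
    """Register a handler under an MCP tool name."""
    def register(fn):
        TOOLS[name] = fn
        return fn
    return register

@tool("View")
def view_file(path: str) -> str:
    # Return file contents for the View tool.
    return Path(path).read_text()

@tool("LS")
def list_dir(path: str = ".") -> list[str]:
    # List directory entries for the LS tool.
    return sorted(p.name for p in Path(path).iterdir())

@tool("Bash")
def run_command(cmd: str) -> str:
    # Execute a shell command and capture its stdout.
    return subprocess.run(cmd, shell=True, capture_output=True, text=True).stdout

def dispatch(name: str, **kwargs):
    """Route an incoming tool call to its registered handler."""
    return TOOLS[name](**kwargs)
```

A client request for a given tool would then reduce to a single `dispatch("Bash", cmd="ls")`-style call on the server side.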
How to install
Prerequisites:
- Python 3.10+ installed on your system
- Git to clone the repository
- Internet access to install dependencies from requirements.txt
Install steps:
- Clone the repository
git clone https://github.com/arthurcolle/openai-mcp-repo.git
cd openai-mcp-repo
- Create and activate a Python virtual environment (recommended)
python -m venv venv
source venv/bin/activate # On Windows use: venv\Scripts\activate
- Install dependencies
pip install -r requirements.txt
- Create a .env file with your API keys and optional model selections
# Choose one or more providers
OPENAI_API_KEY=your_openai_api_key_here
ANTHROPIC_API_KEY=your_anthropic_api_key_here # optional
# Optional model selection
OPENAI_MODEL=gpt-4o
ANTHROPIC_MODEL=claude-3-opus-20240229 # optional
- Start the MCP server in MCP mode
# Run in MCP server mode (OpenAI-focused MCP server)
python cli.py serve
- If your setup uses a development or alternate entry point, adjust the command accordingly. Ensure any required environment variables are available to the server process.
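Before starting the server, it can help to verify that the .env file actually contains keys for the providers you intend to use. A minimal stdlib-only sketch of such a check (the server itself may load the file differently, e.g. via python-dotenv):

```python
from pathlib import Path

def load_env_file(path: str) -> dict[str, str]:
    """Parse KEY=value lines from a .env file, skipping blanks and comments."""
    env = {}
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        # Drop any trailing inline comment, whitespace, and quotes.
        value = value.split("#", 1)[0].strip().strip('"').strip("'")
        env[key.strip()] = value
    return env

def configured_providers(env: dict[str, str]) -> list[str]:
    """Return the providers for which an API key is present."""
    providers = []
    if env.get("OPENAI_API_KEY"):
        providers.append("openai")
    if env.get("ANTHROPIC_API_KEY"):
        providers.append("anthropic")
    return providers
```

Running `configured_providers(load_env_file(".env"))` against the example file above would report both providers; an empty result means the server will have no usable credentials.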
Additional notes
- Ensure your OpenAI API key is correctly exported or placed in a .env file that is loaded by your environment.
- If you use multiple providers, you can set provider-specific models via the environment variables shown in the example (OPENAI_MODEL, ANTHROPIC_MODEL).
- The MCP server supports real-time tool visualization; monitor stdout/stderr or the MCP UI for progress indicators during tool execution.
- If you encounter connection issues from MCP clients, verify host/port bindings and that the server process is reachable from the client network.
- Use the --env-file option (if supported by your setup) to load credentials from a file instead of manually exporting environment variables.
- For cost management, ensure your usage is routed through the cost/tracking modules and configure budgets via the client commands as needed.
Related MCP Servers
mcp-vegalite
MCP server from isaacwasserman/mcp-vegalite-server
github-chat
A Model Context Protocol (MCP) for analyzing and querying GitHub repositories using the GitHub Chat API.
nautex
MCP server for guiding Coding Agents via end-to-end requirements to implementation plan pipeline
pagerduty
PagerDuty's official local MCP (Model Context Protocol) server which provides tools to interact with your PagerDuty account directly from your MCP-enabled client.
futu-stock
MCP server for Futu NiuNiu stock data
mcp-boilerplate
Boilerplate using one of the 'better' ways to build MCP servers. Written using FastMCP.