mcp-openai
Query OpenAI models directly from Claude using MCP protocol.
claude mcp add --transport stdio pierrebrunelle-mcp-server-openai python -m src.mcp_server_openai.server \
  --env PYTHONPATH="C:/path/to/your/mcp-server-openai" \
  --env OPENAI_API_KEY="your-key-here"
How to use
This MCP server lets Claude query OpenAI models directly over the MCP protocol. It exposes an OpenAI-backed endpoint as an MCP server named openai-server, which runs as a Python module (src.mcp_server_openai.server). To use it, point your Claude Desktop MCP settings at this server entry and provide your OpenAI API key via the OPENAI_API_KEY environment variable. You can then send prompts to the OpenAI model of your choice through the MCP interface and receive responses in Claude, just like with other MCP-connected services. Make sure PYTHONPATH points to your local mcp-server-openai checkout so the module can be discovered, and supply a valid API key to authorize calls to OpenAI.
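For Claude Desktop, the same settings described above can be expressed in claude_desktop_config.json. A sketch of what that entry might look like, with the paths and key as placeholders you would replace with your own values:

```json
{
  "mcpServers": {
    "openai-server": {
      "command": "python",
      "args": ["-m", "src.mcp_server_openai.server"],
      "env": {
        "PYTHONPATH": "C:/path/to/your/mcp-server-openai",
        "OPENAI_API_KEY": "your-key-here"
      }
    }
  }
}
```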
How to install
Prerequisites:
- Python 3.8+ installed on your system
- Git
- Access to install Python packages (pip)
Steps:
- Clone the repository: git clone https://github.com/pierrebrunelle/mcp-server-openai
- Install in editable mode (from the repository root): pip install -e .
- Set up your environment variables when running the server:
- Ensure OPENAI_API_KEY is set to your OpenAI API key
- Set PYTHONPATH to the path of your local mcp-server-openai project (as shown in the config example)
- Start or configure the MCP server entry as shown in the mcp_config example, using the Python module invocation: python -m src.mcp_server_openai.server
- In Claude Desktop, add or update the mcpServers entry to point to the above server configuration.
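Before wiring the server into Claude Desktop, a quick sanity check can confirm that the two environment variables from the steps above are actually set in the runtime environment. This helper is illustrative and not part of the project:

```python
import os


def check_environment() -> list[str]:
    """Return a list of environment problems; an empty list means the server can start."""
    problems = []
    if not os.environ.get("OPENAI_API_KEY"):
        problems.append("OPENAI_API_KEY is not set")
    if not os.environ.get("PYTHONPATH"):
        problems.append("PYTHONPATH is not set (should point at the mcp-server-openai project root)")
    return problems


if __name__ == "__main__":
    for problem in check_environment():
        print("WARNING:", problem)
```

Running this in the same shell (or service environment) that will launch the server catches the most common misconfigurations early.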
Notes:
- If you modify the code, the editable install (pip install -e .) picks up source changes automatically; re-run it if you change dependencies or packaging metadata.
- Ensure network access to OpenAI APIs is allowed from the host running the server.
Additional notes
Tips and considerations:
- Securely manage your OpenAI API key; avoid committing it to version control.
- The PYTHONPATH must point to the project root so the module src.mcp_server_openai.server is importable.
- If you encounter connection or authentication errors, verify the OPENAI_API_KEY environment variable is correctly set in the runtime environment.
- Check network proxies or firewalls that may block outbound HTTPS to OpenAI endpoints.
- For testing, you can run pytest as shown in the README to validate the OpenAI call flow, but ensure you provide a valid API key to avoid test failures.
- This MCP server is designed to be used from Claude Desktop via the MCP protocol; for debugging, you can run the Python server directly and send test prompts to ensure it forwards requests correctly.
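To troubleshoot the PYTHONPATH tip above specifically, importlib can report whether Python would find the server module at all from the current sys.path. The helper name here is hypothetical, not part of the project:

```python
import importlib.util


def module_discoverable(name: str) -> bool:
    """Check whether a (possibly dotted) module name is importable from sys.path."""
    try:
        return importlib.util.find_spec(name) is not None
    except ModuleNotFoundError:
        # A missing parent package (e.g. no `src` on sys.path) also means not discoverable.
        return False


if __name__ == "__main__":
    name = "src.mcp_server_openai.server"
    print(f"{name} discoverable: {module_discoverable(name)}")
```

If this prints False, PYTHONPATH does not include the project root, and the python -m invocation in the config will fail with a ModuleNotFoundError.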
Related MCP Servers
mcp-vegalite
MCP server from isaacwasserman/mcp-vegalite-server
github-chat
A Model Context Protocol (MCP) for analyzing and querying GitHub repositories using the GitHub Chat API.
nautex
MCP server for guiding coding agents via an end-to-end requirements-to-implementation-plan pipeline
pagerduty
PagerDuty's official local MCP (Model Context Protocol) server which provides tools to interact with your PagerDuty account directly from your MCP-enabled client.
futu-stock
MCP server for Futu NiuNiu stock
mcp-boilerplate
Boilerplate using one of the 'better' ways to build MCP servers. Written using FastMCP.