any-chat-completions
MCP Server for using any LLM as a Tool
claude mcp add --transport stdio pyroprompts-any-chat-completions-mcp npx -y @pyroprompts/any-chat-completions-mcp \
  --env AI_CHAT_KEY="OPENAI_KEY" \
  --env AI_CHAT_NAME="OpenAI" \
  --env AI_CHAT_MODEL="gpt-4o" \
  --env AI_CHAT_BASE_URL="https://api.openai.com/v1"
How to use
This MCP server relays chat requests to any OpenAI-SDK-compatible Chat Completions API. It exposes a single tool named chat, which forwards user prompts to the configured backend (for example OpenAI, PyroPrompts, or Perplexity) and returns the model's chat completion. You can run the server via npx using the documented package, setting environment variables to select the provider: its API key, model, and base URL.

To enable multiple providers, duplicate the MCP server entry with different environment variables (AI_CHAT_NAME, AI_CHAT_MODEL, AI_CHAT_BASE_URL, and AI_CHAT_KEY). Each configured provider appears as its own chat tool in Claude Desktop or LibreChat, so you switch providers by invoking the tool for the corresponding configuration. The server handles the request/response cycle and returns the AI-generated completion in the tool-result format MCP clients expect.
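As a sketch of the multi-provider setup described above, a Claude Desktop mcpServers block might contain two entries pointing at the same package with different env blocks (the server names, the Perplexity model, and the placeholder keys here are illustrative assumptions, not values from the README):

```json
{
  "mcpServers": {
    "chat-openai": {
      "command": "npx",
      "args": ["-y", "@pyroprompts/any-chat-completions-mcp"],
      "env": {
        "AI_CHAT_KEY": "OPENAI_KEY",
        "AI_CHAT_NAME": "OpenAI",
        "AI_CHAT_MODEL": "gpt-4o",
        "AI_CHAT_BASE_URL": "https://api.openai.com/v1"
      }
    },
    "chat-perplexity": {
      "command": "npx",
      "args": ["-y", "@pyroprompts/any-chat-completions-mcp"],
      "env": {
        "AI_CHAT_KEY": "PERPLEXITY_KEY",
        "AI_CHAT_NAME": "Perplexity",
        "AI_CHAT_MODEL": "sonar",
        "AI_CHAT_BASE_URL": "https://api.perplexity.ai"
      }
    }
  }
}
```

Each entry surfaces as its own chat tool in the client, labeled according to its AI_CHAT_NAME.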
How to install
Prerequisites:
- Node.js and npm installed on your system
- Access to the internet to fetch the MCP package
Installation steps:
- If you are cloning the repo, install dependencies for the MCP server project: npm install
- Build the server (TypeScript): npm run build
- For development with auto-rebuild during active work: npm run watch
- Run or configure the server in Claude Desktop or LibreChat using the provided mcpServers configuration (see mcp_config). Set the environment variables per provider: AI_CHAT_KEY, AI_CHAT_NAME, AI_CHAT_MODEL, and AI_CHAT_BASE_URL. If you are not using npx, you can instead reference a local build via node with the built index.js path, as described in the README.
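For the local-build variant mentioned above, the same mcpServers entry can invoke node directly instead of npx (the clone path below is an assumed location; adjust it to wherever you built the project):

```json
{
  "mcpServers": {
    "chat-openai": {
      "command": "node",
      "args": ["/path/to/any-chat-completions-mcp/build/index.js"],
      "env": {
        "AI_CHAT_KEY": "OPENAI_KEY",
        "AI_CHAT_NAME": "OpenAI",
        "AI_CHAT_MODEL": "gpt-4o",
        "AI_CHAT_BASE_URL": "https://api.openai.com/v1"
      }
    }
  }
}
```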
Optional: Use Smithery for automated installation: npx -y @smithery/cli install any-chat-completions-mcp-server --client claude
Additional notes
Tips and caveats:
- MCP servers communicate over stdio; debugging can be challenging. Consider using the MCP Inspector for debugging and monitoring (npm run inspector).
- Ensure API keys and base URLs are correctly set in environment variables for each provider you configure.
- When using multiple providers, keep their env configurations separate and reference the same MCP server package multiple times with different env blocks.
- On macOS and Windows, Claude Desktop writes config JSON files in standard user directories; confirm the path matches your platform's expectations and that the JSON is valid.
- If you encounter build errors, verify Node.js version compatibility with the TypeScript project and confirm dependencies are installed with npm install before building.
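To catch the invalid-JSON case noted in the tips, one quick check is to run the config file through a JSON parser before restarting the client. This is a sketch using a sample file; substitute your platform's real Claude Desktop config path:

```shell
# Write a sample config (stand-in for your real Claude Desktop config file)
cat > sample_claude_config.json <<'EOF'
{
  "mcpServers": {
    "chat-openai": {
      "command": "npx",
      "args": ["-y", "@pyroprompts/any-chat-completions-mcp"],
      "env": {
        "AI_CHAT_KEY": "OPENAI_KEY",
        "AI_CHAT_NAME": "OpenAI",
        "AI_CHAT_MODEL": "gpt-4o",
        "AI_CHAT_BASE_URL": "https://api.openai.com/v1"
      }
    }
  }
}
EOF

# json.tool exits non-zero on malformed JSON, so the message prints only on success
python3 -m json.tool sample_claude_config.json > /dev/null && echo "config is valid JSON"
```

A trailing comma or a missing brace is the usual culprit when a configured server silently fails to appear in the client.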
Related MCP Servers
bitbucket
Bitbucket MCP - A Model Context Protocol (MCP) server for integrating with Bitbucket Cloud and Server APIs
time
⏰ Time MCP Server: Giving LLMs Time Awareness Capabilities
crawlbase
Crawlbase MCP Server connects AI agents and LLMs with real-time web data. It powers Claude, Cursor, and Windsurf integrations with battle-tested web scraping, JavaScript rendering, and anti-bot protection, enabling structured, live data inside your AI workflows.
unity-editor
An MCP server and client for LLMs to interact with Unity Projects
website-publisher
AI website builder and publisher MCP. Quickly publish and deploy your AI-generated code as a real website URL. Supports HTML, CSS, JS, Python, and more.
xgmem
Global Memory MCP server that manages data across all projects.