All About AI MCP Servers
```shell
claude mcp add --transport stdio allaboutai-yt-mcp-servers flux-server \
  --env REPLICATE_API_TOKEN="your-replicate-token"
```
How to use
This MCP server set provides two integrations: an OpenAI o1 MCP Server and a Flux MCP Server. The OpenAI server exposes OpenAI's o1 preview model through the MCP protocol, with support for streaming responses, adjustable temperature and top_p, and configurable system messages to steer behavior. The Flux server exposes Flux-powered image capabilities via MCP, authenticated with a Replicate API token. To use them, supply the required environment variables (OPENAI_API_KEY for the OpenAI server, REPLICATE_API_TOKEN for the Flux server) and run each server independently. Once the servers are running, issue MCP requests to their endpoints using the canonical MCP protocol payloads defined by your tooling or client library.
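Those canonical payloads are JSON-RPC 2.0 messages written to each server's stdin. A minimal sketch of the message shapes, assuming the method names from the MCP specification (`initialize`, `tools/list`, `tools/call`); the tool name `generate_image` and its arguments are hypothetical placeholders for whatever tools these servers actually expose:

```python
import json

def jsonrpc(method, params, req_id):
    """Build a JSON-RPC 2.0 request envelope as used by the MCP protocol."""
    return {"jsonrpc": "2.0", "id": req_id, "method": method, "params": params}

# 1. Handshake: the client announces its protocol version and capabilities.
init = jsonrpc("initialize", {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": {"name": "example-client", "version": "0.1.0"},
}, 1)

# 2. Discover which tools the server offers.
list_tools = jsonrpc("tools/list", {}, 2)

# 3. Invoke a tool. "generate_image" is a hypothetical tool name.
call = jsonrpc("tools/call", {
    "name": "generate_image",
    "arguments": {"prompt": "a lighthouse at dusk"},
}, 3)

# Each message is serialized as one line of JSON on the server's stdin.
for msg in (init, list_tools, call):
    print(json.dumps(msg))
```

In practice an MCP client library builds these envelopes for you; the sketch only shows what travels over the stdio transport.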
How to install
Prerequisites:
- Git installed on your system
- Access tokens for the respective services (OPENAI_API_KEY for OpenAI o1, REPLICATE_API_TOKEN for Flux)
- A system capable of running the required server binaries (openai-server and flux-server)
Steps:
1. Clone the repository containing the MCP servers:

   ```shell
   git clone https://github.com/AllAboutAI-YT/mcp-servers.git
   cd mcp-servers
   ```

2. Create an MCP configuration file (as shown in mcp_config) to define how each server runs. Save it as mcp.config.json, for example:

   ```json
   {
     "mcpServers": {
       "openai": {
         "command": "openai-server",
         "env": { "OPENAI_API_KEY": "your-openai-api-key" }
       },
       "flux": {
         "command": "flux-server",
         "env": { "REPLICATE_API_TOKEN": "your-replicate-token" }
       }
     }
   }
   ```

3. Provide the necessary environment variables in a .env file or in the process environment. Example .env:

   ```
   OPENAI_API_KEY=your-openai-api-key
   REPLICATE_API_TOKEN=your-replicate-token
   ```

4. Start the servers using the commands defined in the configuration. For example:

   - Start the OpenAI o1 MCP Server: `openai-server`
   - Start the Flux MCP Server: `flux-server`
Note: If you are using a containerized or package manager-based workflow, adapt these commands to your environment (e.g., npm, pipx, or docker) as appropriate for how the openai-server and flux-server binaries are distributed in your setup.
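The configuration file maps server names to a launch command plus environment overrides. A minimal sketch of how a client-side launcher might read mcp.config.json and spawn each server over the stdio transport (assuming the binaries are on PATH; the filename and key names follow the example above):

```python
import json
import os
import subprocess

def load_config(path="mcp.config.json"):
    """Parse the MCP config file and return the mcpServers mapping."""
    with open(path) as f:
        return json.load(f)["mcpServers"]

def spawn(name, spec):
    """Launch one MCP server with its declared environment merged in."""
    env = {**os.environ, **spec.get("env", {})}
    return subprocess.Popen(
        [spec["command"]],
        env=env,
        stdin=subprocess.PIPE,   # MCP stdio transport: requests go to stdin
        stdout=subprocess.PIPE,  # responses come back on stdout
    )

if os.path.exists("mcp.config.json"):
    for name, spec in load_config().items():
        proc = spawn(name, spec)
        print(f"started {name} (pid {proc.pid})")
```

MCP-aware clients such as Claude Desktop perform this launch step themselves from the same configuration shape; the sketch is only to make the mechanics concrete.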
Additional notes
Tips and considerations:
- Keep your API tokens secure; use environment variables or secret management where possible.
- Verify that network access to OpenAI and Replicate endpoints is allowed from your hosting environment.
- If streaming is enabled, ensure your MCP client properly handles streaming responses.
- Document and protect any system messages used to steer model behavior to avoid unsafe or unintended interactions.
- When debugging, start one server at a time to confirm proper environment variable handling and connectivity before running both concurrently.
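For that first debugging pass, a quick preflight check that the expected variables are set before launching anything helps rule out environment problems early (variable names taken from the prerequisites above):

```python
import os

# Required environment variables per server, from the prerequisites.
REQUIRED = {
    "openai": ["OPENAI_API_KEY"],
    "flux": ["REPLICATE_API_TOKEN"],
}

def missing_vars(server, environ=os.environ):
    """Return the required variables that are absent or empty for a server."""
    return [v for v in REQUIRED[server] if not environ.get(v)]

for server in REQUIRED:
    gaps = missing_vars(server)
    if gaps:
        print(f"{server}: missing {', '.join(gaps)} - set before starting")
    else:
        print(f"{server}: environment looks complete")
```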
Related MCP Servers
zen
Selfhosted notes app. Single golang binary, notes stored as markdown within SQLite, full-text search, very low resource usage
MCP-Deepseek_R1
A Model Context Protocol (MCP) server implementation connecting Claude Desktop with DeepSeek's language models (R1/V3)
mcp-fhir
A Model Context Protocol implementation for FHIR
mcp
Inkdrop Model Context Protocol Server
mcp-appium-gestures
This is a Model Context Protocol (MCP) server providing resources and tools for Appium mobile gestures using the Actions API.
dubco-npm
The (Unofficial) dubco-mcp-server enables AI assistants to manage Dub.co short links via the Model Context Protocol. It provides three MCP tools: create_link for generating new short URLs, update_link for modifying existing links, and delete_link for removing short links.