

All About AI MCP Servers

Installation
Run this command in your terminal to add the MCP server to Claude Code.
claude mcp add --transport stdio allaboutai-yt-mcp-servers flux-server \
  --env REPLICATE_API_TOKEN="your-replicate-token"

How to use

This MCP server set provides two integrations: an OpenAI o1 MCP Server and a Flux MCP Server.

The OpenAI server exposes OpenAI's o1 preview model through the MCP protocol, with support for streaming responses, adjustable temperature and top_p, and configurable system messages to steer model behavior. The Flux server exposes Flux-powered, image-related capabilities via MCP, authenticated with a Replicate API token.

To use them, supply the required environment variables (OPENAI_API_KEY for the OpenAI server, REPLICATE_API_TOKEN for the Flux server) and run each server independently. Once the servers are running, issue MCP requests to their endpoints using the standard MCP protocol payloads supported by your tooling or client library.
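MCP requests are JSON-RPC 2.0 messages. As a rough sketch, here is how a client might build a tools/call payload for the Flux server; the tool name generate_image and its arguments are hypothetical, since the actual tool names depend on what flux-server advertises via tools/list.

```python
import json

def make_mcp_request(req_id: int, method: str, params: dict) -> str:
    """Build a JSON-RPC 2.0 request body, the framing MCP uses."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": req_id,
        "method": method,
        "params": params,
    })

# Hypothetical call to a Flux image-generation tool; the real tool
# name and argument schema come from the server's tools/list response.
body = make_mcp_request(1, "tools/call", {
    "name": "generate_image",
    "arguments": {"prompt": "a red fox in the snow"},
})
print(body)
```

In practice an MCP client library handles this framing for you; the sketch only shows what travels over the wire.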

How to install

Prerequisites:

  • Git installed on your system
  • Access tokens for the respective services (OPENAI_API_KEY for OpenAI o1, REPLICATE_API_TOKEN for Flux)
  • A system capable of running the required server binaries (openai-server and flux-server)

Steps:

  1. Clone the repository containing the MCP servers:

     git clone https://github.com/AllAboutAI-YT/mcp-servers.git
     cd mcp-servers

  2. Create an MCP configuration file (as shown in mcp_config) to define how each server runs. Save it as mcp.config.json, for example:

     {
       "mcpServers": {
         "openai": {
           "command": "openai-server",
           "env": { "OPENAI_API_KEY": "your-openai-api-key" }
         },
         "flux": {
           "command": "flux-server",
           "env": { "REPLICATE_API_TOKEN": "your-replicate-token" }
         }
       }
     }

  3. Provide the necessary environment variables in a .env file or in the process environment. Example .env:

     OPENAI_API_KEY=your-openai-api-key
     REPLICATE_API_TOKEN=your-replicate-token

  4. Start the servers using the commands defined in the configuration. For example:

    • Start OpenAI o1 MCP Server: openai-server
    • Start Flux MCP Server: flux-server
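The configuration in step 2 can also be generated programmatically rather than hand-edited. A minimal sketch, assuming the server names, commands, and variable names shown in the steps above (placeholder strings are written when the variables are not set in the environment):

```python
import json
import os

# Server commands and env-variable names as listed in the steps above;
# fall back to the documentation's placeholder values if unset.
config = {
    "mcpServers": {
        "openai": {
            "command": "openai-server",
            "env": {
                "OPENAI_API_KEY": os.environ.get(
                    "OPENAI_API_KEY", "your-openai-api-key")
            },
        },
        "flux": {
            "command": "flux-server",
            "env": {
                "REPLICATE_API_TOKEN": os.environ.get(
                    "REPLICATE_API_TOKEN", "your-replicate-token")
            },
        },
    }
}

# Write the config file next to wherever you run the servers from.
with open("mcp.config.json", "w") as fh:
    json.dump(config, fh, indent=2)
```

Generating the file this way keeps real tokens out of the repository: they are pulled from the environment at write time instead of being committed in JSON.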

Note: If you are using a containerized or package manager-based workflow, adapt these commands to your environment (e.g., npm, pipx, or docker) as appropriate for how the openai-server and flux-server binaries are distributed in your setup.

Additional notes

Tips and considerations:

  • Keep your API tokens secure; use environment variables or secret management where possible.
  • Verify that network access to OpenAI and Replicate endpoints is allowed from your hosting environment.
  • If streaming is enabled, ensure your MCP client properly handles streaming responses.
  • Document and protect any system messages used to steer model behavior to avoid unsafe or unintended interactions.
  • When debugging, start one server at a time to confirm proper environment variable handling and connectivity before running both concurrently.
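The debugging tip above (start one server at a time and confirm environment variable handling first) can be automated with a small preflight check. A sketch, assuming the variable names listed under Prerequisites:

```python
import os
import sys

# Required environment variables per server, as listed in Prerequisites.
REQUIRED = {
    "openai-server": ["OPENAI_API_KEY"],
    "flux-server": ["REPLICATE_API_TOKEN"],
}

def missing_vars(server: str) -> list:
    """Return the required variables that are unset or empty for a server."""
    return [v for v in REQUIRED.get(server, []) if not os.environ.get(v)]

if __name__ == "__main__":
    for server in REQUIRED:
        missing = missing_vars(server)
        if missing:
            print(f"{server}: missing {', '.join(missing)}", file=sys.stderr)
        else:
            print(f"{server}: environment looks complete")
```

Running this before launching either server turns a silent authentication failure into an immediate, readable error.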
