WavespeedMCP
MCP server from WaveSpeedAI/mcp-server
claude mcp add --transport stdio wavespeedai-mcp-server \
  --env WAVESPEED_API_KEY="your-api-key-here" \
  --env WAVESPEED_API_HOST="https://api.wavespeed.ai" \
  --env WAVESPEED_LOG_FILE="Optional log file path (e.g., /tmp/wavespeed-mcp.log)" \
  --env WAVESPEED_LOG_LEVEL="INFO" \
  --env WAVESPEED_MCP_BASE_PATH="path/to/save/generated/files" \
  --env WAVESPEED_REQUEST_TIMEOUT="per HTTP request timeout in seconds (default: 300)" \
  --env WAVESPEED_API_RESOURCE_MODE="url|local|base64" \
  --env WAVESPEED_API_VIDEO_ENDPOINT="default: /wavespeed-ai/wan-2.1/i2v-480p-lora" \
  --env WAVESPEED_WAIT_RESULT_TIMEOUT="total wait for result in seconds (default: 600)" \
  --env WAVESPEED_API_TEXT_TO_IMAGE_ENDPOINT="default: /wavespeed-ai/flux-dev" \
  --env WAVESPEED_API_IMAGE_TO_IMAGE_ENDPOINT="default: /wavespeed-ai/flux-kontext-pro" \
  -- python -m wavespeed_mcp
How to use
WavespeedMCP exposes WaveSpeed AI’s image and video generation capabilities via the MCP protocol. The server is driven through a Python CLI entry point (wavespeed_mcp) and supports environment-driven configuration as well as CLI arguments for API keys, hosts, and file paths. You can use the MCP interface to submit text-to-image, image-to-image, inpainting, and LoRA-enhanced generations, and then poll for progress and retrieve results in supported output modes (URL, local file, or base64). The server includes a modular tool definition set, optimized polling with progress tracking, and structured error handling to help integrate WaveSpeed’s capabilities into other apps or IDEs (e.g., Claude Desktop integration).
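The three output modes behave differently on the client side: url returns a link to fetch, local returns a path the server already wrote, and base64 returns inline-encoded bytes. As a minimal sketch (the function name and payload shape here are illustrative, not part of the WavespeedMCP API), a client might dispatch on the mode like this:

```python
import base64
from pathlib import Path


def handle_result(mode: str, payload: str, out_dir: str = ".") -> str:
    """Illustrative handler for the three WaveSpeed resource modes.

    mode="url":    payload is a download URL; return it for the caller to fetch.
    mode="local":  payload is a path on disk already written by the server.
    mode="base64": payload is base64-encoded bytes; decode and save locally.
    """
    if mode == "url":
        return payload
    if mode == "local":
        return str(Path(payload))
    if mode == "base64":
        out = Path(out_dir) / "result.bin"  # hypothetical output filename
        out.write_bytes(base64.b64decode(payload))
        return str(out)
    raise ValueError(f"unknown resource mode: {mode}")
```

The mode itself is selected server-side via WAVESPEED_API_RESOURCE_MODE, so a client typically only needs the branch matching its configured mode.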
How to install
Prerequisites:
- Python 3.11+
- WaveSpeed API key (obtain from WaveSpeed AI)
Install the package from PyPI:
pip install wavespeed-mcp
Optionally, clone the repository and install it in editable mode for development:
git clone https://github.com/WaveSpeedAI/mcp-server.git
cd mcp-server
pip install -e ".[dev]"
Configure and run the server (example):
# Export your API key as an environment variable (recommended)
export WAVESPEED_API_KEY=your_api_key_here
# Run the MCP server (uses the wavespeed_mcp CLI entry point)
wavespeed_mcp --api-key $WAVESPEED_API_KEY
If you prefer to invoke via Python module:
python -m wavespeed_mcp --api-key your_api_key_here
Optional: keep settings in a configuration file and pass it with --config, if the installed version supports that flag (check wavespeed_mcp --help).
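Since the server accepts both environment variables and CLI flags, a flag on the command line takes precedence over the corresponding variable. A minimal sketch of that precedence, assuming argparse-style flags (--api-key is documented above; --api-host is an assumed flag mirroring WAVESPEED_API_HOST):

```python
import argparse
import os


def build_config(argv=None):
    """Sketch of CLI-over-environment precedence for a wavespeed_mcp-style
    entry point. Flag names mirror the documented environment variables;
    the real CLI's flag set may differ."""
    parser = argparse.ArgumentParser(prog="wavespeed_mcp")
    # Each flag falls back to its environment variable when omitted.
    parser.add_argument("--api-key", default=os.environ.get("WAVESPEED_API_KEY"))
    parser.add_argument(
        "--api-host",
        default=os.environ.get("WAVESPEED_API_HOST", "https://api.wavespeed.ai"),
    )
    args = parser.parse_args(argv)
    if not args.api_key:
        parser.error("an API key is required (--api-key or WAVESPEED_API_KEY)")
    return args
```

With this layout, export WAVESPEED_API_KEY=... and a bare wavespeed_mcp invocation resolve to the same configuration as passing --api-key explicitly.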
Additional notes
- Environment variables provide flexible runtime configuration. Common variables include WAVESPEED_API_KEY, WAVESPEED_API_HOST, WAVESPEED_MCP_BASE_PATH, and WAVESPEED_LOG_FILE.
- Timeouts are configurable via WAVESPEED_REQUEST_TIMEOUT (per HTTP call) and WAVESPEED_WAIT_RESULT_TIMEOUT (total polling time).
- The resource output mode can be set to url, local, or base64 via WAVESPEED_API_RESOURCE_MODE.
- If you encounter connection or authentication issues, verify the API key scope and the host URL.
- For Claude Desktop integration, generate the required configuration using the provided CLI, then start WavespeedMCP before launching Claude Desktop.
- Logs can be directed to a file by setting WAVESPEED_LOG_FILE; the log format includes timestamps, source, level, and message.
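The two timeouts bound different things: WAVESPEED_REQUEST_TIMEOUT caps each HTTP call, while WAVESPEED_WAIT_RESULT_TIMEOUT caps the whole polling loop. A minimal sketch of that relationship (fetch_status is a hypothetical stand-in for one status request, not a WavespeedMCP function):

```python
import time


def wait_for_result(fetch_status, request_timeout=300, wait_timeout=600, interval=1.0):
    """Poll fetch_status() until completion or until the total deadline passes.

    request_timeout bounds each individual call (WAVESPEED_REQUEST_TIMEOUT);
    wait_timeout bounds the loop as a whole (WAVESPEED_WAIT_RESULT_TIMEOUT).
    """
    deadline = time.monotonic() + wait_timeout
    while time.monotonic() < deadline:
        # One status request, capped at the per-call timeout.
        status = fetch_status(timeout=request_timeout)
        if status.get("state") in ("completed", "failed"):
            return status
        time.sleep(interval)
    raise TimeoutError("result not ready within WAVESPEED_WAIT_RESULT_TIMEOUT")
```

A single slow request can therefore still fail fast on its own timeout even when the overall wait budget has time remaining.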
Related MCP Servers
mcp-vegalite
MCP server from isaacwasserman/mcp-vegalite-server
github-chat
A Model Context Protocol (MCP) for analyzing and querying GitHub repositories using the GitHub Chat API.
nautex
MCP server for guiding coding agents through an end-to-end requirements-to-implementation-plan pipeline
pagerduty
PagerDuty's official local MCP (Model Context Protocol) server which provides tools to interact with your PagerDuty account directly from your MCP-enabled client.
futu-stock
MCP server for Futu NiuNiu stock data
mcp-boilerplate
Boilerplate using one of the 'better' ways to build MCP Servers. Written using FastMCP