# mcp-fal
A Model Context Protocol (MCP) server for interacting with fal.ai models and services.
```
claude mcp add --transport stdio am0y-mcp-fal python main.py \
  --env FAL_KEY="your_fal_api_key_here"
```
## How to use
This MCP server exposes fal.ai models and services to MCP clients. Its tools cover:

- `models` — list available models
- `search` — search models by keyword
- `schema` — fetch a model's OpenAPI schema
- `generate` — generate content with a chosen model, directly or through the request queue
- `result`, `status`, `cancel` — fetch, monitor, or cancel queued requests
- `upload` — upload files to the fal.ai CDN

The tools are designed for MCP clients such as Claude Desktop, VS Code integrations, or other MCP-enabled runners, which spawn the server as a child process and communicate with it over stdio transport.
To use it, run the server locally with a suitable Python interpreter and the `FAL_KEY` environment variable set to your fal.ai API key. Then configure your MCP client to launch the server, supplying the command to start the Python process, the path to `main.py`, and the `FAL_KEY` environment variable. The server listens on stdin/stdout and answers the client with standard MCP protocol messages.
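As a concrete illustration of the stdio transport, the sketch below builds the JSON-RPC 2.0 `initialize` request that an MCP client writes to the server's stdin when it first connects. The message shape follows the MCP specification; the protocol version string and the minimal capabilities payload here are assumptions for illustration, not values mandated by mcp-fal.

```python
import json


def make_initialize_request(request_id: int = 1) -> str:
    """Build the JSON-RPC 2.0 `initialize` message an MCP client
    sends over the spawned server's stdin (newline-delimited JSON)."""
    msg = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            # Assumed spec revision; clients and servers negotiate this.
            "protocolVersion": "2024-11-05",
            "capabilities": {},
            "clientInfo": {"name": "example-client", "version": "0.1"},
        },
    }
    return json.dumps(msg)
```

An MCP client sends this line, reads the server's `initialize` response from stdout, and only then starts calling tools such as `models` or `generate`.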
## How to install
Prerequisites:
- Python 3.10+ installed
- Access to fal.ai API and an API key
- git (optional, for cloning the repository)
Manual installation (recommended):

1. Clone the repository and navigate into it:

   ```
   git clone https://github.com/am0y/mcp-fal.git
   cd mcp-fal
   ```

2. Create a Python virtual environment and activate it:

   ```
   python -m venv venv

   # Windows
   venv\Scripts\activate

   # Linux/Mac
   source venv/bin/activate
   ```

3. Install dependencies:

   ```
   venv/Scripts/pip install -r requirements.txt   # Windows
   venv/bin/pip install -r requirements.txt       # Linux/Mac
   ```

4. Set your fal.ai API key as an environment variable:

   ```
   # Linux/Mac
   export FAL_KEY="YOUR_FAL_API_KEY_HERE"

   # Windows (PowerShell)
   $env:FAL_KEY = "YOUR_FAL_API_KEY_HERE"
   ```

5. Run the server directly (it waits for stdio input):

   ```
   venv/Scripts/python main.py   # Windows
   venv/bin/python main.py       # Linux/Mac
   ```
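Since a missing `FAL_KEY` only surfaces later as failed API calls, it can help to fail fast at startup. A minimal preflight sketch (the helper name is hypothetical, not part of mcp-fal):

```python
import os
import sys


def require_fal_key() -> str:
    """Return the fal.ai API key from the environment, or exit with a
    clear error instead of failing later in the middle of a request."""
    key = os.environ.get("FAL_KEY", "").strip()
    if not key:
        sys.exit("FAL_KEY is not set; export your fal.ai API key first.")
    return key
```

Calling a check like this at the top of the server's entry point turns a confusing mid-session failure into an immediate, actionable error message.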
Docker installation (experimental):

1. Clone the repository as described above.

2. Copy the environment template and add your API key:

   ```
   cp .env.example .env
   # Edit .env and add your fal.ai API key
   ```

3. Start the server with Docker Compose:

   ```
   docker-compose up -d
   ```
Note: The Docker method is experimental and not suitable for MCP integration, as MCP servers require stdio transport from the client.
## Additional notes
- The MCP server expects a working fal.ai API key via the FAL_KEY environment variable. You can also reference a system-wide environment variable or a .env file if supported by your runtime.
- This server uses stdio transport. Do not run it as a background service without an active stdin connection if you plan to integrate with MCP clients such as Claude Desktop or VS Code/Antigravity.
- The API includes tools for models, search, schema retrieval, generate (with optional queueing), result/status/cancel for queued jobs, and upload to fal.ai CDN. When using queue-enabled generation, monitor status and fetch results via the provided result/status endpoints.
- If you plan to run locally with Docker, be aware that the Docker setup is not compatible with MCP client spawning behavior and is provided for experimental/educational purposes only.
- For security, consider referencing FAL_KEY via environment variable expansion in the MCP client configuration rather than hardcoding in scripts.
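To illustrate the last point, a client-side launcher can forward `FAL_KEY` from the parent environment at configuration time instead of embedding the key in a file. The sketch below builds such a server entry; the field names follow the common Claude Desktop config shape, which is an assumption here, not part of mcp-fal itself.

```python
import os


def build_server_config() -> dict:
    """Build an MCP client server entry that forwards FAL_KEY from the
    parent environment rather than hardcoding the key in the config."""
    return {
        "mcpServers": {
            "fal": {
                "command": "python",
                "args": ["main.py"],
                # Pulled from the environment at build time, never
                # written as a literal secret in source control.
                "env": {"FAL_KEY": os.environ.get("FAL_KEY", "")},
            }
        }
    }
```

Serializing this dict to JSON yields a config the client can consume, with the secret living only in the environment of whatever process generates it.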
## Related MCP Servers
mcp-vegalite
MCP server from isaacwasserman/mcp-vegalite-server
github-chat
A Model Context Protocol (MCP) for analyzing and querying GitHub repositories using the GitHub Chat API.
nautex
MCP server for guiding Coding Agents via end-to-end requirements to implementation plan pipeline
pagerduty
PagerDuty's official local MCP (Model Context Protocol) server which provides tools to interact with your PagerDuty account directly from your MCP-enabled client.
futu-stock
MCP server for Futu NiuNiu stock
mcp-boilerplate
Boilerplate using one of the 'better' ways to build MCP servers, written using FastMCP.