yomo
🦖 Serverless AI Agent Framework with Geo-distributed Edge AI Infra.
claude mcp add --transport stdio yomorun-yomo cargo run --bin yomo serve \
  --env RUST_LOG="debug" \
  --env RUST_BACKTRACE="1"
How to use
YoMo is a fast stream processing server written in Rust. It runs as a command-line binary that hosts serverless functions and streaming workflows. You build the project with cargo and run the yomo executable, which exposes a runtime for named serverless functions (sfn) and streaming pipelines. You can deploy and invoke serverless functions by name, and you can run streaming endpoints that continuously process data or batch-like flows. The README shows how to start the server, run sample functions such as an uppercase transformer, and send requests to those functions over HTTP.

To use it, start the server with the serve command, then deploy and invoke serverless functions with the sfn commands. For example, you can run a function named uppercase by pointing YoMo at the function's directory, and you can start an SSE (server-sent events) stream for continuous data processing. Requests are sent as JSON payloads to the /sfn/{name} endpoint, and responses reflect the function logic defined in your serverless projects. This setup lets you build and test lightweight data-processing workflows with minimal orchestration.
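As a sketch, a request to the uppercase function could be assembled with Python's standard library like this. The /sfn/{name} path comes from the description above; the {"data": ...} payload shape is an assumption, so adjust it to whatever JSON your function actually expects:

```python
import json
import urllib.request

# Endpoint for the sample "uppercase" sfn on the default local address.
url = "http://127.0.0.1:9001/sfn/uppercase"

# Assumed payload shape -- substitute the JSON your function expects.
body = json.dumps({"data": "hello yomo"}).encode("utf-8")

req = urllib.request.Request(
    url, data=body, headers={"Content-Type": "application/json"}
)

# Supplying data makes urllib issue a POST. With the server running:
#   with urllib.request.urlopen(req) as resp:
#       print(resp.read().decode())
print(req.get_method())  # POST
```

The same request works from curl or any HTTP client; only the endpoint path and a JSON body are required.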
How to install
Prerequisites:
- Rust and Cargo installed (https://www.rust-lang.org/tools/install)
- Basic familiarity with the command line
Installation steps:
1. Install Rust and Cargo if you don't have them:
   - On Windows/macOS/Linux, follow the official Rust installation guide: https://www.rust-lang.org/tools/install
2. Clone the MCP YoMo repository (or download the source):
   - git clone https://github.com/your-org/yomorun-yomo.git
   - cd yomorun-yomo
3. Build the project:
   - cargo build
4. Run the server (as shown in the README examples):
   - RUST_LOG=debug ./target/debug/yomo serve
5. Optional: run a sample sfn (serverless function):
   - RUST_LOG=info ./target/debug/yomo run --name uppercase ./serverless/go/uppercase
Notes:
- Ensure port 9001 is available or configure networking as needed for your environment.
- You can adapt the serverless function paths to your own functions and languages by following the repository’s examples.
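Since the server defaults to port 9001, a quick way to confirm the port is free before starting is a small stdlib check. This is a hedged helper for local troubleshooting, not part of YoMo itself:

```python
import socket

def port_free(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if nothing is currently accepting connections on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(0.5)
        # connect_ex returns 0 on success, i.e. something is already listening.
        return s.connect_ex((host, port)) != 0

if port_free(9001):
    print("port 9001 is free -- safe to start `yomo serve`")
else:
    print("port 9001 is in use -- stop the listener or use a different port")
```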
Additional notes
Tips and common issues:
- Make sure to set RUST_LOG to control log verbosity (debug for development, info for normal operation).
- The HTTP endpoint for serverless functions is typically http://127.0.0.1:9001/sfn/{function-name}; adjust as needed for your deployment.
- The SSE stream example shows how to subscribe to a streaming function; make sure your client supports EventSource or an equivalent SSE mechanism.
- If you modify serverless functions, re-run the server or rebuild to pick up changes.
- For production, consider configuring TLS termination and proper port exposure, and explore environment variable options in your deployment setup.
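Clients without EventSource can still consume an SSE stream by splitting the text/event-stream body on blank lines. The following is a minimal sketch of that framing, using made-up sample frames rather than real YoMo output:

```python
def parse_sse(raw: str) -> list:
    """Extract the data payloads from a text/event-stream body.

    Events are separated by a blank line; each `data:` line carries one
    chunk of the payload (multi-line data is rejoined with newlines).
    """
    events = []
    for frame in raw.strip().split("\n\n"):
        data = [line[6:] for line in frame.splitlines() if line.startswith("data: ")]
        if data:
            events.append("\n".join(data))
    return events

# Illustrative frames only -- not actual output from a YoMo stream function.
sample = "data: HELLO\n\ndata: WORLD\n\n"
print(parse_sse(sample))  # ['HELLO', 'WORLD']
```

In practice you would read the HTTP response incrementally and apply the same framing chunk by chunk rather than on a complete string.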
Related MCP Servers
claude-talk-to-figma
A Model Context Protocol (MCP) that allows Claude Desktop and other AI tools (Claude Code, Cursor, Antigravity, etc.) to read, analyze, and modify Figma designs
google-ai-mode
MCP server for free Google AI Mode search with citations. Query optimization, CAPTCHA handling, multi-agent support. Works with Claude Code, Cursor, Cline, Windsurf.
crawl4ai
🕷️ A lightweight Model Context Protocol (MCP) server that exposes Crawl4AI web scraping and crawling capabilities as tools for AI agents. Similar to Firecrawl's API but self-hosted and free. Perfect for integrating web scraping into your AI workflows with OpenAI Agents SDK, Cursor, Claude Code, and other MCP-compatible tools.
omega-memory
Persistent memory for AI coding agents
mcp-install-instructions-generator
Generate MCP Server Installation Instructions for Cursor, Visual Studio Code, Claude Code, Claude Desktop, Windsurf, ChatGPT, Gemini CLI and more
statelessagent
Your AI forgets everything between sessions. SAME fixes that. Local-first, no API keys, single binary.