offeryn
Build tools for LLMs in Rust using Model Context Protocol
claude mcp add --transport stdio avahowell-offeryn cargo run --bin offeryn \
  --env RUST_LOG="info" \
  --env MCP_SERVER_NAME="offeryn" \
  --env MCP_SERVER_VERSION="1.0.0"
How to use
offeryn is a Rust implementation of the Model Context Protocol (MCP) designed to expose tools written in Rust to AI agents. It provides a core MCP server with JSON-RPC for invoking tools, and transport options including Stdio and Server-Sent Events (SSE) to connect to clients such as Claude Desktop or other MCP-enabled agents. The library expects you to define tools via the mcp_tool procedural macro, which automatically generates JSON-RPC-compatible tool endpoints and corresponding JSON schemas for input validation. With these tools registered on an MCP server, you can expose a rich set of Rust-based capabilities (math operations, data processing, domain-specific logic, etc.) to agents as callable remote tools. Clients discover available tools through the generated tool metadata and invoke them using standard JSON-RPC methods, receiving results or errors in a structured format.
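Concretely, discovery and invocation follow the MCP JSON-RPC methods tools/list and tools/call. A tools/call request might look like the following (the "add" tool and its arguments are illustrative, not tools shipped by offeryn):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "add",
    "arguments": { "a": 2, "b": 3 }
  }
}
```

A successful response returns the result as structured content, e.g. `{"jsonrpc": "2.0", "id": 1, "result": {"content": [{"type": "text", "text": "5"}]}}`, while failures come back as standard JSON-RPC error objects.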
To use offeryn, start the server (via cargo run in the project) and connect a client to the configured transport. For the Stdio transport example, the server reads from standard input and writes to standard output, allowing integration with CLI workflows or desktop clients. For the SSE transport example, you can serve tool endpoints over HTTP as an SSE stream, enabling web-based or long-lived client connections. Tools are defined with the mcp_tool attribute, and each tool method becomes a remote-callable operation with a clear description and input schema derived from Rust types. This makes it straightforward to expose complex logic with robust type safety and automatic JSON schema generation.
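The shape of what such a macro produces can be sketched in plain, std-only Rust: each annotated method becomes a named tool carrying a description, a JSON schema for its inputs, and a callable handler. Everything below (the Tool struct, the registry, the "add" tool) is an illustrative hand-written sketch, not offeryn's actual generated code:

```rust
use std::collections::HashMap;

// Illustrative sketch of what an mcp_tool-style macro conceptually
// expands to: a tool record with metadata plus an invocable handler.
struct Tool {
    name: &'static str,
    description: &'static str,
    // JSON schema describing the tool's arguments (hand-written here;
    // offeryn derives this from the Rust parameter types).
    input_schema: &'static str,
    handler: fn(i64, i64) -> i64,
}

fn registry() -> HashMap<&'static str, Tool> {
    let mut tools = HashMap::new();
    tools.insert(
        "add",
        Tool {
            name: "add",
            description: "Add two numbers",
            input_schema:
                r#"{"type":"object","properties":{"a":{"type":"integer"},"b":{"type":"integer"}}}"#,
            handler: |a, b| a + b,
        },
    );
    tools
}

fn main() {
    let tools = registry();
    let tool = &tools["add"];
    // A server dispatches a tools/call request by name to the handler;
    // the schema (tool.input_schema) is what clients use to validate input.
    println!("{} ({}) -> {}", tool.name, tool.description, (tool.handler)(2, 3));
    // prints: add (Add two numbers) -> 5
}
```

A real server would deserialize the JSON-RPC arguments against the schema before calling the handler; the macro's value is generating that plumbing from ordinary typed Rust methods.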
How to install
Prerequisites:
- Rust toolchain (rustup, cargo)
- Basic familiarity with Rust and Cargo projects
Step 1: Install Rust
- If you don’t have Rust installed, install it from https://rustup.rs/
Step 2: Build or install the offeryn server
- Clone the repository:
  git clone https://github.com/your-org/offeryn.git
  cd offeryn
- Build a release binary (it lands in target/release/offeryn):
  cargo build --release
- Or build and run in one step:
  cargo run --bin offeryn
Step 3: Run with cargo (development flow)
- Ensure dependencies are up to date:
  cargo update
- Run the server:
  cargo run --bin offeryn
Step 4: Optional - Docker build (if you provide a Dockerfile)
- Build the image: docker build -t offeryn:latest .
- Run the container: docker run -i offeryn:latest
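If you do add a Dockerfile, a minimal multi-stage build could look like the following sketch (base images, paths, and the binary name are assumptions, since the project does not ship a Dockerfile):

```dockerfile
# Build stage: compile the release binary
FROM rust:1 AS builder
WORKDIR /app
COPY . .
RUN cargo build --release --bin offeryn

# Runtime stage: ship only the compiled binary
FROM debian:bookworm-slim
COPY --from=builder /app/target/release/offeryn /usr/local/bin/offeryn
ENTRYPOINT ["/usr/local/bin/offeryn"]
```

Running with `docker run -i` keeps stdin open, which is what the Stdio transport needs.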
Notes:
- The server exposes tools via the mcp_tool attribute. Ensure your Rust tool types implement Send + Sync as required by the async runtime.
- For production, consider configuring logging (RUST_LOG) and secure transport (e.g., SSE over HTTPS or other MCP transports) as appropriate.
Additional notes
Tips and common considerations:
- Tool metadata is generated at compile time from the mcp_tool-decorated methods, including a JSON schema for inputs. Leverage this to guide client tooling and UI generation.
- With the Stdio transport enabled, clients such as Claude Desktop can spawn the binary and speak MCP over its stdin/stdout, which makes wiring it into CLI workflows straightforward.
- For SSE transport, you can expose a lightweight HTTP server that serves the MCP tool endpoints; ensure firewall and port configuration matches your deployment environment.
- Environment variables can be used to tune server behavior (e.g., MCP server name and version). Consider exposing configuration via a file or environment for easier deployment.
- If you extend with async I/O or long-running tools, ensure proper cancellation and timeout handling to avoid stalled RPCs.
- If you encounter JSON schema mismatches, verify the tool signatures and ensure types derive or implement Serialize/Deserialize as needed for JSON-RPC payloads.
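The timeout concern above can be illustrated with std-only primitives. In an async server you would typically reach for something like tokio::time::timeout instead; this sketch (the call_with_timeout helper is hypothetical, not part of offeryn) just shows the idea that a stalled tool must not stall the RPC:

```rust
use std::sync::mpsc;
use std::thread;
use std::time::Duration;

// Run a potentially long-running "tool" on a worker thread and give up
// after a deadline. Hypothetical helper for illustration; an async
// server would use tokio::time::timeout or similar.
fn call_with_timeout<T: Send + 'static>(
    f: impl FnOnce() -> T + Send + 'static,
    timeout: Duration,
) -> Result<T, &'static str> {
    let (tx, rx) = mpsc::channel();
    thread::spawn(move || {
        let _ = tx.send(f()); // ignore the error if the caller gave up
    });
    rx.recv_timeout(timeout).map_err(|_| "tool call timed out")
}

fn main() {
    // A fast tool completes within the deadline...
    let ok = call_with_timeout(|| 2 + 3, Duration::from_millis(200));
    assert_eq!(ok, Ok(5));

    // ...while a slow one is abandoned. Note the worker thread itself
    // keeps running: true cancellation needs cooperative checks inside
    // the tool body.
    let slow = call_with_timeout(
        || {
            thread::sleep(Duration::from_millis(500));
            42
        },
        Duration::from_millis(50),
    );
    assert!(slow.is_err());
    println!("timeout handling ok");
}
```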