icarus-cdk
Build MCP (Model Context Protocol) servers that run as Internet Computer canisters.
Add the server to Claude over stdio, for example:
    claude mcp add --transport stdio galenoshea-icarus-cdk -- docker run -i ghcr.io/galenoshea/icarus-cdk:latest
How to use
Icarus CDK lets you build MCP servers with persistent state that run as canisters on the Internet Computer. It combines the MCP protocol with ICP canister orchestration so tools retain memory across sessions, are globally accessible, and can reach external data sources via HTTP outcalls. With Icarus, you define tools (functions) in Rust, expose them through a canister, and let clients such as Claude, ChatGPT, or other AI agents call those tools while benefiting from built-in authentication and stable storage.
To use the server, deploy the MCP canister to ICP with the provided CLI workflows, then connect AI clients to the deployed canister. The project emphasizes persistent storage, HTTP outcalls for fetching external data, autonomous timers for scheduled tasks, and built-in security through Internet Identity and Candid interfaces. The included examples show how to define tools, enable tool discovery, and export the Candid interface for client generation. Clients interact with the MCP tools through the generated interface, and the bundled tooling manages canisters, starts the MCP server for testing, and runs in foreground or daemon mode depending on your deployment needs.
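To make the tool idea concrete, here is a pure-Rust sketch of the kind of stateful tool logic such a canister could expose. The MemoStore type and its method names are illustrative only; a real Icarus project would use the project's own macros and canister APIs, and state would live in stable storage rather than an in-process map:

```rust
use std::collections::HashMap;

/// Stand-in for a stateful MCP tool. In a real Icarus canister this state
/// would live in stable storage and these functions would be exposed as
/// canister methods; here it is plain Rust so it is easy to run locally.
pub struct MemoStore {
    memos: HashMap<String, String>,
}

impl MemoStore {
    pub fn new() -> Self {
        Self { memos: HashMap::new() }
    }

    /// Tool: save a memo under a key (state persists across calls).
    pub fn remember(&mut self, key: String, value: String) -> String {
        self.memos.insert(key.clone(), value);
        format!("stored memo under '{key}'")
    }

    /// Tool: recall a previously stored memo, if any.
    pub fn recall(&self, key: &str) -> Option<&String> {
        self.memos.get(key)
    }
}

fn main() {
    let mut store = MemoStore::new();
    // A client such as Claude would invoke these as MCP tool calls.
    println!("{}", store.remember("greeting".into(), "hello from ICP".into()));
    match store.recall("greeting") {
        Some(v) => println!("recalled: {v}"),
        None => println!("no memo found"),
    }
}
```

The point of the sketch is the shape: each tool is a function over shared state, and the canister is what makes that state durable and globally reachable.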
How to install
Prerequisites:
- Rust toolchain (cargo) installed: https://rust-lang.org/
- Docker (for the recommended container-based run): https://www.docker.com/
- Access to the Internet Computer (ICP) network and a wallet/identity configured for deployment
Option A — Install and run locally (Rust tooling and CLI required)
- Install the Icarus CLI and dependencies (example command):
    cargo install icarus-cli
- Create a new MCP project (if starting from scratch):
    icarus new my-icarus-tool
    cd my-icarus-tool
- Build and deploy to ICP according to the Icarus workflow in your project docs:
    icarus deploy
- Start the MCP server locally for testing (in the foreground):
    icarus mcp start <canister-id>
Option B — Run via Docker (recommended for ease of setup)
- Ensure Docker is installed and running.
- Run the MCP server container (example): docker run -it --rm ghcr.io/galenoshea/icarus-cdk:latest
- Follow the on-screen prompts or use your deployment scripts to connect to ICP.
Option C — Using a prebuilt Docker image in production
- Pull and run the image with your deployment configuration:
    docker run -d --name icarus-cdk \
      -e IC_IDENTITY=<identity> \
      -e ICP_NETWORK=<network> \
      ghcr.io/galenoshea/icarus-cdk:latest
- Use your MCP client tooling to manage canister deployment and tool discovery as described in the project docs.
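The same production setup can also be captured in a Compose file so the configuration is versioned alongside your project. This is a sketch only: the image tag and the IC_IDENTITY/ICP_NETWORK variable names come from the docker run example above, and the bracketed values are placeholders you must supply:

```yaml
services:
  icarus-cdk:
    image: ghcr.io/galenoshea/icarus-cdk:latest
    container_name: icarus-cdk
    restart: unless-stopped
    environment:
      # Placeholder values; substitute your own identity and target network.
      IC_IDENTITY: "<identity>"
      ICP_NETWORK: "<network>"
```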
Additional notes
Tips and common considerations:
- Ensure your ICP identity has the required permissions for canister deployment and management.
- If you use Docker, pass your identity and network configuration into the container, for example as environment variables or mounted files.
- The MCP server relies on persistent storage; ensure your deployment environment provides stable storage (canister state on ICP).
- Review the README’s examples to understand how to define tools, enable tool discovery, and export the Candid interface for clients.
- When upgrading, verify that tool-macro and canister (Candid) interface versions remain compatible to avoid breaking changes in client code.
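For orientation, an exported Candid interface for a simple tool canister might look roughly like the following. The method names and signatures here are hypothetical; your actual .did file is generated from your own tool definitions:

```candid
// Hypothetical interface; generated from your own tool definitions.
service : {
  // Store a value under a key and return a confirmation message.
  remember : (text, text) -> (text);
  // Look up a previously stored value, if present.
  recall : (text) -> (opt text) query;
}
```

Clients use this generated interface to discover and call the canister's tools, which is why keeping it stable across upgrades matters.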