gaianet-node
Install, run and deploy your own decentralized AI agent service
claude mcp add --transport stdio gaianet-ai-gaianet-node node path/to/server.js
How to use
GaiaNet Node provides an out-of-the-box AI agent node that you can run locally to host and chat with a conversational AI model. The node downloads models, sets up a vector database, and exposes a web interface for interacting with the AI agent. Follow the quick-start flow to install, initialize, and run the node, then open the URL it prints to chat with the AI. Commands are provided to initialize the node with a specific config, start and stop it, and update its configuration, which makes it well suited to local development and experimentation with different models and knowledge bases.
Once installed, initialize the node with gaianet init, which downloads the models, vector embeddings, and a Qdrant instance. Start the node with gaianet start to bring up the WasmEdge runtime, the QA/RAG components, and the web dashboard. The node's address is printed to the console; open it in a browser to view node information and chat with the AI agent. Stop the node gracefully with gaianet stop and update its configuration with gaianet config. For advanced usage, pass a pre-set configuration URL to gaianet init to bootstrap the node with a particular model and knowledge base, or run gaianet start --local-only to expose services only on localhost.
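Once the node is running, its rag-api-server component serves an OpenAI-compatible chat endpoint alongside the web dashboard. The sketch below shows how one might query it from Python using only the standard library. The base URL, model name, and port are assumptions for illustration; substitute the address printed by gaianet start.

```python
import json
import urllib.request

# Hypothetical node address; replace with the URL printed by `gaianet start`.
NODE_URL = "http://localhost:8080"


def build_chat_request(prompt: str, model: str = "default") -> dict:
    """Build an OpenAI-style chat-completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


def chat(prompt: str) -> str:
    """Send one chat turn to the node and return the assistant's reply."""
    payload = json.dumps(build_chat_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        f"{NODE_URL}/v1/chat/completions",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # OpenAI-compatible responses carry the reply under choices[0].message.
    return body["choices"][0]["message"]["content"]
```

Because the API follows the OpenAI wire format, existing OpenAI client libraries can also be pointed at the node by overriding their base URL.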
How to install
Prerequisites:
- curl and bash available on your system (Linux/macOS/WSL).
- A compatible environment for running WasmEdge and Qdrant if you plan to use full features.
- Optional: Docker if you prefer containerized setup.
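Before piping the install script to bash, it can be worth confirming the required tools are actually on PATH. A minimal POSIX-shell sanity check (the tool list mirrors the prerequisites above; extend it with docker if you want the containerized setup):

```shell
# Fail fast if a prerequisite tool is missing from PATH.
for tool in curl bash; do
  command -v "$tool" >/dev/null 2>&1 || { echo "missing: $tool" >&2; exit 1; }
done
echo "prerequisites satisfied"
```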
Install steps:
- Quick install script (default stack):
curl -sSfL 'https://github.com/GaiaNet-AI/gaianet-node/releases/latest/download/install.sh' | bash
This installs the default GaiaNet Node stack into the user’s home directory (by default under $HOME/gaianet).
- Initialize the node (downloads models, embeddings, and QA components):
gaianet init
Output will show progress downloading models, embeddings, and Qdrant setup.
- Start the node:
gaianet start
The command prints the node’s public or local URL where you can access the dashboard and chat with the AI agent.
- Optional: Stop or reconfigure:
gaianet stop
gaianet config --chat-url "https://example.com/model.gguf"
If you prefer installing in an alternative base directory, re-run the install script with the --base flag:
curl -sSfL 'https://raw.githubusercontent.com/GaiaNet-AI/gaianet-node/main/install.sh' | bash -s -- --base $HOME/gaianet.alt
Additional notes
- The node uses WasmEdge, Qdrant, and a rag-api-server as part of its stack. Ensure you have sufficient disk space for model files and embeddings.
- The init step reads configuration from $HOME/gaianet/config.json by default; use --config <URL> to supply a different config, or --base to point at another installation directory.
- To run locally only (no public GaiaNet domain exposure), start with gaianet start --local-only.
- If you encounter network or download failures, re-run the install or init steps to retry downloading the large assets. Compare your console output against the sample outputs in the project documentation to confirm each step completed successfully.
- You can customize the initial configuration by using gaianet init --config <config.json URL> for pre-set models and knowledge bases.
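For orientation, a config.json typically pins the model and knowledge-base assets by URL, which is what a pre-set config passed to gaianet init provides. The exact schema is defined by the gaianet CLI; the fragment below is only an illustrative sketch, and every field name and URL in it is an assumption. Consult your installation's $HOME/gaianet/config.json for the authoritative keys.

```json
{
  "chat": "https://example.com/chat-model.gguf",
  "embedding": "https://example.com/embedding-model.gguf",
  "snapshot": "https://example.com/knowledge-base.snapshot",
  "system_prompt": "You are a helpful assistant."
}
```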
Related MCP Servers
bytebot
Bytebot is a self-hosted AI desktop agent that automates computer tasks through natural language commands, operating within a containerized Linux desktop environment.
cursor-talk-to-figma
TalkToFigma: MCP integration between AI Agent (Cursor, Claude Code) and Figma, allowing Agentic AI to communicate with Figma for reading designs and modifying them programmatically.
solace-agent-mesh
An event-driven framework designed to build and orchestrate multi-agent AI systems. It enables seamless integration of AI agents with real-world data sources and systems, facilitating complex, multi-step workflows.
DeepMCPAgent
Model-agnostic plug-n-play LangChain/LangGraph agents powered entirely by MCP tools over HTTP/SSE.
bytechef
Open-source, AI-native, low-code platform for API orchestration, workflow automation, and AI agent integration across internal systems and SaaS products.
argo
ARGO is an open-source AI agent platform that brings Local Manus to your desktop. With one-click model downloads, seamless closed-LLM integration, and offline-first RAG knowledge bases, ARGO becomes a DeepResearch powerhouse for autonomous thinking and task planning, while 100% of your data stays local. Supports Win/Mac/Docker.