
ASI

Android ✧ Gemma Integration into Android System Intelligence

Installation
Run this command in your terminal to add the MCP server to Claude Code:

# ASI_DATA_DIR defaults to the app data directory;
# set ASI_LOG_LEVEL=DEBUG for verbose logs.
claude mcp add --transport stdio vneel-code-asi \
  --env ASI_DATA_DIR="path/to/data" \
  --env ASI_LOG_LEVEL="INFO" \
  -- python -m asi.server
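If your MCP client reads a JSON configuration file rather than using the claude CLI, the equivalent entry might look like the following sketch. The server name and env keys mirror the command above; the exact config schema (here an mcpServers-style layout) is an assumption and depends on your client.

```python
import json

# Hypothetical mcpServers-style config entry mirroring the install command.
# ASI_DATA_DIR and ASI_LOG_LEVEL are the env vars the ASI server reads.
config = {
    "mcpServers": {
        "vneel-code-asi": {
            "transport": "stdio",
            "command": "python",
            "args": ["-m", "asi.server"],
            "env": {
                "ASI_DATA_DIR": "path/to/data",  # defaults to the app data directory
                "ASI_LOG_LEVEL": "INFO",         # DEBUG for verbose logs
            },
        }
    }
}

print(json.dumps(config, indent=2))
```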

How to use

ASI implements an on-device AI coordination server that exposes a local MCP endpoint through which other agents can communicate with Gemma 3n and the device's tools. The server coordinates the Gemma 3n model running on Android and provides interfaces for tool use, memory logging, and diary-style reflections. With this MCP server running locally, you can orchestrate agents, perform on-device searches, launch apps, access system information, and manage persistent memories without leaving the device. When interacting with ASI, you are effectively talking to a local, sovereign AI stack that can operate offline and, if you enable online coordination, still cooperate with cloud agents.

To use it, start the ASI MCP server via the configured command. Other MCP clients can then connect to the local endpoint exposed by ASI to request agent actions, retrieve diary/memory entries, or trigger on-device tools (web search, app launches, clipboard access, alarms, etc.). The Gemma integration is exposed through the MCP interface, so agents can request on-device perception (images, audio) and grounding context to reason and respond within the local context of your device.
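Since MCP is a JSON-RPC 2.0 protocol, a client request to one of ASI's tools can be sketched as a plain message. This is a minimal illustration, not the ASI API: the tool name "web_search" and its arguments are hypothetical, and a real client would discover the actual tool names from the server's tools/list response before calling anything.

```python
import json

def make_tool_call(request_id, tool_name, arguments):
    """Build a JSON-RPC 2.0 request for an MCP tools/call."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

# "web_search" is a hypothetical ASI tool name; query the server's
# tools/list endpoint to see what it actually exposes.
req = make_tool_call(1, "web_search", {"query": "local weather"})
print(json.dumps(req))
```

An MCP client library would normally build and send these messages for you; the sketch only shows the wire shape a stdio client exchanges with the server.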

How to install

Prerequisites:

  • Android device with on-device Gemma 3n model support
  • A Python runtime on the device capable of executing the MCP server
  • Access to the ASI repository and any Android permissions required for tool access

  1. Install dependencies
  • Ensure Python is available on the device, or install it via the platform's package manager
  • Install any Python packages the ASI server requires (if a requirements file exists in the repo)
  2. Configure the environment
  • Set ASI_DATA_DIR to a writable directory on the device for memory logs and diaries
  • Optionally set ASI_LOG_LEVEL to DEBUG for verbose logs during setup
  3. Run the MCP server
  • From the device, execute the server as defined in the MCP config: python -m asi.server
  • Verify that the server starts and binds to the expected local MCP endpoint
  4. Connect MCP clients
  • Point your MCP clients at the local endpoint exposed by ASI (as defined in your environment or client configuration)
  • Authenticate if your setup uses local security tokens or ACLs
  5. Validate operation
  • Send a simple command to verify tool access (e.g., trigger a web search or a diary entry)
  • Check the logs for runtime errors and confirm that Gemma 3n on-device inference runs without overheating or throttling
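A couple of the checks above can be automated before launching the server. The sketch below verifies that the asi.server module is importable and resolves the data directory from ASI_DATA_DIR; the fallback to the current directory is an assumption for the sketch (the real server defaults to the app data directory).

```python
import importlib.util
import os

def asi_server_available():
    """Return True if the asi.server module can be imported.

    Worth running before `python -m asi.server`: if this returns False,
    the MCP endpoint will never come up.
    """
    try:
        return importlib.util.find_spec("asi.server") is not None
    except ModuleNotFoundError:
        # The parent `asi` package itself is missing.
        return False

def data_dir():
    """Resolve ASI_DATA_DIR, falling back to the current directory.

    The fallback is an assumption for this sketch; the real server
    defaults to the app data directory.
    """
    return os.environ.get("ASI_DATA_DIR", os.getcwd())

print(f"asi.server importable: {asi_server_available()}")
print(f"data dir: {data_dir()}")
```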

Additional notes

Tips and common issues:

  • Ensure the device has sufficient thermal headroom when running on-device inference; monitor the thermal state and adjust the inference rate if needed
  • Keep Gemma 3n updated to benefit from improvements in on-device reasoning and tool integration
  • If you enable online coordination with cloud agents, ensure network permissions and battery optimizations allow background activity
  • Use ASI_LOG_LEVEL=DEBUG temporarily if you encounter issues; revert to INFO for normal operation to reduce log noise
  • Ensure the ASI_DATA_DIR has enough space for diary/memory entries and perception logs
  • If the MCP endpoint cannot be reached, verify that the Python module asi.server is present and that the device’s IPC permissions are not restricted by the OS
  • For privacy-conscious setups, confirm that all data remains on-device and that any optional cloud components are explicitly disabled
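The disk-space tip above can be checked programmatically. This is a small sketch using the standard library; the 100 MB threshold is an arbitrary assumption, so tune it to your diary/memory and perception-log volume.

```python
import os
import shutil

def data_dir_headroom(path=None, minimum_mb=100):
    """Report free space in the ASI data directory.

    Returns (free_mb, ok): free megabytes at `path` and whether that
    meets `minimum_mb`. The fallback to the current directory and the
    100 MB threshold are assumptions for this sketch.
    """
    path = path or os.environ.get("ASI_DATA_DIR", os.getcwd())
    free_mb = shutil.disk_usage(path).free // (1024 * 1024)
    return free_mb, free_mb >= minimum_mb

free, ok = data_dir_headroom()
print(f"{free} MB free, sufficient: {ok}")
```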
