
Hippius Storage


@maxquick

npx machina-cli add skill @maxquick/hippius --openclaw
Files (1)

  • SKILL.md (3.5 KB)

Hippius Storage

Hippius is decentralized cloud storage on Bittensor SN75 with an S3-compatible API.

Recommended path: S3 endpoint (s3.hippius.com) — the public IPFS node is deprecated.

Quick Reference

Key                Value
S3 Endpoint        https://s3.hippius.com
S3 Region          decentralized
Access Key Format  hip_xxxxxxxxxxxx
Console            console.hippius.com
Python CLI         pip install hippius (requires self-hosted IPFS node)

Setup

  1. Get S3 credentials from console.hippius.com → Settings → API Keys
  2. Set environment variables:
    export HIPPIUS_S3_ACCESS_KEY="hip_your_access_key"
    export HIPPIUS_S3_SECRET_KEY="your_secret_key"
    
  3. Test: aws --endpoint-url https://s3.hippius.com --region decentralized s3 ls

Common Operations

Upload

aws --endpoint-url https://s3.hippius.com --region decentralized \
    s3 cp <file> s3://<bucket>/<key>

Download

aws --endpoint-url https://s3.hippius.com --region decentralized \
    s3 cp s3://<bucket>/<key> <local_path>

List buckets

aws --endpoint-url https://s3.hippius.com --region decentralized s3 ls

List objects

aws --endpoint-url https://s3.hippius.com --region decentralized s3 ls s3://<bucket>/ --recursive

Create bucket

aws --endpoint-url https://s3.hippius.com --region decentralized s3 mb s3://<bucket>

Sync directory

aws --endpoint-url https://s3.hippius.com --region decentralized \
    s3 sync ./local-dir/ s3://<bucket>/remote-dir/

Python (boto3)

import boto3
import os

s3 = boto3.client(
    's3',
    endpoint_url='https://s3.hippius.com',
    aws_access_key_id=os.environ['HIPPIUS_S3_ACCESS_KEY'],
    aws_secret_access_key=os.environ['HIPPIUS_S3_SECRET_KEY'],
    region_name='decentralized'
)

# Upload
s3.upload_file('local.txt', 'my-bucket', 'remote.txt')

# Download
s3.download_file('my-bucket', 'remote.txt', 'downloaded.txt')

# List
for obj in s3.list_objects_v2(Bucket='my-bucket').get('Contents', []):
    print(f"{obj['Key']} ({obj['Size']} bytes)")

Scripts

  • scripts/query_storage.py — Query S3 buckets/objects and RPC account info

Usage:

# List S3 buckets
python scripts/query_storage.py --s3-buckets

# List objects in bucket
python scripts/query_storage.py --s3-objects my-bucket

# Query blockchain credits (requires account address)
python scripts/query_storage.py --account 5Grwva... --credits

References

  • references/storage_guide.md — S3 vs IPFS comparison, code examples (Python, JS)
  • references/cli_commands.md — hippius CLI reference (requires self-hosted IPFS node)

Troubleshooting

"Public store.hippius.network has been deprecated"
Use S3 instead. The hippius CLI's IPFS commands require a self-hosted IPFS node.

S3 auth errors

  • Access key must start with hip_
  • Region must be decentralized (not us-east-1)
  • Endpoint must be https://s3.hippius.com
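The checklist above can be automated with a small stdlib-only preflight check (the function name and AWS_DEFAULT_REGION override check are illustrative, not part of the skill):

```python
import os

def check_hippius_config(env=os.environ):
    """Return a list of problems with the current Hippius S3 configuration.

    Covers the common misconfigurations from the checklist: a missing or
    malformed access key, a missing secret, and a region override that
    is not 'decentralized'.
    """
    problems = []
    access_key = env.get('HIPPIUS_S3_ACCESS_KEY', '')
    if not access_key:
        problems.append('HIPPIUS_S3_ACCESS_KEY is not set')
    elif not access_key.startswith('hip_'):
        problems.append("access key must start with 'hip_'")
    if not env.get('HIPPIUS_S3_SECRET_KEY'):
        problems.append('HIPPIUS_S3_SECRET_KEY is not set')
    # Region is usually passed explicitly; flag an environment override
    # that would silently point tools at the wrong region.
    if env.get('AWS_DEFAULT_REGION', 'decentralized') != 'decentralized':
        problems.append("region must be 'decentralized'")
    return problems
```

Run it before any transfer and refuse to proceed if the returned list is non-empty.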

Source

git clone https://clawhub.ai/maxquick/hippius

Overview

Hippius Storage provides decentralized cloud storage on Bittensor Subnet 75 with an S3-compatible API. Use the S3 endpoint s3.hippius.com to upload, query, and manage buckets and files; the bundled references compare the S3 path with IPFS alternatives. Credentials are configured via the Hippius Console.

How This Skill Works

Hippius exposes an S3-compatible interface that you interact with just like AWS S3. Obtain access and secret keys from the Hippius Console, export them as HIPPIUS_S3_ACCESS_KEY and HIPPIUS_S3_SECRET_KEY, and perform operations against https://s3.hippius.com with region decentralized. You can use the AWS CLI, Python boto3, or the provided scripts to list, upload, download, and sync data.

When to Use It

  • You need to upload files to Hippius storage from an application or local machine.
  • You want to check storage status or inventory by listing buckets and objects.
  • You must set up Hippius credentials and validate access with the endpoint.
  • You need to sync a local directory to a Hippius bucket or automate transfers.
  • You are comparing IPFS-based storage vs S3-compatible storage and deciding which to use.

Quick Start

  1. Get S3 credentials from console.hippius.com → Settings → API Keys.
  2. Set environment variables: export HIPPIUS_S3_ACCESS_KEY="hip_your_access_key"; export HIPPIUS_S3_SECRET_KEY="your_secret_key".
  3. Test access with: aws --endpoint-url https://s3.hippius.com --region decentralized s3 ls

Best Practices

  • Confirm your access key matches the hip_xxxxxxxxxxxx format; keys issued by the Console always start with hip_.
  • Always target the S3 endpoint https://s3.hippius.com and use the decentralized region.
  • Store credentials securely; export HIPPIUS_S3_ACCESS_KEY and HIPPIUS_S3_SECRET_KEY in a safe environment.
  • Validate bucket and object paths before operations to avoid accidental overwrites.
  • Leverage the Python boto3 example or scripts/query_storage.py to manage buckets, list objects, and verify credits.

Example Use Cases

  • Upload a local file to a bucket: aws --endpoint-url https://s3.hippius.com --region decentralized s3 cp local.txt s3://my-bucket/remote.txt
  • List all buckets: aws --endpoint-url https://s3.hippius.com --region decentralized s3 ls
  • List objects in a bucket recursively: aws --endpoint-url https://s3.hippius.com --region decentralized s3 ls s3://my-bucket/ --recursive
  • Sync a local directory: aws --endpoint-url https://s3.hippius.com --region decentralized s3 sync ./local-dir/ s3://my-bucket/remote-dir/
  • Query storage with Python: python scripts/query_storage.py --s3-buckets --s3-objects my-bucket
