
rclone


npx machina-cli add skill everyinc/compound-engineering-plugin/rclone --openclaw

rclone File Transfer Skill

Setup Check (Always Run First)

Before any rclone operation, verify installation and configuration:

# Check if rclone is installed
command -v rclone >/dev/null 2>&1 && echo "rclone installed: $(rclone version | head -1)" || echo "NOT INSTALLED"

# List configured remotes
rclone listremotes 2>/dev/null || echo "NO REMOTES CONFIGURED"
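The two checks above can be wrapped in a small reusable preflight. A sketch (the names check_tool and rclone_ready are our own, not part of rclone):

```shell
#!/bin/sh
# check_tool NAME: report whether NAME is on PATH; exit 0 if installed.
check_tool() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "$1 installed"
  else
    echo "$1 NOT INSTALLED"
    return 1
  fi
}

# rclone_ready: succeed only if rclone is installed AND at least
# one remote is configured.
rclone_ready() {
  check_tool rclone || return 1
  remotes=$(rclone listremotes 2>/dev/null)
  if [ -n "$remotes" ]; then
    echo "remotes configured: $remotes"
  else
    echo "NO REMOTES CONFIGURED"
    return 1
  fi
}
```

Run rclone_ready at the start of any transfer script and abort on a non-zero status.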

If rclone is NOT installed

Guide the user to install:

# macOS
brew install rclone

# Linux (official install script; review the script before piping
# it to a privileged shell, or prefer a package manager below)
curl https://rclone.org/install.sh | sudo bash

# Or via package manager
sudo apt install rclone  # Debian/Ubuntu
sudo dnf install rclone  # Fedora

If NO remotes are configured

Walk the user through interactive configuration:

rclone config

Common provider setup quick reference:

Provider            | Type    | Key Settings
AWS S3              | s3      | access_key_id, secret_access_key, region
Cloudflare R2       | s3      | access_key_id, secret_access_key, endpoint (account_id.r2.cloudflarestorage.com)
Backblaze B2        | b2      | account (keyID), key (applicationKey)
DigitalOcean Spaces | s3      | access_key_id, secret_access_key, endpoint (region.digitaloceanspaces.com)
Google Drive        | drive   | OAuth flow (opens browser)
Dropbox             | dropbox | OAuth flow (opens browser)

Example: Configure Cloudflare R2

rclone config create r2 s3 \
  provider=Cloudflare \
  access_key_id=YOUR_ACCESS_KEY \
  secret_access_key=YOUR_SECRET_KEY \
  endpoint=ACCOUNT_ID.r2.cloudflarestorage.com \
  acl=private

Example: Configure AWS S3

rclone config create aws s3 \
  provider=AWS \
  access_key_id=YOUR_ACCESS_KEY \
  secret_access_key=YOUR_SECRET_KEY \
  region=us-east-1
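Rather than replacing the placeholders above with real keys in a config file, rclone also reads options from RCLONE_CONFIG_<REMOTE>_<OPTION> environment variables, so credentials never touch rclone.conf. A sketch defining an ad-hoc remote named envs3 (the remote name and the AWS_* source variables are assumptions):

```shell
#!/bin/sh
# Define an "envs3" remote entirely through environment variables;
# rclone picks up RCLONE_CONFIG_ENVS3_* at runtime.
export RCLONE_CONFIG_ENVS3_TYPE=s3
export RCLONE_CONFIG_ENVS3_PROVIDER=AWS
export RCLONE_CONFIG_ENVS3_REGION=us-east-1
# Pull secrets from pre-existing environment variables instead of
# hard-coding them (AWS_* names are illustrative):
export RCLONE_CONFIG_ENVS3_ACCESS_KEY_ID="${AWS_ACCESS_KEY_ID:-}"
export RCLONE_CONFIG_ENVS3_SECRET_ACCESS_KEY="${AWS_SECRET_ACCESS_KEY:-}"

# The remote is now usable like any configured one, e.g.:
# rclone lsd envs3:
```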

Common Operations

Upload single file

rclone copy /path/to/file.mp4 remote:bucket/path/ --progress

Upload directory

rclone copy /path/to/folder remote:bucket/folder/ --progress

Sync directory (mirror, deletes removed files)

rclone sync /local/path remote:bucket/path/ --progress

List remote contents

rclone ls remote:bucket/
rclone lsd remote:bucket/  # directories only

Check what would be transferred (dry run)

rclone copy /path remote:bucket/ --dry-run
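Because sync deletes files on the destination, the dry run can be folded into a guard so a destructive sync never runs unconfirmed. A sketch (safe_sync and its --yes convention are our own, not an rclone feature):

```shell
#!/bin/sh
# safe_sync SRC DST [--yes]: always preview with --dry-run first;
# only perform the real (deleting) sync if --yes was passed.
safe_sync() {
  src=$1; dst=$2; confirm=${3-}
  echo "Previewing: $src -> $dst"
  rclone sync "$src" "$dst" --dry-run || return 1
  if [ "$confirm" = "--yes" ]; then
    rclone sync "$src" "$dst" --progress
  else
    echo "Re-run with --yes to apply"
  fi
}
```

Usage: safe_sync /local/path remote:bucket/path/ to preview, then append --yes to apply.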

Useful Flags

Flag                | Purpose
--progress          | Show transfer progress
--dry-run           | Preview without transferring
-v                  | Verbose output
--transfers=N       | Parallel transfers (default 4)
--bwlimit=RATE      | Bandwidth limit (e.g., 10M)
--checksum          | Compare by checksum, not size/time
--exclude="*.tmp"   | Exclude matching files
--include="*.mp4"   | Include only matching files
--min-size=SIZE     | Skip files smaller than SIZE
--max-size=SIZE     | Skip files larger than SIZE
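These flags combine naturally into task-specific wrappers. A sketch of a media-only upload helper (the name upload_media and the specific formats, size floor, bandwidth cap, and transfer count are illustrative choices, not recommendations):

```shell
#!/bin/sh
# upload_media SRC DST: copy only common video formats, skipping
# tiny files, with a bandwidth cap and extra parallelism.
upload_media() {
  rclone copy "$1" "$2" \
    --include "*.mp4" --include "*.mov" \
    --min-size 1M \
    --bwlimit 10M \
    --transfers 8 \
    --progress
}
```

Usage: upload_media /videos remote:bucket/media/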

Large File Uploads

For videos and large files, use chunked uploads:

# S3 switches to multipart automatically above the upload cutoff
# (200 MiB by default); --s3-chunk-size tunes the part size
rclone copy large_video.mp4 remote:bucket/ --s3-chunk-size=64M --progress

# Resume interrupted transfers
rclone copy /path remote:bucket/ --progress --retries=5

Verify Upload

# Check file exists and matches
rclone check /local/file remote:bucket/file

# Get file info
rclone lsl remote:bucket/path/to/file
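A quick byte-count comparison can complement rclone check. A sketch (verify_size is our own helper; it compares sizes only, not checksums, so it is a weaker guarantee than --checksum):

```shell
#!/bin/sh
# verify_size LOCAL REMOTE: compare the local file's size in bytes
# with the size reported by `rclone lsl` for the remote object.
verify_size() {
  local_file=$1; remote_path=$2
  local_size=$(wc -c < "$local_file" | tr -d ' ')
  # lsl prints: <size> <modtime> <name>; take the first field
  remote_size=$(rclone lsl "$remote_path" | awk '{print $1; exit}')
  if [ "$local_size" = "$remote_size" ]; then
    echo "OK: $local_size bytes"
  else
    echo "MISMATCH: local=$local_size remote=$remote_size"
    return 1
  fi
}
```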

Troubleshooting

# Test connection
rclone lsd remote:

# Debug connection issues
rclone lsd remote: -vv

# Check config
rclone config show remote

Source

https://github.com/everyinc/compound-engineering-plugin/blob/main/plugins/compound-engineering/skills/rclone/SKILL.md

Overview

Rclone uploads, syncs, and manages files across cloud storage providers such as S3, Cloudflare R2, Backblaze B2, Google Drive, and Dropbox. It streamlines moving media and documents to remote storage and backing up local data to the cloud. Trigger phrases include "upload to S3", "sync to cloud", and other file-transfer requests.

How This Skill Works

Install and configure rclone, verify remotes, and then perform transfers with rclone copy for uploads or rclone sync for mirroring. Use flags like --progress and --dry-run to monitor and validate operations, and leverage chunked uploads for large files.

When to Use It

  • Uploading files such as images, videos, or documents to S3 or any S3-compatible remote storage
  • Syncing a local directory to a remote bucket as a mirror or backup
  • Configuring AWS S3, Cloudflare R2, Backblaze B2, Google Drive, or Dropbox remotes and validating connectivity
  • Transferring large files with multipart/chunked S3 uploads
  • Previewing transfers with --dry-run before performing real transfers

Quick Start

  1. Verify rclone is installed and remotes exist (command -v rclone; rclone listremotes)
  2. Configure remotes if needed using rclone config to add providers (AWS S3, Cloudflare R2, etc.)
  3. Run a sample transfer, e.g. rclone copy /path/to/file.mp4 remote:bucket/path/ --progress

Best Practices

  • Verify rclone is installed and remotes are configured before transferring
  • Use --progress to monitor transfers and --dry-run to preview actions
  • For large files, tune --s3-chunk-size and use --retries for resilience
  • Use checksum-based verification with --checksum or rclone check after transfers
  • Name remotes clearly and store credentials securely rather than hard-coding them

Example Use Cases

  • Upload a video to an S3 bucket using: rclone copy /path/to/file.mp4 remote:bucket/path/ --progress
  • Upload a directory to a remote bucket using: rclone copy /path/to/folder remote:bucket/folder/ --progress
  • Mirror a local folder to remote with: rclone sync /local/path remote:bucket/path/ --progress
  • List contents of the remote bucket with: rclone ls remote:bucket/
  • Dry-run a transfer to preview changes with: rclone copy /path remote:bucket/ --dry-run
