
azure-storage

npx machina-cli add skill microsoft/GitHub-Copilot-for-Azure/azure-storage --openclaw

Azure Storage Services

Services

| Service | Use When | MCP Tools | CLI |
|---|---|---|---|
| Blob Storage | Objects, files, backups, static content | azure__storage | az storage blob |
| File Shares | SMB file shares, lift-and-shift | - | az storage file |
| Queue Storage | Async messaging, task queues | - | az storage queue |
| Table Storage | NoSQL key-value (consider Cosmos DB) | - | az storage table |
| Data Lake | Big data analytics, hierarchical namespace | - | az storage fs |

MCP Server (Preferred)

When Azure MCP is enabled:

  • azure__storage with command storage_account_list - List storage accounts
  • azure__storage with command storage_container_list - List containers in account
  • azure__storage with command storage_blob_list - List blobs in container
  • azure__storage with command storage_blob_get - Download blob content
  • azure__storage with command storage_blob_put - Upload blob content

If Azure MCP is not enabled: Run /azure:setup or enable via /mcp.

CLI Fallback

```shell
# List storage accounts
az storage account list --output table

# List containers
az storage container list --account-name ACCOUNT --output table

# List blobs
az storage blob list --account-name ACCOUNT --container-name CONTAINER --output table

# Download blob
az storage blob download --account-name ACCOUNT --container-name CONTAINER --name BLOB --file LOCAL_PATH

# Upload blob
az storage blob upload --account-name ACCOUNT --container-name CONTAINER --name BLOB --file LOCAL_PATH
```
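The upload and download commands above can be combined into a small round-trip script. This is a sketch: the account and container names are placeholders, and it assumes you are already logged in (`az login`) and authorized via `--auth-mode login` (Azure AD identity rather than account keys).

```shell
#!/bin/sh
# Round-trip sketch: upload a local file as a blob, then download it back.
set -eu
ACCOUNT="mystorageacct"    # placeholder account name
CONTAINER="mycontainer"    # placeholder container name

printf 'hello azure\n' > local.txt   # sample payload to upload

# Skip the cloud calls where the az CLI is not installed.
if command -v az >/dev/null 2>&1; then
  az storage blob upload --auth-mode login \
    --account-name "$ACCOUNT" --container-name "$CONTAINER" \
    --name hello.txt --file local.txt --overwrite

  az storage blob download --auth-mode login \
    --account-name "$ACCOUNT" --container-name "$CONTAINER" \
    --name hello.txt --file roundtrip.txt
fi
```

`--overwrite` makes the upload idempotent, so rerunning the script does not fail on an existing blob.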

Storage Account Tiers

| Tier | Use Case | Performance |
|---|---|---|
| Standard | General purpose, backup | Milliseconds |
| Premium | Databases, high IOPS | Sub-millisecond |

Blob Access Tiers

| Tier | Access Frequency | Cost |
|---|---|---|
| Hot | Frequent | Higher storage, lower access |
| Cool | Infrequent (30+ days) | Lower storage, higher access |
| Cold | Rare (90+ days) | Lower still |
| Archive | Rarely (180+ days) | Lowest storage, rehydration required |
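Tier moves can be scripted with `az storage blob set-tier`. The names below are placeholders; note that a blob moved to Archive must be rehydrated (set back to Hot or Cool) before its content can be read again.

```shell
# Demote an infrequently accessed block blob to the Cool tier.
# Account, container, and blob names are placeholders.
TIER="Cool"   # one of Hot, Cool, Cold, Archive
if command -v az >/dev/null 2>&1; then
  az storage blob set-tier --auth-mode login \
    --account-name mystorageacct --container-name mycontainer \
    --name old-report.pdf --tier "$TIER"
fi
```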

Redundancy Options

| Type | Durability | Use Case |
|---|---|---|
| LRS | 11 nines | Dev/test, recreatable data |
| ZRS | 12 nines | Regional high availability |
| GRS | 16 nines | Disaster recovery |
| GZRS | 16 nines | Best durability |
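Redundancy is chosen at account creation time via the `--sku` flag. A minimal sketch, assuming a resource group named `my-rg` already exists (all names and the region are placeholders):

```shell
# Create a general-purpose v2 account with zone + geo redundancy (GZRS).
SKU="Standard_GZRS"   # or Standard_LRS, Standard_ZRS, Standard_GRS, ...
if command -v az >/dev/null 2>&1; then
  az storage account create \
    --name mystorageacct --resource-group my-rg \
    --location eastus --kind StorageV2 --sku "$SKU"
fi
```

The SKU can later be changed between some redundancy types with `az storage account update --sku`, though certain conversions (e.g. into ZRS) require a migration.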

Service Details

For deep documentation on specific services:

SDK Quick References

For building applications with Azure Storage SDKs, see the condensed guides:

For full package listing across all languages, see SDK Usage Guide.

Azure SDKs

For building applications that interact with Azure Storage programmatically, Azure provides SDK packages in multiple languages (.NET, Java, JavaScript, Python, Go, Rust). See SDK Usage Guide for package names, installation commands, and quick start examples.

Source

https://github.com/microsoft/GitHub-Copilot-for-Azure/blob/main/plugin/skills/azure-storage/SKILL.md

Overview

Azure Storage offers Blob Storage, File Shares, Queue Storage, Table Storage, and Data Lake for scalable object storage, file sharing, asynchronous messaging, NoSQL data, and big data analytics. It supports access tiers (hot, cool, archive) and lifecycle rules to optimize cost and performance. This skill covers uploading and downloading blobs, managing storage accounts, and data lake workloads.

How This Skill Works

Azure Storage runs inside storage accounts that host multiple services: blobs, files, queues, tables, and Data Lake. Developers interact via REST, SDKs, or the az CLI; you select a service, a container/share, and perform operations such as put/get/upload/download. Data can be stored in different access tiers and redundancy configurations (LRS, ZRS, GRS, GZRS) to balance cost, durability, and availability.

When to Use It

  • Store unstructured data (images, videos, backups) in Blob Storage for scalable access and delivery.
  • Provide SMB file shares for lift-and-shift workloads using File Shares.
  • Build decoupled, asynchronous workflows with Queue Storage for task queues.
  • Implement NoSQL key-value storage with Table Storage, or analyze data with Data Lake for big data workloads.
  • Apply lifecycle rules and access tiers to optimize costs and data retention (hot/cool/archive).
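Lifecycle rules like the ones described above are defined as a JSON policy attached to the storage account. The sketch below assumes placeholder account and resource group names; the thresholds (30/90/365 days) are illustrative.

```shell
# Lifecycle rule: cool after 30 days, archive after 90, delete after 365.
cat > policy.json <<'EOF'
{
  "rules": [
    {
      "enabled": true,
      "name": "age-out-blobs",
      "type": "Lifecycle",
      "definition": {
        "actions": {
          "baseBlob": {
            "tierToCool":    { "daysAfterModificationGreaterThan": 30 },
            "tierToArchive": { "daysAfterModificationGreaterThan": 90 },
            "delete":        { "daysAfterModificationGreaterThan": 365 }
          }
        },
        "filters": { "blobTypes": ["blockBlob"] }
      }
    }
  ]
}
EOF

# Attach the policy to a storage account (placeholder names).
if command -v az >/dev/null 2>&1; then
  az storage account management-policy create \
    --account-name mystorageacct --resource-group my-rg \
    --policy @policy.json
fi
```

Once attached, the platform evaluates the rules roughly once a day; no per-blob scripting is needed.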

Quick Start

  1. If Azure MCP is enabled, run azure__storage with command storage_account_list to list storage accounts; if not, run /azure:setup or enable via /mcp.
  2. List containers in an account: azure__storage with storage_container_list, or az storage container list --account-name ACCOUNT --output table.
  3. Upload or download a blob: azure__storage with storage_blob_put, or az storage blob upload --account-name ACCOUNT --container-name CONTAINER --name BLOB --file LOCAL_PATH; download with storage_blob_get or az storage blob download.

Best Practices

  • Choose the right Storage Account tier (Standard vs Premium) based on latency and IOPS needs.
  • Use blob access tiers (Hot, Cool, Archive) and lifecycle management to automatically move data over time.
  • Select appropriate redundancy (LRS, ZRS, GRS, GZRS) based on disaster recovery requirements.
  • Leverage Data Lake storage when hierarchical namespaces are beneficial for analytics.
  • Automate common tasks with MCP tools (azure__storage commands) or the az CLI for repeatable workflows.

Example Use Cases

  • Upload product images to Blob Storage and serve them via a CDN for a retail site.
  • Store large telemetry datasets in Data Lake for batch and interactive analytics.
  • Create SMB file shares for a development team to collaboratively edit project files.
  • Queue background processing tasks using Queue Storage to decouple components.
  • Archive old data to the Archive tier and restore when needed for compliance.

