# azure-storage

Install the skill:

```shell
npx machina-cli add skill microsoft/GitHub-Copilot-for-Azure/azure-storage --openclaw
```

## Azure Storage Services
| Service | Use When | MCP Tools | CLI |
|---|---|---|---|
| Blob Storage | Objects, files, backups, static content | azure__storage | az storage blob |
| File Shares | SMB file shares, lift-and-shift | - | az storage file |
| Queue Storage | Async messaging, task queues | - | az storage queue |
| Table Storage | NoSQL key-value (consider Cosmos DB) | - | az storage table |
| Data Lake | Big data analytics, hierarchical namespace | - | az storage fs |
## MCP Server (Preferred)

When Azure MCP is enabled, use the `azure__storage` tool with these commands:

- `storage_account_list` - List storage accounts
- `storage_container_list` - List containers in an account
- `storage_blob_list` - List blobs in a container
- `storage_blob_get` - Download blob content
- `storage_blob_put` - Upload blob content

If Azure MCP is not enabled, run `/azure:setup` or enable it via `/mcp`.
## CLI Fallback

```shell
# List storage accounts
az storage account list --output table

# List containers
az storage container list --account-name ACCOUNT --output table

# List blobs
az storage blob list --account-name ACCOUNT --container-name CONTAINER --output table

# Download a blob
az storage blob download --account-name ACCOUNT --container-name CONTAINER --name BLOB --file LOCAL_PATH

# Upload a blob
az storage blob upload --account-name ACCOUNT --container-name CONTAINER --name BLOB --file LOCAL_PATH
```
## Storage Account Tiers
| Tier | Use Case | Performance |
|---|---|---|
| Standard | General purpose, backup | Milliseconds |
| Premium | Databases, high IOPS | Sub-millisecond |
## Blob Access Tiers
| Tier | Access Frequency | Cost |
|---|---|---|
| Hot | Frequent | Higher storage, lower access |
| Cool | Infrequent (30+ days) | Lower storage, higher access |
| Cold | Rare (90+ days) | Lower storage than Cool, higher access |
| Archive | Rarely (180+ days) | Lowest storage, rehydration required |
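Tier transitions can be automated with a lifecycle management policy attached to the storage account (applied, for example, with `az storage account management-policy create --policy @policy.json`). The sketch below builds a policy document mirroring the thresholds in the table above; the rule name is illustrative, and `tierToCold` assumes your account supports the cold tier.

```python
import json

# Sketch of a lifecycle management policy: move block blobs to Cool after
# 30 days, Cold after 90, and Archive after 180 days without modification.
# "age-out-blobs" is a hypothetical rule name.
policy = {
    "rules": [
        {
            "enabled": True,
            "name": "age-out-blobs",
            "type": "Lifecycle",
            "definition": {
                "filters": {"blobTypes": ["blockBlob"]},
                "actions": {
                    "baseBlob": {
                        "tierToCool": {"daysAfterModificationGreaterThan": 30},
                        "tierToCold": {"daysAfterModificationGreaterThan": 90},
                        "tierToArchive": {"daysAfterModificationGreaterThan": 180},
                    }
                },
            },
        }
    ]
}

# Write the document so it can be passed to the CLI as @policy.json.
with open("policy.json", "w") as f:
    json.dump(policy, f, indent=2)
```

Lifecycle rules evaluate blobs roughly once a day, so transitions are not instantaneous; plan thresholds around access patterns, not exact deadlines.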
## Redundancy Options
| Type | Durability | Use Case |
|---|---|---|
| LRS | 11 nines | Dev/test, recreatable data |
| ZRS | 12 nines | Regional high availability |
| GRS | 16 nines | Disaster recovery |
| GZRS | 16 nines | Zone redundancy plus geo disaster recovery |
## Service Details
For deep documentation on specific services:
- Blob storage patterns and lifecycle -> Blob Storage documentation
- File shares and Azure File Sync -> Azure Files documentation
- Queue patterns and poison handling -> Queue Storage documentation
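The poison-handling pattern mentioned above can be sketched without touching Azure at all: the real service tracks a dequeue count per message, and consumers park messages that keep failing. This sketch uses plain Python lists as a stand-in queue; `MAX_DEQUEUE_COUNT` and all names are illustrative assumptions.

```python
MAX_DEQUEUE_COUNT = 5  # assumed retry limit before a message is parked


def drain(queue, handler, poison_queue):
    """Process every message; move repeat failures to the poison queue."""
    while queue:
        msg = queue.pop(0)
        msg["dequeue_count"] = msg.get("dequeue_count", 0) + 1
        try:
            handler(msg["body"])
        except Exception:
            if msg["dequeue_count"] >= MAX_DEQUEUE_COUNT:
                poison_queue.append(msg)  # park for manual inspection
            else:
                queue.append(msg)  # requeue for another attempt


def handler(body):
    if body == "bad":
        raise ValueError("cannot process")


work = [{"body": "good"}, {"body": "bad"}]
poison = []
drain(work, handler, poison)
# "bad" lands in the poison queue after MAX_DEQUEUE_COUNT failed attempts
```

With the real service, the same shape applies: read a message's dequeue count, and after the retry limit copy it to a dedicated poison queue and delete the original so it stops blocking the workers.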
## SDK Quick References
For building applications with Azure Storage SDKs, see the condensed guides:
- Blob Storage: Python | TypeScript | Java | Rust
- Queue Storage: Python | TypeScript
- File Shares: Python | TypeScript
- Data Lake: Python
- Tables: Python | Java
For full package listing across all languages, see SDK Usage Guide.
## Azure SDKs
For building applications that interact with Azure Storage programmatically, Azure provides SDK packages in multiple languages (.NET, Java, JavaScript, Python, Go, Rust). See SDK Usage Guide for package names, installation commands, and quick start examples.
## Source

https://github.com/microsoft/GitHub-Copilot-for-Azure/blob/main/plugin/skills/azure-storage/SKILL.md

## Overview
Azure Storage offers Blob Storage, File Shares, Queue Storage, Table Storage, and Data Lake for scalable object storage, file sharing, asynchronous messaging, NoSQL data, and big data analytics. It supports access tiers (hot, cool, archive) and lifecycle rules to optimize cost and performance. This skill covers uploading and downloading blobs, managing storage accounts, and data lake workloads.
## How This Skill Works
Azure Storage runs inside storage accounts that host multiple services: blobs, files, queues, tables, and Data Lake. Developers interact via REST, SDKs, or the az CLI; you select a service, a container/share, and perform operations such as put/get/upload/download. Data can be stored in different access tiers and redundancy configurations (LRS, ZRS, GRS, GZRS) to balance cost, durability, and availability.
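The account/container/blob hierarchy shows up directly in how resources are addressed: every blob has an HTTPS endpoint under its account's domain. A minimal sketch (the account, container, and blob names are placeholders):

```python
from urllib.parse import quote


def blob_url(account: str, container: str, blob: str) -> str:
    """Build the public HTTPS endpoint for a blob, the form REST and the SDKs use."""
    # Blob names may contain "/" (virtual directories), so keep it unescaped.
    return f"https://{account}.blob.core.windows.net/{container}/{quote(blob)}"


print(blob_url("mystorageacct", "images", "products/chair.png"))
# https://mystorageacct.blob.core.windows.net/images/products/chair.png
```

File Shares, Queues, Tables, and Data Lake follow the same scheme under `file.`, `queue.`, `table.`, and `dfs.` subdomains of `core.windows.net`.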
## When to Use It
- Store unstructured data (images, videos, backups) in Blob Storage for scalable access and delivery.
- Provide SMB file shares for lift-and-shift workloads using File Shares.
- Build decoupled, asynchronous workflows with Queue Storage for task queues.
- Implement NoSQL key-value storage with Table Storage, or analyze data with Data Lake for big data workloads.
- Apply lifecycle rules and access tiers to optimize costs and data retention (hot/cool/archive).
## Quick Start
- Step 1: If Azure MCP is enabled, run `azure__storage` with `storage_account_list` to list storage accounts; if not, run `/azure:setup` or enable it via `/mcp`.
- Step 2: List containers in an account with `storage_container_list`, or `az storage container list --account-name ACCOUNT --output table`.
- Step 3: Upload a blob with `storage_blob_put` or `az storage blob upload --account-name ACCOUNT --container-name CONTAINER --name BLOB --file LOCAL_PATH`; download with `storage_blob_get` or `az storage blob download`.
## Best Practices
- Choose the right Storage Account tier (Standard vs Premium) based on latency and IOPS needs.
- Use blob access tiers (Hot, Cool, Archive) and lifecycle management to automatically move data over time.
- Select appropriate redundancy (LRS, ZRS, GRS, GZRS) based on disaster recovery requirements.
- Leverage Data Lake storage when hierarchical namespaces are beneficial for analytics.
- Automate common tasks with MCP tools (azure__storage commands) or the az CLI for repeatable workflows.
## Example Use Cases
- Upload product images to Blob Storage and serve them via a CDN for a retail site.
- Store large telemetry datasets in Data Lake for batch and interactive analytics.
- Create SMB file shares for a development team to collaboratively edit project files.
- Queue background processing tasks using Queue Storage to decouple components.
- Archive old data to the Archive tier and restore when needed for compliance.