Using AzureRM and Rubrik PowerShell Modules to Consume Azure Blob Storage

It’s no secret that I enjoy tinkering around with PowerShell to automate “all the things” and generally make life easier for those in the community. During the 5th Annual PowerShell and DevOps Global Summit (that’s a mouthful, eh?) earlier this year, I was introduced to the AzureRM module for PowerShell and knew that I wanted to fire it up and begin learning Azure at a deeper level. And since Rubrik’s Cloud Data Management platform has supported Azure blob storage as an archive target for some time now, it seemed like the most logical place to start.

Yo dawg, I heard you liked PowerShell …

In this post, I’ll cover using the Microsoft Azure PowerShell modules to authenticate to Azure; create a resource group, storage account, and storage container; and connect the newly created container to Rubrik for use as an archive location.

Installation and Authentication

The GitHub page for Azure PowerShell has all of the details for installation. I ended up using Install-Module -Name AzureRM -Scope CurrentUser to deploy the modules into my OneDrive folder from the PowerShell Gallery. For me, that location is C:\Users\chris\OneDrive\Documents\WindowsPowerShell\Modules. This is a simple way to sync modules across devices.

PS> Get-Module -ListAvailable

    Directory: C:\Users\chris\OneDrive\Documents\WindowsPowerShell\Modules

ModuleType Version    Name                                ExportedCommands
---------- -------    ----                                ----------------
Script     0.2.0      Azure.AnalysisServices              {Add-AzureAnalysisServicesAccount, Restart-AzureAnalysisSe...
Script     2.8.0      Azure.Storage                       {Get-AzureStorageTable, New-AzureStorageTableSASToken, New...
Script     3.8.0      AzureRM                             {Update-AzureRM, Import-AzureRM, Uninstall-AzureRM, Instal...
Script     0.2.0      AzureRM.AnalysisServices            {Resume-AzureRmAnalysisServicesServer, Suspend-AzureRmAnal...
Script     3.6.0      AzureRM.ApiManagement               {Add-AzureRmApiManagementRegion, Get-AzureRmApiManagementS...
Script     2.8.0      AzureRM.Automation                  {Get-AzureRMAutomationHybridWorkerGroup, Get-AzureRmAutoma...
Script     2.8.0      AzureRM.Backup                      {Backup-AzureRmBackupItem, Enable-AzureRmBackupContainerRe...
Script     2.8.0      AzureRM.Batch                       {Remove-AzureRmBatchAccount, Get-AzureRmBatchAccount, Get-...
Script     0.11.0     AzureRM.Billing                     Get-AzureRmBillingInvoice
Script     2.8.0      AzureRM.Cdn                         {Get-AzureRmCdnProfile, Get-AzureRmCdnProfileSsoUrl, New-A...
Script     0.6.0      AzureRM.CognitiveServices           {Get-AzureRmCognitiveServicesAccount, Get-AzureRmCognitive...
Script     2.9.0      AzureRM.Compute                     {Remove-AzureRmAvailabilitySet, Get-AzureRmAvailabilitySet...
Script     2.8.0      AzureRM.DataFactories               {Remove-AzureRmDataFactory, Get-AzureRmDataFactoryRun, Get...
Script     2.8.0      AzureRM.DataLakeAnalytics           {Get-AzureRmDataLakeAnalyticsDataSource, New-AzureRmDataLa...
Script     3.6.0      AzureRM.DataLakeStore               {Get-AzureRmDataLakeStoreTrustedIdProvider, Remove-AzureRm...
Script     2.8.0      AzureRM.DevTestLabs                 {Get-AzureRmDtlAllowedVMSizesPolicy, Get-AzureRmDtlAutoShu...
Script     2.8.0      AzureRM.Dns                         {Get-AzureRmDnsRecordSet, New-AzureRmDnsRecordConfig, Remo...
Script     0.2.0      AzureRM.EventHub                    {New-AzureRmEventHubKey, Get-AzureRmEventHubNamespace, Get...
Script     2.8.0      AzureRM.HDInsight                   {Get-AzureRmHDInsightJob, New-AzureRmHDInsightSqoopJobDefi...
Script     2.8.0      AzureRM.Insights                    {Get-AzureRmUsage, Get-AzureRmMetricDefinition, Get-AzureR...
Script     1.4.0      AzureRM.IotHub                      {Add-AzureRmIotHubKey, Get-AzureRmIotHubEventHubConsumerGr...
Script     2.8.0      AzureRM.KeyVault                    {Add-AzureKeyVaultCertificate, Set-AzureKeyVaultCertificat...

Once the modules are installed – and there are quite a few of them – it’s time to authenticate. The basic command is simply Add-AzureRmAccount which has an alias of Login-AzureRmAccount if you prefer. I also suggest using the SubscriptionName parameter if you have multiple subscriptions beyond Pay-As-You-Go.

$subscriptionName = 'Visual Studio Enterprise'

Add-AzureRmAccount -SubscriptionName $subscriptionName

This brings up an interactive login prompt that is only needed once for the session. To see connection details you’ll need to pull up the current context using Get-AzureRmContext. You can then see the account and subscription details. In my case, I’m using my monthly Visual Studio Enterprise credits.

PS> Get-AzureRmContext

Environment           : AzureCloud
Account               : [email protected]
TenantId              : 1234567890
SubscriptionId        : abcdefg123
SubscriptionName      : Visual Studio Enterprise
CurrentStorageAccount :

For scripting purposes I’ve created a simple try/catch logic statement that uses Get-AzureRmContext to determine if you’re already connected. If that fails, the catch portion will execute and ask for login credentials. Details on the intended subscription are then pulled. This ensures that the correct context and subscription are selected when performing work in Azure. Otherwise, I might find myself owing actual money instead of using credits. 🙂

try {
  $subscriptionDetail = Get-AzureRmSubscription -SubscriptionName $subscriptionName -ErrorAction Stop
} catch {
  if ($_.Exception -match 'Run Login-AzureRmAccount to login') {
    Write-Warning -Message 'No session detected. Prompting for login.'
    $subscriptionDetail = Add-AzureRmAccount -SubscriptionName $subscriptionName -ErrorAction Stop
  } else {
    throw $_
  }
}

At this point I have a valid connection to Azure and am ready to start building.

Creating a Resource Group

Before building anything in Azure I need to make sure that I have a resource group. This is a high-level hierarchy item that logically groups together objects such as virtual machines, network interfaces, storage accounts, and so forth. Making one is really simple: you just supply a name and a location.

$resourceGroup = 'wahlresgroup'
$resourceGroupLocation = 'westus'

New-AzureRmResourceGroup -Name $resourceGroup -Location $resourceGroupLocation

And … that’s it. A new resource group exists. Because I live in California, I chose the West US region which is expressed as westus.
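If you're not sure which region string to use, the module itself can tell you. Here's a quick sketch using Get-AzureRmLocation to list every region name available to the subscription (this assumes you've already authenticated as shown above):

```powershell
# List the location strings accepted by the -Location parameter,
# alongside their friendly display names (e.g. westus / West US)
Get-AzureRmLocation | Select-Object -Property Location, DisplayName
```

The Location column is the value you feed into New-AzureRmResourceGroup and the other cmdlets below.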

For scripting purposes, I like to first check to see if the resource group exists before making a new one. Here’s an example:

try {
  $resourceGroupDetail = Get-AzureRmResourceGroup -Name $resourceGroup -ErrorAction Stop
} catch {
  if ($_.Exception -match 'Provided resource group does not exist') {
    Write-Warning -Message "Provided resource group does not exist. Creating $resourceGroup in $resourceGroupLocation."
    $resourceGroupDetail = New-AzureRmResourceGroup -Name $resourceGroup -Location $resourceGroupLocation -ErrorAction Stop
  } else {
    throw $_
  }
}

This makes sure that $resourceGroupDetail is populated with information whether a new or an existing resource group is used. And if the error doesn’t match the generic “does not exist” sort of message, the script halts and throws the error. Pretty? Not really. But it works.

Creating a Storage Account

The storage account is used to control the containers – or “buckets,” if you will – including the storage type, tier, resiliency, and other factors. Creating a storage account depends on having a resource group – because everything has to live within a resource group – which is why the resource group was created first.

Making a storage account is more complex than a resource group. There are several optional parameters. Here’s one example:

$storageAccount = 'wahlstorageaccount'
$storageKind = 'BlobStorage'
$storageTier = 'Hot'
$storageSkuName = 'Standard_RAGRS'

New-AzureRmStorageAccount -ResourceGroupName $resourceGroup -Name $storageAccount -Kind $storageKind -AccessTier $storageTier -SkuName $storageSkuName -Location $resourceGroupLocation

While several of the parameters are user defined, there are three – the kind, tier, and SKU for the storage – that are required by Rubrik. This results in a storage account that holds blob storage containers (instead of block or file) using the “hot” tier (data is accessed frequently because Rubrik is managing it) in the standard read-access geo-redundant storage format (data is replicated to a paired secondary region and remains readable there for high availability). The remaining variables are based on whatever you wish to name things.

At this point we have a Resource Group with one Storage Account as a member

The full script segment looks like this:

try {
  $storageAccountDetail = Get-AzureRmStorageAccount -ResourceGroupName $resourceGroup -Name $storageAccount -ErrorAction Stop
} catch {
  if ($_.Exception -match 'was not found') {
    Write-Warning -Message "Provided storage account does not exist. Creating $storageAccount."
    $storageAccountDetail = New-AzureRmStorageAccount -ResourceGroupName $resourceGroup -Name $storageAccount -Kind $storageKind -AccessTier $storageTier -SkuName $storageSkuName -Location $resourceGroupLocation -ErrorAction Stop
  } else {
    throw $_
  }
}
$storageAccountKey = ($storageAccountDetail | Get-AzureRmStorageAccountKey -ErrorAction Stop)[0].Value

Note that there’s also a bit of code that snags one of the storage account keys and saves it to $storageAccountKey. This is because any new storage account is assigned two keys to be rotated at will. I snag the first key to be used by Rubrik to access the storage account. This could be altered if desired.
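Since both keys can be rotated at will, it’s worth knowing how to cycle them later on. Here’s a sketch – assuming the same $resourceGroup and $storageAccount variables from above – that lists both keys and regenerates the one not currently in use:

```powershell
# Show key1 and key2 for the storage account
Get-AzureRmStorageAccountKey -ResourceGroupName $resourceGroup -Name $storageAccount

# Regenerate key2 while Rubrik keeps using key1; swap roles on the next rotation
New-AzureRmStorageAccountKey -ResourceGroupName $resourceGroup -Name $storageAccount -KeyName key2
```

If you rotate the key that Rubrik is actively using, remember to update the archive location’s credentials afterward, or archive jobs will start failing authentication.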

Creating a Storage Container

The final step in Azure is to build a storage container. It is the last piece of the cloudy puzzle – huzzah!

To do this, we’ll need a private container created in the storage account’s current context. Using the New-AzureStorageContainer cmdlet with the permission and context parameters does the trick. Note that $storageAccountDetail was populated in the previous section and is being re-used here to provide the contextual details on the storage account in one parameter. Handy, yes?

$storageContainer = 'wahlcontainer'

New-AzureStorageContainer -Name $storageContainer -Permission Off -Context $storageAccountDetail.Context

A ready-to-use Storage Container

For scripting purposes, I have used a try/catch segment to … well, you know the drill by now, right? 🙂

try {
  $storageContainerDetail = Get-AzureStorageContainer -Context $storageAccountDetail.Context -Name $storageContainer -ErrorAction Stop
} catch {
  if ($_.Exception -match 'Can not find the container') {
    Write-Warning -Message "Provided storage container does not exist. Creating $storageContainer."
    $storageContainerDetail = New-AzureStorageContainer -Name $storageContainer -Permission Off -Context $storageAccountDetail.Context -ErrorAction Stop
  } else {
    throw $_
  }
}

We now have validated that the required resource group, storage account, and storage container exist and are ready to be plugged into Rubrik as an archive location.

Adding an Archive Location to Rubrik

The only step needed for Rubrik is to take details from the Azure pieces and feed them into the Rubrik RESTful API. To do this, I’m leveraging the Rubrik PowerShell module to establish a connection to the distributed cluster and then making one API call.

There is a little bit of pre-work required. As detailed in the user guide, you’ll need to generate your own 2048-bit RSA key for encrypting the data. This way you own the encryption key and no one else has any idea what it is. Being the lazy person that I am, I just rely on a tiny Ubuntu VM running in Azure to execute this openssl command:

openssl genrsa -out rubrik_encryption_key.pem 2048

I then save that key to a safe location and reference it in the script. I use Test-Path to ensure that I didn’t fat finger the path and Out-String to concatenate the file contents into a single string.

$rsaPrivateKeyPath = 'C:\Secure\rubrik_encryption_key.pem'
if (Test-Path -Path $rsaPrivateKeyPath) {
  $rsaPrivateKeyDetail = Get-Content -Path $rsaPrivateKeyPath | Out-String
} else {
  throw 'Invalid RSA private key path entered.'
}

With that out of the way, we can ask Rubrik’s distributed task scheduler to attach the new Azure blob storage as an archive location. The accessKey, name, bucket, and secretKey values are all derived from variables populated during the earlier Azure segments, while the pemFileContent value comes from the code snippet directly above. The objectStoreType is always going to be Azure for this particular script.

$rubrikIp = ''

Connect-Rubrik -Server $rubrikIp

$body = @{
  accessKey       = $storageAccount
  name            = "Azure:$storageContainer"
  bucket          = $storageContainer
  objectStoreType = 'Azure'
  secretKey       = $storageAccountKey
  pemFileContent  = $rsaPrivateKeyDetail
}

$r = Invoke-WebRequest -Uri "https://$($rubrikConnection.server)/api/internal/data_location/cloud" -Method Post -Headers $rubrikConnection.header -Body ($body | ConvertTo-Json)

return ConvertFrom-Json $r.Content


While my original script was just a few lines to prove that this could be done, I ended up wrapping everything with try/catch code to make it easier to re-use existing resource group, storage account, and storage container details. I also like the idea of having some level of error handling. Here’s what it looks like when the entire script is run from start to finish.

PS> .\New-RubrikAzureContainer.ps1
WARNING: No session detected. Prompting for login.
WARNING: Provided resource group does not exist. Creating wahlresgroup in westus.
WARNING: Provided storage account does not exist. Creating wahlstorageaccount.
WARNING: Provided storage container does not exist. Creating wahlcontainer.
WARNING: You did not submit a username, password, or credentials.

Name                           Value
----                           -----
api                            v1
header                         {Authorization}
userId                         1234567890
time                           5/8/2017 13:56:27

jobInstanceId : ADD_ARCHIVAL_DATA_LOCATION_1234567890

The entire process took about 30 seconds. I then requested an on-demand backup of a small fileset just to validate that I had done everything correctly.

The results? 2397 files worth of user directory data that used 256.5 MB on the source has been stored in the Azure blob storage container using 27 MB of space. 🙂

I hope you’ve enjoyed this post and are able to start using the AzureRM modules for fun and profit!