Fun with Azure Automation and Table Service REST API

I love PowerShell and I really love to automate things! I recently started looking into leveraging Azure services for some automation tasks and discovered how powerful it could be. I also had a lot of fun doing it and wanted to share some of what I learned.

Azure Automation lets you run scripts on a schedule and is especially useful for any automation you might be doing with Office 365. Your code is stored in a Runbook (PowerShell or Python) and executed according to a schedule you define. Working with modules is a little different from working with your local PowerShell installation, but the module gallery makes it pretty simple. Getting started is easy; in this example we will be automating a report in Exchange Online. First you create an Automation Account:

Create a credential set for your Exchange Online credentials and remember what you call it ("TenantCreds" in my case).

Then create a new Runbook:

Next it's time to add some PowerShell to the Runbook. Since we will be working in Exchange Online, we need to create and import that session. This is similar to working with Exchange Online sessions on your local machine, but you will notice that we don't need to include the credentials in the code and simply reference the credential set we created earlier:

$UserCredential = Get-AutomationPSCredential -Name "TenantCreds"
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri "https://outlook.office365.com/powershell-liveid/" -Credential $UserCredential -Authentication Basic -AllowRedirection
$Commands = @("Get-MigrationBatch","Get-MigrationUser","Get-MigrationUserStatistics","Get-MoveRequestStatistics","Get-MoveRequest")
Import-PSSession -Session $Session -Prefix "EXO" -DisableNameChecking:$true -AllowClobber:$true -CommandName $Commands | Out-Null

I had some errors when trying to import all Exchange Online cmdlets, so I limit the import to only the cmdlets I intend to use in the script. I also add a prefix of "EXO" to them, so these cmdlets are used as follows:

$MigBatch = Get-EXOMigrationBatch | Where-Object {$_.Identity -like '*MyMigration*'} | foreach {$_.BatchGuid}

Lastly, we need to create a schedule for the automation job:

Once the schedule has been created, you can link it to the Runbook:

This is great if you need to perform tasks that don't generate any output, but what happens when something (e.g. a .CSV file) is generated? There are a couple of ways to deal with that. You could use the temp folder to store your data and then email it to yourself - remember, data stored in the temp folder will not persist between jobs:

$TmpPath = $env:TEMP
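For example, something like the following sketch could run at the end of a Runbook - the addresses, SMTP settings and the $Report object are placeholders, and it assumes the "TenantCreds" credential set created earlier:

```powershell
#Hypothetical sketch: write the output to the temp folder, then mail it to yourself
$UserCredential = Get-AutomationPSCredential -Name "TenantCreds"
$ReportPath = Join-Path $env:TEMP "MigrationReport.csv"
$Report | Export-Csv -Path $ReportPath -NoTypeInformation
Send-MailMessage -To "me@example.com" -From "automation@example.com" `
    -Subject "Migration report" -Attachments $ReportPath `
    -SmtpServer "smtp.office365.com" -Port 587 -UseSsl -Credential $UserCredential
```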

Another way to deal with this data is to write it to Azure Storage. There is a PowerShell module available for Azure Storage that can be used with Azure Automation, but you can also use the REST APIs directly. Since I figured out how to use the API, it has become my go-to method because it is actually much faster, and I have been able to use it in environments where it isn't possible to install modules.

The first thing we need to do is create a Storage Account in Azure:

We then create a Shared Access Signature (SAS) for that Storage Account:

The result should look similar to this:

In this example, we are going to store our script output in the Table Service, so we'll be using the Table Service REST API. When working with the Table Service it is important to understand tables, entities, system properties and other limitations, but for the purposes of this post I'm going to simplify things a little. Tables store data as collections of entities - entities are similar to rows and have a primary key and a set of properties. A property is similar to a column.

Each entity always has the following system properties:

  • PartitionKey
  • RowKey
  • Timestamp

Timestamp is managed automatically and isn't something you can change. The PartitionKey and RowKey are always required and are used for scalability and indexing of the content, so it is important to consider them when designing your table. In this example, I'll be looking up the migration status of a mailbox in Exchange Online and inserting this data into a table. I'm going to use the "BatchID" as the PartitionKey and the "Status" as the RowKey. The table name in the example will use the "Alias" of the mailbox.
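Concretely, with that design an entity stored in a mailbox's table would look something like this (values illustrative; the Timestamp is added by the service):

```json
{
  "PartitionKey": "Batch02",
  "RowKey": "Synced",
  "Timestamp": "2018-02-27T20:28:01.0000000Z",
  "Status": "Synced",
  "ItemsSkipped": "4"
}
```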

First, let's define the data we are going to insert. This could easily be generated in a script or Automation Runbook with a Foreach() loop, but to keep it simple I'm just going to define the values manually in the example:

$UserTable = "ZacTurner"
$PartitionKey = "Batch02"
$RowKey = "Synced"
$PrimaryEmailAddress = ""
$MbxGuid = "e31949b2-ebc6-4f57-b9ae-0aa8ae73bb2c"
$Batch = "Batch02"
$Status = "Synced"
$Skipped = "4"
$LastCheck = "2/27/2018 8:28:01 PM"
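In a real Runbook these values could come straight from the migration cmdlets imported earlier. A rough sketch - the property names (e.g. SkippedItemCount, LastSyncedTime) are assumptions based on the Get-MigrationUserStatistics output:

```powershell
#Sketch: derive the same values per user (cmdlets use the "EXO" prefix
#added by the earlier Import-PSSession)
ForEach ($User in (Get-EXOMigrationUser -BatchId $Batch)) {
    $Stats     = Get-EXOMigrationUserStatistics -Identity $User.Identity
    $UserTable = ($User.Identity -split "@")[0]   #Alias as the table name
    $Status    = $Stats.Status
    $Skipped   = $Stats.SkippedItemCount
    $LastCheck = $Stats.LastSyncedTime
    #...check for the table and insert the entity as described next
}
```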

Next we will insert this information. During the insert, we first check whether a unique table (named for the Alias) already exists. If it does, we insert the data; if it doesn't, we create the table first.

$AzureEndpoint = ''
$AzureSAS = "?sv=2017-07-29&ss=bfqt&srt=sco&sp=rwdlacup&se=2018-04-05T02:31:38Z&st=2018-02-27T19:31:38Z&spr=https&sig=<removed>"
$AzureRequestHeaders = @{
		"x-ms-date"=(Get-Date -Format r)
}
$UserURI = $AzureEndpoint + $UserTable + "/" + $AzureSAS
#Check if table already exists (Invoke-WebRequest throws on a 404, so catch it)
Try {
		$UserTableExists = (Invoke-WebRequest -Method GET -Uri $UserURI -Headers $AzureRequestHeaders).StatusCode
}
Catch {
		$UserTableExists = $_.Exception.Response.StatusCode.value__
}
If ($UserTableExists -ne "200"){
		#Create the table
		$TableRequestBody = ConvertTo-Json -InputObject @{"TableName"="$UserTable"}
		$EncodedTableRequestBody = [System.Text.Encoding]::UTF8.GetBytes($TableRequestBody)
		$TableURI = $AzureEndpoint + 'Tables/' + $AzureSAS
		Invoke-WebRequest -Method POST -Uri $TableURI -Headers $AzureRequestHeaders -Body $EncodedTableRequestBody -ContentType "application/json"
}
#Insert data
$AzureRequestBody = ConvertTo-Json -InputObject @{
		"PartitionKey"= "$PartitionKey";
		"RowKey"= "$RowKey";
		"PrimaryEmailAddress"= "$PrimaryEmailAddress";
		"MbxGuid"= "$MbxGuid";
		"BatchName"= "$Batch";
		"Status"= "$Status";
		"ItemsSkipped"= "$Skipped";
		"LastCheck"= "$LastCheck"}
$EncodedAzureRequestBody = [System.Text.Encoding]::UTF8.GetBytes($AzureRequestBody)
Invoke-WebRequest -Method POST -Uri $UserURI -Headers $AzureRequestHeaders -Body $EncodedAzureRequestBody -ContentType "application/json"

You could also use Invoke-RestMethod instead of Invoke-WebRequest. The resulting tables should look like this:
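As a quick illustration of the Invoke-RestMethod variant, the entities can be read back and the parsed JSON used directly. This sketch assumes the $UserURI from the example above, and adds an Accept header so the service returns JSON instead of Atom XML:

```powershell
#Sketch: query the table; Invoke-RestMethod parses the JSON response,
#so the entities are available as objects under .value
$QueryHeaders = @{
    "x-ms-date" = (Get-Date -Format r);
    "Accept"    = "application/json;odata=nometadata"
}
$Entities = Invoke-RestMethod -Method GET -Uri $UserURI -Headers $QueryHeaders
$Entities.value | Format-Table PartitionKey, RowKey, Status, LastCheck
```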

Credit to a couple of Stack Overflow posts that were really helpful when I was trying to figure this out:

TCA Podcast Episode 11: "I'm not scared of what AI will do, but what humans will do with AI"

In this incredibly interesting episode, Warren and Nic catch up with Joe Baguley from VMware to talk what it's like speaking live in front of 90 million people and geek out about virtualization, edge computing and the endless possibilities enabled by AI.

For more information on The Cloud Architects podcast, check us out on SoundCloud. We are also available on your favorite podcast app:

My PowerShell scripts are now available on GitHub

Ever since I first started sharing scripts on my blog, I've had a bunch of people reach out to me with stories of how they've used my code in their projects or offering to collaborate with me on future versions. My intention has always been to make my scripts easily downloadable and ready to run - which is one of the reasons I sign the code with a code signing certificate - but I've come to realize that there is no reason I couldn't do both.

Over time my own development and version control methodologies have matured and I've been very successfully using git to manage my own projects. After a recent conversation with fellow MVP Michel de Rooij about his use of GitHub as a repository for his scripts, I decided to follow suit and have created a public GitHub repository for each of my scripts. You can find these here.

I will continue to make code signed versions of my scripts available in the TechNet gallery for those who prefer to just download and use them. Links to those are below:

TCA Podcast Episode 7: “The cloud can make a real fool of you”

Change can be difficult for any organization. Changing the working culture of your user population and driving product adoption can seem overwhelming, especially when applying traditional principles and adoption practices to today's cloud world. Nic and I sat down with Patience Wootton from Dentsu Aegis Network at Ignite recently to understand what it means to be an Office 365 Product Owner and talk about change, driving user adoption and being an early Microsoft Teams adopter.

For more information on The Cloud Architects podcast, check us out on SoundCloud. We are also available on your favorite podcast app:

TCA Podcast Episode 6: "I like OneDrive, but…"

OneDrive for Business has come a long way since it was initially launched and has earned its place as a leader in Gartner's Magic Quadrant for Content Collaboration Platforms. Nic and I sat down with Stephen Rose from the SharePoint/OneDrive team at Ignite recently to talk about some of the exciting new things that will be coming to OneDrive soon.

For more information on The Cloud Architects podcast, check us out on SoundCloud. We are also available on your favorite podcast app:

The Cloud Architects Podcast

Better late than never.. I'm delighted to announce that I'll be joining The Cloud Architects Podcast as a co-host. Started by fellow MVPs Nicolas Blank and Warren du Toit, the Cloud Architects podcast is about best practice, guidance, news and cutting edge Microsoft cloud technologies - I'm really excited to be involved.

Microsoft Ignite gave us a great opportunity to connect with and talk to some really interesting people - those episodes will be published soon, but in the meantime, we recently recorded an introductory episode where we talk about some of the complexities of adopting cloud technologies at scale:


For more information about the podcast, visit us on the web or twitter. We are still getting started and experimenting with show formats, etc. and would love feedback so feel free to get in touch and let us know how we are doing.

Office Servers and Services MVP 2017!

The last few weeks have been incredibly busy.. I spent a week in Europe and presented a couple of sessions at Office 365 Engage in The Netherlands, and after a quick laundry stop at home I headed off to Washington D.C. where I presented two sessions at a small technology conference for overseas schools (K-12). That's two conferences and four sessions in two weeks - I was pretty exhausted at the end of it all but was super excited when I received an email from the Microsoft MVP program informing me that I had been awarded my fourth Microsoft Most Valuable Professional (MVP) Award.

I’m truly honored to be part of this great community and have some ambitious plans for the next year, so stay tuned!