The telecommunications industry has come a long way since the days of manually operated telephone exchanges. We caught up with fellow MVP Greig Sheridan late last year to talk about Skype for Business, Teams and the evolution in technology he has experienced in his 30+ years in the industry.
For more information on The Cloud Architects podcast, check us out on SoundCloud. We are also available on your favorite podcast app:
I love PowerShell and I really love to automate things! I recently started looking into leveraging Azure services for some automation tasks and discovered how powerful it could be. I also had a lot of fun doing it and wanted to share some of what I learned.
Azure Automation lets you run tasks or scripts on a schedule and is especially useful for any automation you might be doing with Office 365. Your code is stored in a Runbook (PowerShell or Python) and executed according to a schedule. Interacting with modules is a little different from working with your local PowerShell installation, but the module gallery makes it pretty simple. Getting started is easy. In this example, we'll automate a report in Exchange Online. First, you create an Automation Account:
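If you prefer to script the setup rather than use the portal, something like this should work (a sketch using the Az module; the resource group and account names are placeholders):

# Assumes the Az module is installed and you are signed in (Connect-AzAccount)
New-AzResourceGroup -Name "rg-automation" -Location "EastUS"
New-AzAutomationAccount -ResourceGroupName "rg-automation" `
    -Name "MyAutomationAccount" -Location "EastUS"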
Create a credential set for your Exchange Online credentials - remember what you call it. "TenantCreds" in my case.
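The credential set can also be created with PowerShell (a sketch; the username is a placeholder):

# Build a PSCredential and store it in the Automation account as "TenantCreds"
$User = "admin@contoso.onmicrosoft.com"
$Pass = Read-Host -Prompt "Password" -AsSecureString
$Cred = New-Object System.Management.Automation.PSCredential($User, $Pass)
New-AzAutomationCredential -ResourceGroupName "rg-automation" `
    -AutomationAccountName "MyAutomationAccount" -Name "TenantCreds" -Value $Cred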
Then create a new Runbook:
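Again, this can be done in the portal or scripted (the Runbook name here is a placeholder):

# Create an empty PowerShell Runbook
New-AzAutomationRunbook -ResourceGroupName "rg-automation" `
    -AutomationAccountName "MyAutomationAccount" `
    -Name "EXO-MailboxReport" -Type PowerShell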
Next it's time to add some PowerShell to the Runbook. Since we will be working in Exchange Online, we need to create and import that session. This is similar to working with Exchange Online sessions on your local machine, but you will notice that we don't need to include the credentials in the code; we simply reference the credential set we created earlier:
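Inside the Runbook, that looks something like this (a sketch using the classic remote PowerShell endpoint):

# Fetch the stored credential set by the name we gave it earlier
$Creds = Get-AutomationPSCredential -Name "TenantCreds"

# Create the Exchange Online session - no passwords in the code
$Session = New-PSSession -ConfigurationName Microsoft.Exchange `
    -ConnectionUri "https://outlook.office365.com/powershell-liveid/" `
    -Credential $Creds -Authentication Basic -AllowRedirection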
I had some errors when trying to import all of the Exchange Online cmdlets, so I limit the import to only the cmdlets I intend to use in the script. I also add a prefix of "EXO" to them, so they are used as follows:
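For example (the cmdlet list here is illustrative - include whichever cmdlets your script needs):

# Import only the cmdlets we need, prefixed with "EXO"
Import-PSSession $Session -CommandName Get-Mailbox, Get-MigrationUser `
    -Prefix EXO -AllowClobber | Out-Null

# The prefixed cmdlets are then called like this:
Get-EXOMailbox -ResultSize Unlimited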
Lastly, we need to create a schedule for the automation job:
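A sketch of the scheduling step (the schedule name and times are placeholders):

# Run daily at 06:00, starting tomorrow (StartTime must be in the future)
New-AzAutomationSchedule -ResourceGroupName "rg-automation" `
    -AutomationAccountName "MyAutomationAccount" -Name "Daily-0600" `
    -StartTime (Get-Date "06:00").AddDays(1) -DayInterval 1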
Once the schedule has been created, you can link it to the Runbook:
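Something like this (names match the examples above):

# Link the Runbook to the schedule
Register-AzAutomationScheduledRunbook -ResourceGroupName "rg-automation" `
    -AutomationAccountName "MyAutomationAccount" `
    -RunbookName "EXO-MailboxReport" -ScheduleName "Daily-0600"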
This is great if you need to perform tasks that don't generate any output, but what happens when something (e.g. a .CSV file) is generated? There are a couple of ways to deal with that. You could use the temp folder to store your data and then email it to yourself - just remember that data stored in the temp folder will not persist between jobs:
$TmpPath = $env:TEMP
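Building on that, here is a minimal sketch of the export-and-email approach (the addresses and SMTP settings are placeholders, and Get-EXOMailbox is the prefixed cmdlet from earlier):

# Export the report to the temp folder
$Report = Join-Path $TmpPath "MailboxReport.csv"
Get-EXOMailbox -ResultSize Unlimited |
    Select-Object DisplayName, PrimarySmtpAddress |
    Export-Csv -Path $Report -NoTypeInformation

# Email it to yourself before the sandbox is recycled
Send-MailMessage -To "me@contoso.com" -From "automation@contoso.com" `
    -Subject "Mailbox report" -Attachments $Report `
    -SmtpServer "smtp.office365.com" -Port 587 -UseSsl `
    -Credential (Get-AutomationPSCredential -Name "TenantCreds")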
Another way to deal with this data is to write it to Azure Storage. There is a PowerShell module available for Azure Storage that can be used with Azure Automation, but you can also use the REST APIs directly. Since I figured out how to use the API, it has become my go-to method because it is actually much faster, and I have also been able to use it in environments where it isn't possible to install modules.
The first thing we need to do is create a Storage Account in Azure:
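A sketch of that step (the account name must be globally unique - this one is a placeholder):

New-AzStorageAccount -ResourceGroupName "rg-automation" `
    -Name "mystorageacct123" -Location "EastUS" -SkuName Standard_LRS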
We then create a Shared Access Signature (SAS) for that Storage Account:
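This can be done in the portal, or with something like the following (a sketch; the permissions and expiry are examples only):

# Get the storage context, then generate an account-level SAS for the Table service
$Ctx = (Get-AzStorageAccount -ResourceGroupName "rg-automation" `
    -Name "mystorageacct123").Context
$SasToken = New-AzStorageAccountSASToken -Context $Ctx -Service Table `
    -ResourceType Service,Container,Object -Permission "racwdlu" `
    -ExpiryTime (Get-Date).AddYears(1)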
The result should look similar to this:
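(The values below are made up and the real signature is much longer, but the shape of the token is what matters:)

?sv=2017-07-29&ss=t&srt=sco&sp=racwdlu&se=2019-01-01T00:00:00Z&sig=XXXXXXXX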
In this example, we are going to store our script output in the Table Service, so we'll be using the Table Service REST API. When working with the Table Service it is important to understand tables, entities, system properties and other limitations, but for the purposes of this post I'm going to simplify things a little. Tables store data as collections of entities - entities are similar to rows and have a primary key and a set of properties. A property is similar to a column.
Each entity always has the following system properties:
PartitionKey
RowKey
Timestamp
Timestamp is managed automatically and isn't something you can change. The PartitionKey and RowKey are always required and are used for scalability and indexing of the content, so it is important to consider them when designing your table. Microsoft's Table service design guidance is really good reading on the subject. In this example, I'll be looking up the migration status of a mailbox in Exchange Online and inserting this data into a table. I'm going to use the "BatchID" as the PartitionKey and the "Status" as the RowKey. The table name in the example will be the "Alias" of the mailbox.
First, let's define the data we are going to insert. This could easily be driven by a Foreach() loop in a script or automation Runbook, but to keep it simple I'm just going to define the values manually in the example:
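Something like this (the values are made up; in a real Runbook they would come from cmdlet output, e.g. a Foreach() over Get-EXOMigrationUser):

# Sample values for a single mailbox
$Alias   = "jsmith"       # becomes the table name
$BatchID = "Batch01"      # becomes the PartitionKey
$Status  = "Synced"       # becomes the RowKey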
Next we will import this information. During the import, we'll first check to see if a unique table (named after the Alias) already exists. If one does, we'll insert the data; if one doesn't, we'll create it first:
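A sketch of those REST calls, reusing the $SasToken and account name from earlier (note that the Az cmdlet returns the token with a leading "?"):

$Account = "mystorageacct123"
$BaseUri = "https://$Account.table.core.windows.net"
$Headers = @{ Accept = "application/json;odata=nometadata" }

# 1. Check whether a table named after the alias already exists -
#    the GET throws a 404 if it doesn't, which we catch below
try {
    Invoke-RestMethod -Method Get -Headers $Headers `
        -Uri "$BaseUri/Tables('$Alias')$SasToken" | Out-Null
}
catch {
    # 2. Table not found - create it
    $Body = @{ TableName = $Alias } | ConvertTo-Json
    Invoke-RestMethod -Method Post -Headers $Headers `
        -ContentType "application/json" -Body $Body `
        -Uri "$BaseUri/Tables$SasToken" | Out-Null
}

# 3. Insert the entity - PartitionKey and RowKey are mandatory
$Entity = @{
    PartitionKey = $BatchID
    RowKey       = $Status
} | ConvertTo-Json
Invoke-RestMethod -Method Post -Headers $Headers `
    -ContentType "application/json" -Body $Entity `
    -Uri "$BaseUri/$Alias$SasToken" | Out-Null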
In this incredibly interesting episode, Warren and Nic catch up with Joe Baguley from VMware to talk about what it's like speaking live in front of 90 million people and geek out about virtualization, edge computing and the endless possibilities enabled by AI.
For more information on The Cloud Architects podcast, check us out on SoundCloud. We are also available on your favorite podcast app:
In this episode, we caught up with MVP Steve Goodman to discuss a simple approach to implementing Device Management and why it needn't be so complex.
For more information on The Cloud Architects podcast, check us out on SoundCloud. We are also available on your favorite podcast app:
Ever since I first started sharing scripts on my blog, I've had a bunch of people reach out with stories of how they've used my code in their projects, or offers to collaborate on future versions. My intention has always been to make my scripts easily downloadable and ready to run - one of the reasons I sign the code with a code signing certificate - but I've come to realize that there is no reason I can't do both: publish the source for collaboration and still offer signed, ready-to-run downloads.
Over time my own development and version control methodologies have matured and I've been using git very successfully to manage my own projects. After a recent conversation with fellow MVP Michel de Rooij about his use of GitHub as a repository for his scripts, I decided to follow suit and have created a public GitHub repository for each of my scripts. You can find them here.
I will continue to make code signed versions of my scripts available in the TechNet gallery for those who prefer to just download and use them. Links to those are below:
SharePoint can be intimidating, especially for those of us who don't work with it every day. In this episode, @WonderLaura tells us why SharePoint isn't so scary and we discuss some of the SharePoint announcements at Ignite.
For more information on The Cloud Architects podcast, check us out on SoundCloud. We are also available on your favorite podcast app:
Change can be difficult for any organization. Changing the working culture of your user population and driving product adoption can seem overwhelming, especially when applying traditional principles and adoption practices to today's cloud world. Nic and I sat down with Patience Wootton from Dentsu Aegis Network at Ignite recently to understand what it means to be an Office 365 Product Owner and talk about change, driving user adoption and being an early Microsoft Teams adopter.
For more information on The Cloud Architects podcast, check us out on SoundCloud. We are also available on your favorite podcast app:
OneDrive for Business has come a long way since it was initially launched and has earned its place as a leader in Gartner's Magic Quadrant for Content Collaboration Platforms. Nic and I sat down with Stephen Rose from the SharePoint/OneDrive team at Ignite recently to talk about some of the exciting new things that will be coming to OneDrive soon.
For more information on The Cloud Architects podcast, check us out on SoundCloud. We are also available on your favorite podcast app:
Better late than never... I'm delighted to announce that I'll be joining The Cloud Architects Podcast as a co-host. Started by fellow MVPs Nicolas Blank and Warren du Toit, The Cloud Architects podcast covers best practices, guidance, news and cutting-edge Microsoft cloud technologies - I'm really excited to be involved.
Microsoft Ignite gave us a great opportunity to connect with and talk to some really interesting people - those episodes will be published soon, but in the meantime, we recently recorded an introductory episode where we talk about some of the complexities of adopting cloud technologies at scale:
For more information about the podcast, visit us on the web or Twitter. We are still getting started and experimenting with show formats, and we would love feedback, so feel free to get in touch and let us know how we are doing.