Exchange Hybrid Deployment and Sizing

In September, I posted about the great new Office 365 Hybrid Configuration Wizard. While there is no question that the HCW is a great help when configuring hybrid deployments, there are a few other important considerations to take into account when deploying Exchange Hybrid. I've helped many organizations deploy hybrid configurations and move mailboxes to the cloud over the last few years, and I often come across the same questions and misconceptions, so I thought I'd address some of them in a blog post.

 

"Help!, I need to implement a hybrid server!"
That is not necessarily true. Exchange Hybrid is a configuration state and should not be thought of as a server role. A hybrid deployment uses existing Exchange workloads like Autodiscover and Exchange Web Services (EWS), so if you already have Exchange 2010/2013/2016 deployed according to best practices, chances are you already have everything you need to configure Exchange Hybrid. Sure, there is some additional functionality available if you use the most recent version of Exchange, but do you actually need that functionality? I've seen so many environments that have correctly sized and load balanced Exchange servers and then have a tiny virtual machine deployed as a "hybrid server". This type of configuration creates a single point of failure and inevitably becomes a migration bottleneck.

If you are looking to migrate from a legacy version of Exchange, you will need to implement additional servers in order to deploy Hybrid. For Exchange 2003, your only option for going hybrid is to introduce correctly sized Exchange 2010 servers. For those on Exchange 2007, it is recommended that Exchange 2013 is used instead.

 

"Can I virtualize my servers for Hybrid?" or "How do I size my servers for Hybrid?"
If you are in a position where you are looking to upgrade your Exchange organization prior to a migration to Exchange Online, or you need to implement new Exchange servers because you are on a legacy version, you can definitely make use of virtualization. Virtualization in the Exchange world has long been a hot topic and isn’t really something that I’ll get into in this post. In my experience, incorrectly configured or undersized virtual Exchange servers are by far the most common issue I’ve come across in the field, so it is often simpler to use physical hardware, which is also the recommended practice.

To illustrate this, here is an example of some actual performance data I gathered when working with a customer. This particular customer was migrating from Exchange 2007 and had implemented virtual Exchange 2013 servers. Everything worked great until they attempted to migrate several mailboxes at the same time and noticed that it was taking a considerable amount of time for even small mailboxes to migrate. After confirming that the issue wasn’t bandwidth related, we decided to take a closer look at the new virtual servers. These servers were sized with 4 CPU cores and 32 GB of RAM but didn’t appear to be performing correctly. Our initial performance tests indicated that the servers were CPU constrained, and after a lot of testing and much discussion with their virtualization team we found that simply changing the configuration from 2 sockets with 2 cores each (4 cores total) to 1 socket with 4 cores (still 4 cores) greatly improved the performance. The same 100 mailboxes were used in both tests:

CPU
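If you want to confirm how the hypervisor is actually presenting CPUs to a guest, a quick check from inside the virtual machine is enough. This is just a sketch using WMI; adjust it for your own environment:

[powershell]
# Each Win32_Processor instance represents a socket; NumberOfCores shows the cores per socket
Get-WmiObject Win32_Processor |
    Select-Object DeviceID, NumberOfCores, NumberOfLogicalProcessors |
    Format-Table -AutoSize
[/powershell]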

If you are planning to virtualize, make sure you follow Microsoft’s best practices for virtualizing Exchange and always use the Exchange Server Role Requirements Calculator to correctly size your deployments.

 

"The cloud is awesome, I plan to remove all my on-premises Exchange servers!"
There is no denying that moving to the cloud makes sense for a lot of organizations, and in many instances there is a desire to remove all on-premises workloads. I always advise my customers to be very careful when it comes to decommissioning their entire Exchange organization. When using directory synchronization with your Office 365 tenant, your users are synchronized from your on-premises Active Directory, and therefore most of the attributes associated with these users cannot be managed in Office 365 or Exchange Online and must be managed on-premises. Completely removing your on-premises Exchange organization makes managing mailbox attributes more difficult, so I would definitely recommend retaining at least one Exchange server for user object management. You don’t need to retain all your Exchange servers though, so in many environments there will still be a significant reduction in servers.
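To give a sense of what that on-premises management looks like in practice, something as simple as adding a proxy address to a synchronized mailbox is done from the on-premises Exchange Management Shell rather than in Exchange Online. A quick sketch (the identity and address below are placeholders):

[powershell]
# Run in the on-premises Exchange Management Shell - the change flows to Office 365 via directory synchronization
Set-RemoteMailbox -Identity "hsimpson" -EmailAddresses @{Add="homer.simpson@yourdomain.com"}
[/powershell]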

Retaining an on-premises Exchange server could also be really useful in SMTP relay scenarios where you have on-premises applications and devices that need to send email.
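If you do keep a server around for relay, it is worth testing that your application servers can still submit mail through it after the mailboxes have moved. A simple test message from PowerShell will do; the server name and addresses below are placeholders, and the sending machine's IP must be permitted on a receive connector that allows relay:

[powershell]
# Quick SMTP relay test from an application server
Send-MailMessage -SmtpServer "yourexchange.domain.com" -From "app-alerts@yourdomain.com" `
    -To "you@yourdomain.com" -Subject "SMTP relay test" -Body "Relayed via the on-premises Exchange server."
[/powershell]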

Microsoft also has a lot of great resources available to help answer your Hybrid questions; here are a few:

A look at the Microsoft Office 365 Hybrid Configuration Wizard

In case you missed it, Microsoft recently announced the Microsoft Office 365 Hybrid Configuration Wizard (HCW). The HCW has come a long way since it was first introduced in Exchange 2010 SP2; prior to that, configuring a hybrid deployment required ~50 manual steps. This is the third version of the HCW, and one of the most notable changes is that it is now a standalone application, which decouples it from the Exchange update cycle. The HCW downloads the latest version every time it is run, ensuring that the most current version is always used.

The new HCW is available for use with Exchange 2013 CU8 or higher; however, you will need to have Exchange 2013 CU10 installed if you would like to see the new HCW landing page (pictured below), which is accessible via the “Hybrid” menu item in your on-premises Exchange Admin Center:

cap1

Those using Exchange 2013 CU8 or CU9 can download the new HCW here.

The wizard itself will take you through a series of configuration pages very similar to the previous version; however, there are improvements to error handling and a number of changes under the covers that greatly improve performance and efficiency.

cap2

cap12

cap13

It is worth noting that the log file location has changed from “%ProgramFiles%\Microsoft\Exchange Server\V15\Logging\Update-HybridConfiguration” to “%AppData%\Microsoft\Exchange Hybrid Configuration”.

The Exchange Team has a detailed post about all the great new features in the Microsoft Office 365 Hybrid Configuration Wizard; I'd encourage you to check it out here.

Migrating Office for Mac users to Exchange Online

I recently worked with a customer whose workstation fleet is almost entirely made up of Macintosh computers. Users rely on Outlook for Mac 2011 and were being migrated to Exchange Online. We found that there wasn’t a great deal of documentation covering this scenario, so I wanted to post some of my notes here.

According to the “Office 365 system requirements” page on office.com, Office 365 is designed to work with the following software:

  • The current or immediately previous version of Internet Explorer or Firefox, or the latest version of Chrome or Safari.
  • Any version of Microsoft Office in mainstream support.

Microsoft Support Lifecycle indicates that you would need at least Microsoft Office for Mac 2011 SP3 and I can confirm after some testing that anything prior to Office for Mac 2011 SP2 simply won’t even connect. The latest update currently available for Microsoft Office for Mac 2011 will bring the version number to 14.5.4. There is of course also the new Office 2016 for Mac.

The user experience when migrating a Mac user is similar to what we experience in the Windows world, with one exception – no restart of Outlook is required. Autodiscover will detect the mailbox move and prompt the user to re-configure. Here are some examples of this prompt in Outlook for Mac 2011 and Outlook 2016 for Mac:

2011

2016

Not sure what version your clients have installed? You can use Log Parser Studio to parse the IIS logs on your CAS servers to help determine the versions you have out there.
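There is no single right way to do this, but even a rough PowerShell pass over the logs can give you a version breakdown. The following is only a sketch: it assumes the default IIS log location and that Outlook for Mac identifies itself with a “MacOutlook/&lt;version&gt;” user-agent string, so verify both against your own environment before relying on it:

[powershell]
# Count occurrences of each MacOutlook client version found in the IIS logs on a CAS server
$Versions = Get-ChildItem "C:\inetpub\logs\LogFiles\W3SVC1" -Filter *.log |
    Select-String -Pattern 'MacOutlook/[\d\.]+' -AllMatches |
    ForEach-Object { $_.Matches | ForEach-Object { $_.Value } }
$Versions | Group-Object | Sort-Object Count -Descending | Format-Table Count, Name -AutoSize
[/powershell]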

My Get-AZCopyGUI.ps1 script has been updated

At the end of May, I published my Get-AZCopyGUI.ps1 script which is a simple GUI wrapper for AZCopy.exe and helps simplify the process of importing .PSTs into Exchange Online. Today I published an updated version of the script which fixes a small bug and provides some additional functionality. The GUI now includes the following new options:

  • The ability to use the /Pattern switch - it will automatically be set to *.PST
  • The ability to choose a custom log location. If no location is selected, a log file named AzCopyVerbose.log will be created in the default location, which is %LocalAppData%\Microsoft\Azure\AzCopy

Get-AzcopyGUI

For more information about Get-AZCopyGUI.ps1, see this post. I have published it to the TechNet Gallery and it can be downloaded by clicking here…

Upgrading to Azure AD Connect

It’s here! Yesterday, Microsoft announced the general availability of Azure AD Connect (AAD Connect). Over the years the humble DirSync tool has evolved and AAD Connect is the simplest way yet to integrate on-premises AD identities with Azure Active Directory. There has been a lot of confusion out there about which tool to use for directory synchronization but Microsoft has been hard at work on one tool to rule them all and as noted on the Microsoft Azure site, this tool is AAD Connect:

Azure AD Connect incorporates the components and functionality previously released as Dirsync and AAD Sync. These tools are no longer being released individually, and all future improvements will be included in updates to Azure AD Connect, so that you always know where to get the most current functionality.

One of the great features of AAD Connect is that it will upgrade your existing deployment of DirSync or AAD Sync; it’s a simple wizard, just 5 clicks away!

2

Click here for more Azure AD Connect information and resources.

Script: Get-AZCopyGUI.ps1 - AZCopy GUI for PST Import

Microsoft recently announced the new Office 365 Import Service which is currently in preview. The new service allows organizations to import legacy PST data using one of two methods:

  • Drive shipping – you send Microsoft a hard drive with your data.
  • Network Upload – you make use of Azure storage to upload your data to Office 365.

The network upload option makes use of the Microsoft Azure AZCopy tool which uploads your data to an Azure storage blob. Brian Reid has a great post on his blog about using the service.
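To give an idea of what is happening under the hood, an upload with AZCopy itself looks roughly like the following. This is illustrative only; the destination URL, storage key and paths are placeholders, and the exact switches may vary with the AZCopy version you have installed:

[powershell]
# Classic AZCopy syntax - /Pattern limits the upload to .pst files, /V writes a verbose log
& "C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy\AzCopy.exe" `
    /Source:"D:\PSTExport" `
    /Dest:"https://yourstorageaccount.blob.core.windows.net/ingestiondata" `
    /DestKey:"<your-storage-account-key>" `
    /Pattern:"*.pst" `
    /V:"$env:LocalAppData\Microsoft\Azure\AzCopy\AzCopyVerbose.log" /Y
[/powershell]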

Get-AZCopyGUI.ps1 is a GUI wrapper for the Microsoft Azure AZCopy tool (AZCopy.exe) to simplify the process of importing .PSTs into Exchange Online. To use the script, you need to locate your storage account key and upload URL. You can do this by accessing the “Import” menu item in the Office 365 Admin Center. To access this option, you need to be assigned the Mailbox Import Export role in Exchange Online.

post1

For more information on the Office 365 Import Service, see the following TechNet documentation.

Requirements:

  • The script will work natively in PowerShell 2.0+
  • The script requires the Microsoft Azure AZCopy Tool installed in its default location - get it here

Usage:
There are no parameters or switches, simply execute the script:

[shell].\Get-AZCopyGUI.ps1[/shell]

When using the Verbose option, a log file named AzCopyVerbose.log will be created in %LocalAppData%\Microsoft\Azure\AzCopy if no "Log Location" is specified.

Execution Policy:
The script has been digitally signed and will run just fine under a "RemoteSigned" execution policy.

Screenshots:

Get-AzcopyGUI

Download:
I have published it to the TechNet Gallery and it can be downloaded by clicking here...

Using PowerShell to bulk email your users

I was recently working on a migration project with a customer and volunteered to help find a solution to a challenge the Organizational Change Management (OCM) team was facing. The OCM team had been communicating with the business to keep them informed about the upcoming changes and what impact these changes would have on their day-to-day operations. This communication had all taken place via email, and they now needed to send out a new notification to several thousand users that contained specific information that would be different for each recipient. Since this was a single-use scenario and all recipients were internal users, they were not terribly interested in investing in a third-party application or service to do this, so we decided to explore other options.

Inspired by Pat Richard’s “New-WelcomeEmail.ps1” script, I figured it would be pretty easy to achieve this using a PowerShell script and a .CSV input file, and it works great! To illustrate, I’ll use the fictitious example of gooseLabs, Inc., which is relocating to a new office building and would like to send a notification email to all users containing their new desk location and phone number. The first and most important step is to ensure that you have an accurate input file. For this particular scenario, the input file looks something like this:

1
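If you prefer it in text form, the columns the script expects would look something like this. The sample values are made up; the column names match what the script reads below:

[powershell]
# Hypothetical sample data - the headers correspond to $User.Email, .FirstName, .Level, .DeskNumber and .PhoneNumber
@"
Email,FirstName,Level,DeskNumber,PhoneNumber
hsimpson@gooselabs.net,Homer,3,3-042,555-555-4321
mbouvier@gooselabs.net,Marge,2,2-017,555-555-8765
"@ | Set-Content UserList.csv
[/powershell]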

Once we have our input file created, we can use the magic of PowerShell to generate our notifications using this data. The following script imports the .CSV file and generates a simple email notification that is then sent to everyone in the list:

[powershell]
# Function to create report email
function SendNotification{
$Msg = New-Object Net.Mail.MailMessage
$Smtp = New-Object Net.Mail.SmtpClient($ExchangeServer)
$Msg.From = $FromAddress
$Msg.To.Add($ToAddress)
$Msg.Subject = "Announcement: Important information about your office relocation."
$Msg.Body = $EmailBody
$Msg.IsBodyHTML = $true
$Smtp.Send($Msg)
}

# Define local Exchange server info for message relay. Ensure that any servers running this script have permission to relay.
$ExchangeServer = "yourexchange.domain.com"
$FromAddress = "Office Relocation Team "

# Import user list and information from .CSV file
$Users = Import-Csv UserList.csv

# Send notification to each user in the list
Foreach ($User in $Users) {
$ToAddress = $User.Email
$Name = $User.FirstName
$Level = $User.Level
$DeskNum = $User.DeskNumber
$PhoneNum = $User.PhoneNumber
$EmailBody = @"



Dear $Name,

As you know we will be relocating to our new offices at 742 Evergreen Terrace, Springfield on July 1, 2015. This email contains important information to help you get settled as quickly as possible.

Your existing access card will grant you access to the new building and your desk location is as follows:

Level: $Level
Desk Number: $DeskNum
Phone Number: $PhoneNum

Your new phone will be connected and ready for use when you arrive.

If you require any assistance during the move please contact the relocation helpdesk at relocation@gooselabs.net or by calling 555-555-1234

Regards,

Office Relocation Team



"@
Write-Host "Sending notification to $Name ($ToAddress)" -ForegroundColor Yellow
SendNotification
}
[/powershell]

2

The resulting notification looks like this:

3

4

This method could also be used to distribute other information, like temporary passwords prior to a Cutover Exchange Migration.

Using Azure Scheduler to feed your pets (sort of)

One of the things I learned shortly after we adopted our cat Kensington is that he doesn’t care too much about my sleeping habits; if he’s hungry, he’ll happily let us know. I don’t have kids, and having to wake up in the early hours of the morning to feed a hungry cat was a troubling proposition, so I thought I'd find a better way. After some thought, I decided we needed an automated pet feeder, but I wasn’t able to find a commercially available one that met my criteria, so I had to make my own, which ended up being a really fun project.

My pet feeder had the following requirements:

  • It had to be internet enabled
  • It needed Wi-Fi connectivity so it could be more portable
  • It needed to be small and not an eyesore
  • It needed to support scheduling as well as manual operation

I found a few ideas online (it seems my situation isn’t unique!) but again, none of them met my criteria. After a lot of thought, I settled on a design that I thought would be functional and give me the flexibility I was looking for, and off I went to Home Depot to get the materials. I ended up with a few lengths of PVC piping of varying sizes and a polycarbonate sheet. For the electronic brain of the feeder I used the following:

  • 1 x Spark Core (Wi-Fi development board)
  • 1 x Continuous Rotation Servo
  • 1 x Green LED (Feed indicator)
  • 1 x Red LED (Power indicator)
  • 1 x Glue stick (to create a cheap linear actuator)
  • 2 x resistors, some leads and a mini breadboard

The idea is that cat food is deposited in the top part of the feeder; the Spark Core drives the actuator, which lowers and then raises a platform inside the main pipe chamber. Lowering the actuator exposes the ‘elbow’ opening in the middle section of the feeder, and the food then falls down the smaller section of pipe and is deposited in the bowl. I wanted to ensure that only 1-2 cups of food is dispensed, and was able to do this by tweaking how wide an opening is created when the actuator is lowered and how long it is kept open. To remove any guessing, I added some LEDs to show me what was happening at all times. The red LED is always lit when the feeder is powered on, and the green LED only lights up during a feed operation. I played around with various designs for the platform that moves up and down inside the main chamber and ended up creating an angled platform. Here are some pictures of the feeder:

DPP_0002

DPP_0003

DPP_0005

Operating the feeder is done by interacting with the Spark Core via the Spark API. Before this can be done, the board first has to have the relevant operating instructions flashed to it and I put together the following to achieve this:

[shell]
Servo myServo;  // create servo object to control a servo
int redLed = D0; // red power LED on pin D0
int greenLed = D1; //green feed LED on pin D1

void setup()
{
myServo.attach(A0);  // attaches the servo on the pin A0

Spark.function("feed", feedKenny);   // register the function

pinMode(redLed, OUTPUT);
pinMode(greenLed, OUTPUT);
}

void loop()
{
digitalWrite(redLed, HIGH); // red LED always on
}

int feedKenny(String command)  //called upon a matching POST request
{
digitalWrite(greenLed, HIGH); // turn on green LED when feeding
myServo.writeMicroseconds(1000);
delay(7500);
myServo.writeMicroseconds(1500);
delay(2500);
myServo.writeMicroseconds(2000);
delay(7500);
myServo.writeMicroseconds(1500);
digitalWrite(greenLed, LOW); // turn off green LED
return 200;
}
[/shell]

Once the Spark Core is connected to the Wi-Fi network and the application code has been flashed you can interact with it using a regular HTTP POST. For testing, you can use PowerShell and Invoke-RestMethod to do this:

[ps]
Invoke-RestMethod -Uri 'https://api.spark.io/v1/devices/YOUR-DEVICE-ID/FUNCTION?access_token=YOUR-TOKEN' -Method Post
[/ps]

pscap

PowerShell is great for testing but not very practical for daily use; fortunately, this is easily remedied with a few lines of HTML. I created a very simple form and put it on a web server on my home network. This page is not secured and is only accessible when I’m on my home network.

wp_ss_20150331_0002

Scheduling regular feeding times is really easy with Azure Scheduler. Sign in to your Azure subscription and click ‘Scheduler’

sched1

Create a new Job Collection and give it a meaningful name

image

The wizard will also give you the option of creating your first Job. Since the Spark API requires a HTTPS POST, be sure to select that option and schedule it accordingly

sched2

sched3

Once the wizard completes, you can add more Jobs to the Job Collection as appropriate:

sched4

Here is a quick demonstration of how the feeder works:

[embed]https://www.youtube.com/watch?v=O226R01mmD0[/embed]

Using a certificate to encrypt credentials in automated PowerShell scripts

PowerShell is a great way to help automate frequent or repetitive tasks, and every now and then these tasks require some form of authentication. You could just store the service account password in the script, but I’m really not a fan of doing that and I’m sure you’d agree it really isn’t a very good way to do it. I was working on a script recently which was to be scheduled to run at various times by different service accounts on different servers, so I wanted a way to ensure that a single copy of the script could be portable to any server and would still securely connect to Exchange with the correct permissions no matter which service account actually executed the script.

Remotely connecting to Exchange/Exchange Online via PowerShell isn’t difficult to do, and you could just use the Get-Credential cmdlet with ConvertFrom-SecureString and Set-Content to securely save your password to a file which could then be read by your script without subsequent intervention. The trouble with this solution is that it isn’t very portable: that password can only be read by the user that created the file, so it works great if you use it only on your own machine, but not so well when trying to distribute it to a bunch of servers as a scheduled job.

I wanted a solution that would use a particular certificate to decrypt a password stored in the script; that way, if the script was executed on a machine that did not have my certificate installed, it would not be able to decrypt the password and would fail. I ended up creating an encrypted password using the public key of a certificate and storing that in the script, and the only way to decrypt that password is with the private key of the same certificate. Let’s look at this in more detail.

The first component of the solution is a certificate. Since I already had access to an internal Windows CA, I wanted to use a certificate signed by that CA, but I also tested it with a self-signed certificate that was generated using “makecert.exe”. PowerShell 4.0 includes a New-SelfSignedCertificate cmdlet that makes generating a self-signed certificate really easy, but for some reason I wasn’t able to use one of those certificates for encryption (more specifically, the decryption would not work), and since I planned to use a CA-signed certificate anyway I didn’t spend a whole lot of time trying to figure it out. The key thing to remember is that you need to install the certificate AND the private key; the certificate doesn’t have to be trusted. I decided to create a new certificate template for my “Script Authentication” certificate by duplicating the “Web Server” template and making a few changes to it.
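If you don’t have a CA handy and want to experiment with makecert.exe, a command along these lines should produce a suitable self-signed certificate with an exportable private key. This is just a sketch; the subject name is simply the one I used, and the path to makecert.exe will depend on which SDK or Visual Studio version it came from:

[powershell]
# -r = self-signed, -pe = exportable private key, -sky exchange = key can be used for encryption,
# -ss My -sr LocalMachine = place it in the Personal store of the local computer
& makecert.exe -r -pe -n "CN=PowerShell Automation" -sky exchange -ss My -sr LocalMachine
[/powershell]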

Firstly, launch the Certification Authority MMC, locate “Certificate Templates”, right-click and select “Manage”

Capture1

Next, locate the “Web Server” template, right-click and select “Duplicate Template”

Capture2

I called my new template “Script Authentication”

Capture3

And you need to ensure that you “Allow private key to be exported”. Once done, apply those changes.

Capture4

We then need to publish the new template to ensure that it can be used when requesting a new certificate. Right-click “Certificate Templates”, select “New” and then “Certificate Template to Issue” and select your newly created template from the list

Capture5

Once done, you should be able to create and submit an advanced certificate request using the newly created template directly on your CA. You are not required to complete all the fields, but it is useful to give your certificate a descriptive name. I called mine “PowerShell Automation”.

Capture6

Once you have installed the certificate, you can export it (don’t forget the private key) for use on other machines.

Capture8

I recommend storing it in a safe place and not marking the private key as exportable when moving it around; that way you have some control over which machines can actually decrypt the password in your script.

Capture9
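If you would rather script the export than click through the wizard, something along these lines should work on Windows 8/Server 2012 or later. It’s a sketch only, and the file path is a placeholder:

[powershell]
# Export the certificate together with its private key to a password-protected .pfx file
$Cert = Get-ChildItem Cert:\LocalMachine\My | Where-Object {$_.Subject -like "CN=PowerShell Automation*"}
$PfxPassword = Read-Host -AsSecureString -Prompt "PFX password"
Export-PfxCertificate -Cert $Cert -FilePath "C:\Temp\PowerShellAutomation.pfx" -Password $PfxPassword
[/powershell]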

Here’s what my certificate looks like

Capture7

I decided to store my certificate in the computer store; it probably doesn’t matter where you store it, but you would need to update the following PowerShell cmdlets appropriately. If you are going to have multiple service accounts executing your script, you need to ensure that all of these accounts have permission to read the private key. This is done by right-clicking the certificate, selecting “All Tasks” and then “Manage Private Keys”.

We can use the Get-ChildItem cmdlet to locate our certificate:

[powershell]Get-ChildItem Cert:\LocalMachine\My | Where-Object {$_.Subject -like "CN=PowerShell Automation*"}[/powershell]

Next, I need to encrypt my password. To do this, I define the password as a variable, encode it and then encrypt the encoded password using my certificate’s public key:

[powershell]
$Cert = Get-ChildItem Cert:\LocalMachine\My | Where-Object {$_.Subject -like "CN=PowerShell Automation*"}
$Password = 'MyPassword'
$EncodedPwd = [system.text.encoding]::UTF8.GetBytes($Password)
$EncryptedBytes = $Cert.PublicKey.Key.Encrypt($EncodedPwd, $true)
$EncryptedPwd = [System.Convert]::ToBase64String($EncryptedBytes)
[/powershell]

Capture10

Now that I have the encrypted password, I can store it in my script and decrypt it using my certificate’s private key each time the script is executed. To do this, I pretty much reverse the process:

[powershell]
$EncryptedPwd = "ts32rCLLdZl3/6wINHtLD6bQO65ub….. "
$EncryptedBytes = [System.Convert]::FromBase64String($EncryptedPwd)
$DecryptedBytes = $Cert.PrivateKey.Decrypt($EncryptedBytes, $true)
$DecryptedPwd = [system.text.encoding]::UTF8.GetString($DecryptedBytes)
[/powershell]

You can build this into any scripts you have that currently require credentials; it works great for automating Office 365/Exchange Online scripting. To illustrate this, I put together a quick (and dirty!) script that provides an automated daily “Top Mail Recipient” report via email. The script can be scheduled to run daily: it connects to Exchange Online, generates a list of the top mail recipients and emails that report to the address you specify. It’s not very useful as it is, but it does show how easily you could automate things using PowerShell and serves as a great example of certificate-based password encryption.

[powershell]
# Function to create report email
function SendReport{
$Msg = New-Object Net.Mail.MailMessage
$Smtp = New-Object Net.Mail.SmtpClient($ExchangeServer)
$Msg.From = $FromAddress
$Msg.To.Add($ToAddress)
$Msg.Subject = "Top Mail Recipient Report for $Date"
$Msg.Body = $EmailBody
$Msg.IsBodyHTML = $true
$Smtp.Send($Msg)
}

# Define local Exchange server info for message relay. Ensure that any servers running this script have permission to relay.
$ExchangeServer = "yourexchange.domain.com"
$FromAddress = "Office 365 Reports "
$ToAddress = "you@yourdomain.com"

# Some basic HTML styling (the original inline CSS was lost when this post was published; this is a minimal stand-in)
$Header = "<style>BODY{font-family:Segoe UI,Arial,sans-serif;font-size:10pt} TABLE{border-collapse:collapse} TH,TD{border:1px solid #999;padding:4px}</style>"

# Connect to Exchange Online
# First decrypt the password using the certificate
$Cert = Get-ChildItem Cert:\LocalMachine\My | Where-Object {$_.Subject -like "CN=PowerShell Automation*"}
$EncryptedPwd = "ts32rCLLdZl3/6wINHtLD6bQO65ubeQ3sHj9zXbhsaQDjihQmdyoja+iL0NGXQX0DicQdXWQRu+P8dSy96ux1tLQR9ZT8WPRq8rHsR3gNXDmipCK/4CHoc5Ki7nbMKUSReprtIrnwjlXZNBocTzurBQ+LtAHvAYipD37AXVjjpwwwqud5HCXk+E4OrJGe+yIx/87neRAunqdKvyuaxUYaxeBdx2R/hpLZhxywinjjVMx+0N2RNk7H3fBEite7uuANcAg+ElAssi4DAQYYDOviIrvbjdpKogKcevAh5xEx4Wm2WBzM5XqXmj1O9TUzB9BOiUVQhDwwqCcUpb2bTNW7g=="
$EncryptedBytes = [System.Convert]::FromBase64String($EncryptedPwd)
$DecryptedBytes = $Cert.PrivateKey.Decrypt($EncryptedBytes, $true)
$DecryptedPwd = [system.text.encoding]::UTF8.GetString($DecryptedBytes) | ConvertTo-SecureString -AsPlainText -Force
# Then define Credentials and create session
$Username = "account@yourdomain.onmicrosoft.com"
$Credential = New-Object System.Management.Automation.PSCredential ($Username,$DecryptedPwd)
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/ -Credential $Credential -Authentication Basic -AllowRedirection
Import-PSSession $Session

# Generate report data
$Date = Get-Date -DisplayHint Date
# The HTML heading markup around the report title was stripped when this post was published; reconstructed here
$EmailBody = Get-MailTrafficSummaryReport -Category TopMailRecipient |
    Select-Object @{expression={$_.C1};label="User"}, @{expression={$_.C2};label="Item Count"} |
    ConvertTo-Html -Head $Header -Body "<h2>Top Mail Recipient Report</h2>"

# Send report
SendReport
[/powershell]

Here is an example of what the final result looks like:

Capture11
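Since the whole point is unattended execution, the last step is usually to schedule the script. One way to do that is with the ScheduledTasks module available on Windows Server 2012/Windows 8 and later; this is only a sketch, and the script path, task name and service account are placeholders:

[powershell]
# Register a daily task that runs the report script as a dedicated service account
# (the account's password is only required at registration time - supply it per your own standards)
$Action  = New-ScheduledTaskAction -Execute "powershell.exe" -Argument '-NoProfile -ExecutionPolicy RemoteSigned -File "C:\Scripts\TopMailRecipientReport.ps1"'
$Trigger = New-ScheduledTaskTrigger -Daily -At 6am
Register-ScheduledTask -TaskName "Top Mail Recipient Report" -Action $Action -Trigger $Trigger -User "DOMAIN\svc-reports" -Password "ServiceAccountPassword"
[/powershell]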