Learn how to Implement Office 365 With Our New Online Course!

Check out the latest online, on-demand offering from Opsgility by Opsgility author Ben Stegink:

Implementing Office 365 – Requirements and Getting Started

This course introduces Microsoft’s Office 365 offering and the core services within it. It begins by walking you through some of the planning and considerations that go into creating your tenant, and then covers all the steps of getting your tenant up and running. This course will get you well on your way to earning your MCSA: Office 365; the outline is heavily geared toward the topics covered in Microsoft’s 70-346 exam.

You will find all the details for this course by clicking on the hyperlink above or by visiting https://www.opsgility.com/courses.

Trying out Backup and Site Recovery (OMS): Lessons Learned

Summary of the items I learned:

  • Reminder: Azure changes often
  • Register Resource Providers
  • Chocolatey is a cool tool
  • ARMClient is a cool tool, too

For details keep reading below!

I wanted to take a little time to share some of the things I’ve learned as I have started to use the new items in Azure Resource Manager, specifically the Backup and Site Recovery (OMS) feature that is now generally available (GA). The first thing I learned is that I need to be flexible when looking for an item in ARM, because of Rule #1: things change rapidly in Azure.

Originally, I had been evaluating this item in ARM as Recovery Services, but it was recently renamed to Backup and Site Recovery (OMS). Okay, that’s maybe not a huge revelation to many of you, but a gentle reminder to all of us: don’t get too used to things in Azure being static. It is a dynamic environment to be sure!

The first thing I ran into was that in my MSDN Subscription, the Location dropdown was actually blank. What? Why can’t I deploy a Recovery Services Vault in ARM with my MSDN Subscription? I had another subscription that I use for some of my training courses, and with that one I saw Location populated with all the regions in which I could create a Recovery Services Vault.

After some investigation, trying other subscriptions, and reaching out to peers (Thank you, @mscloud_stever!), it was discovered that the Resource Provider for Recovery Services was not registered. Even though I had Site Recovery registered, that was not enough to display the Location regions for my Recovery Services Vault. So, learn from my discovery: make sure to register the Recovery Services provider namespace.

To check your Registered Providers, run the following command in PowerShell:

Get-AzureRmResourceProvider -ListAvailable | Where-Object -FilterScript { $PSItem.RegistrationState -eq 'Registered'; }

Here is the sample output of this command:


If you look closely, Microsoft.RecoveryServices is not in the list. If this is the case for you, run the following command to register this Provider for your subscription:

Register-AzureRmResourceProvider -ProviderNamespace Microsoft.RecoveryServices -Verbose

Once this completes, run the command to list out the registered providers again, and you should now see it registered like the screen shot below:


Once you see this, you may need to refresh the portal or log out and log back in for the region options to appear in Location. NOTE: Registration is usually immediate, but mine took about an hour to show correctly, so you may need to wait a bit for the portal to recognize the registration.

After doing this, I could see the regions and deploy a Recovery Services Vault without issue from within the ARM portal. Now on to some of the other things I learned.
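If you’d rather script the registration and the wait instead of watching the portal, here is a minimal sketch (assuming the AzureRM PowerShell module; the 15-second polling interval is my own choice):

```powershell
# Register the Recovery Services provider, then poll until it reports 'Registered'
Register-AzureRmResourceProvider -ProviderNamespace Microsoft.RecoveryServices -Verbose

Do {
    Start-Sleep -Seconds 15
    $state = Get-AzureRmResourceProvider -ProviderNamespace Microsoft.RecoveryServices |
        Select-Object -ExpandProperty RegistrationState -First 1
    Write-Host "Current state: $state"
} Until ($state -eq 'Registered')
```

Even after the provider shows Registered, remember that the portal may still need a refresh (or a fresh login) before the Location list populates.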

After registering a Windows 10 client for Files and Folders backup, an Azure VM for backup, a System Center VMM server for protection, and a VMware vCenter Server for protection, I performed backups, test failovers, and so on. All of this is great stuff in the ARM portal: seeing the backups, replicating VMs, testing failovers individually, and more. Then I started the process of removal and deletion to clean up my environment.

All was great until, in the process of deleting the Resource Group, the Site Recovery Vault would not delete. Nothing I attempted allowed me to remove it. It kept telling me that there were registered servers in the vault that I needed to remove before attempting the deletion of the vault.

In the Azure portal, I could find no servers listed in any area of the vault. I tried many things, but nothing I did would show a server still registered in the vault for me to delete. I opened a support incident in Azure and was contacted by Microsoft. They gave me an ARMClient command to run to clean the vault. You might be asking yourself, as I did, “How is this done?” So, read on!

First, if you have not done so, install Chocolatey on your Windows machine. If you’re not familiar with Chocolatey, installation instructions are at https://chocolatey.org/. From the website: “Chocolatey NuGet is a Machine Package Manager, somewhat like apt-get, but built with Windows in mind.”

Apt-get for Windows? Cool! So, I browsed to the homepage and found the PowerShell command to download and install it via the easy install on the main page. Bingo! Once you install Chocolatey, check out the packages you can install: https://chocolatey.org/packages – You mean I can install the Sysinternals tools with just a command line!? C:\> choco install sysinternals – Okay, I’m easily distracted. Back to the topic at hand.
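For reference, at the time of writing the easy-install command from the Chocolatey home page looks like the following (check chocolatey.org for the current command before running it; run from an elevated PowerShell prompt):

```powershell
# Chocolatey's documented one-line install
Set-ExecutionPolicy Bypass -Scope Process -Force
iex ((New-Object System.Net.WebClient).DownloadString('https://chocolatey.org/install.ps1'))
```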

Now that I’ve installed Chocolatey, which I hope you’ll agree is a cool tool, I need to install ARMClient. ARMClient is a simple command-line tool to invoke the Azure Resource Manager API. At the Chocolatey package page, type ARMClient in the Search Packages prompt and click the magnifying glass to perform the search. See the screenshot below:


Once you find the results, click on ARMClient 1.1.1 to read about the package and the commands to install and upgrade it. NOTE: The version number may change if there are updates after this article is published.

Once I ran choco install armclient and ARMClient was installed, I typed ARMClient at the command prompt, which showed me the initial help screen and some sample commands. See the screenshot below:


After I ran ARMClient login and entered my credentials for my Azure subscription, I successfully authenticated. ARMClient enumerated my tenants and the subscriptions associated with those tenants in the resulting output. Now comes the fun part. I had been given a command to try to remove the registered servers in the vault. I had to gather some of the information specific to my situation, but here is the command that was useful for me:

ARMClient.exe delete subscriptions/<subscriptionId>/resourceGroups/<resourceGroupName>/providers/Microsoft.RecoveryServices/Vaults/<vaultName>/backupContainers/<serverFQDNname>/UnRegisterContainer?api-version=2015-03-15 -verbose

The placeholder values in angle brackets are the edits I needed to make, filling in the information corresponding to my specific subscription, resource group, vault, and server FQDN. After modifying the command with my info, I was able to run it, and afterward I could delete the vault without issue. See example output (with some information redacted for privacy) in the screenshot below:
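If it helps, here is one way to assemble that command with PowerShell variables instead of hand-editing the URI. This is a sketch assuming the AzureRM module (for Get-AzureRmContext); the resource group, vault, and server names below are examples, not values from my environment:

```powershell
# Build the ARMClient delete URI from your own values (example names, replace them)
$subId  = (Get-AzureRmContext).Subscription.SubscriptionId
$rg     = "MyRecoveryRG"
$vault  = "MyVault"
$server = "myserver.contoso.com"

$uri = "subscriptions/$subId/resourceGroups/$rg/providers/Microsoft.RecoveryServices" +
       "/Vaults/$vault/backupContainers/$server/UnRegisterContainer?api-version=2015-03-15"
ARMClient.exe delete $uri -verbose
```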


So, all in all, this was a great learning exercise for me. Not only did I learn a lot about Recovery Services, Backup, ARM, and Resource Providers, I also learned a few cool new tools: Chocolatey and ARMClient. I hope you find this information useful and learn along with me!


Automate finding a unique Azure storage account name (ARM)

When building ARM-based virtual machines in Azure with PowerShell, one important step involves creating a storage account. And because the storage account name must be unique within Azure, you first need to find an unused name. To accomplish this we are still using the Service Management cmdlet, Test-AzureName.

The last time I was working through this process I thought it would be helpful to let PowerShell find a unique name for me, retrying as needed until a usable name was determined. I looked around and to my surprise I could not find something that was already written to accomplish this. I’m sure it’s out there, but I didn’t find it…so I wrote the following. I hope it will be useful for others.

This code just concatenates a string of your choice with a random number between 0 and 9999. This ‘potential’ name is evaluated for uniqueness in Azure. If it is unique it is used. If not, a new random number is generated, concatenated on the end of the string, and another check is made. This continues until an acceptable name is found. Then the storage account is created.

Note, you will first need to authenticate via the Service Management framework so you can run the ‘classic’ command to check for a unique name.

# Login to the SM framework to run Test-AzureName
Add-AzureAccount

Next, update the $saPrefix variable to reflect the string you want in front of the random number.

# Storage account names must be between 3 and 24 characters in length and may contain numbers and lowercase letters only.
$saPrefix = "opstrainingsa"

Next, populate the name of the destination resource group (assumes it already exists) and the $location and $saType variables as desired.

$rgName = "MyResourceGroup"
$location = "South Central US"
$saType = "Standard_LRS"

Finally, login to the Resource Manager framework. As always, be sure you are focused on the correct subscription using Get-AzureRmSubscription to list available subscriptions and Set-AzureRmContext with the -SubscriptionId parameter to target the right subscription.

# Login to Resource Manager framework
Login-AzureRmAccount
# Load your subscriptions
Get-AzureRmSubscription
# Select the correct subscription to work with
Set-AzureRmContext -SubscriptionId <Your Subscription ID Here>

The rest of the code should run without alteration.

$randomnumber = Get-Random -Minimum 0 -Maximum 9999
$tempName = $saPrefix + $randomnumber
$var1 = Test-AzureName -Storage -Name $tempName

If ($var1 -ne $False) {
    Do {
        $randomnumber = Get-Random -Minimum 0 -Maximum 9999
        $tempName = $saPrefix + $randomnumber
        $var1 = Test-AzureName -Storage -Name $tempName
    } Until ($var1 -eq $False)
}
$saName = $tempName
New-AzureRmStorageAccount -ResourceGroupName $rgName -Name $saName -Type $saType -Location $location
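If you’d like the name-finding logic as a reusable helper, one way to package it is a small function (the function name here is my own):

```powershell
function Get-UniqueStorageAccountName {
    param([string]$Prefix)
    # Test-AzureName returns $true when the name is taken, so keep trying until $false
    do {
        $candidate = $Prefix + (Get-Random -Minimum 0 -Maximum 9999)
    } until (-not (Test-AzureName -Storage -Name $candidate))
    return $candidate
}

$saName = Get-UniqueStorageAccountName -Prefix $saPrefix
```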

Azure Gives Storage the Cold Shoulder!

In an exciting turn of events toward the “chilly” side, the Azure Storage team has made some big changes this week, allowing for a new way to store your data!

The newly released Azure Cool Storage lets you choose a different tier of storage where you can keep data in Azure that isn’t accessed frequently; the difference is that you can still access it using all of your familiar APIs and access points, without waiting for it to thaw as in AWS.  Keep in mind that if you store your data over in Glacier, you’d better have your mittens on, because it can take as long as 5 hours to get your data out of the freezer…

Here are the details from Microsoft’s own Sriprasad Bhat, Senior Program Manager – Azure Storage; I’ve also included some highlights below.

“Data in the cloud is growing at an exponential pace, and we have been working on ways to help you manage the cost of storing this data. An important aspect of managing storage costs is tiering your data based on attributes like frequency of access, retention period, etc. A common tier of customer data is cool data which is infrequently accessed but requires similar latency and performance to hot data.

Today, we are excited to announce the general availability of Cool Blob Storage – low cost storage for cool object data. Example use cases for cool storage include backups, media content, scientific data, compliance and archival data. In general, any data which lives for a longer period of time and is accessed less than once a month is a perfect candidate for cool storage.

With the new Blob storage accounts, you will be able to choose between Hot and Cool access tiers to store object data based on its access pattern. Capabilities of Blob storage accounts include:

  • Cost effective: You can now store your less frequently accessed data in the Cool access tier at a low storage cost (as low as $0.01 per GB in some regions), and your more frequently accessed data in the Hot access tier at a lower access cost. For more details on regional pricing, see Azure Storage Pricing.
  • Compatibility: We have designed Blob storage accounts to be 100% API compatible with our existing Blob storage offering which allows you to make use of the new storage accounts in existing applications seamlessly.
  • Performance: Data in both access tiers have a similar performance profile in terms of latency and throughput.
  • Availability: The Hot access tier guarantees high availability of 99.9% while the Cool access tier offers a slightly lower availability of 99%. With the RA-GRS redundancy option, we provide a higher read SLA of 99.99% for the Hot access tier and 99.9% for the Cool access tier.
  • Durability: Both access tiers provide the same high durability that you have come to expect from Azure Storage and the same data replication options that you use today.
  • Scalability and Security: Blob storage accounts provide the same scalability and security features as our existing offering.
  • Global reach: Blob storage accounts are available for use starting today in most Azure regions with additional regions coming soon. You can find the updated list of available regions on the Azure Services by Regions page.”

So, my goal in this post is to help you get started with your first run at using Azure Cool.

I put together a quick lab for you in which you will create a new Storage Account where you can embrace your inner hoarder: keep data for as little as $0.01 per GB and access it only every once in a while…

Let’s do this!
Building an Azure Cool Storage Account

1.  Point your web browser to http://portal.azure.com and log in using a Microsoft Account that already has an Azure Subscription.

2.  Click +New

step 2



3.  Type Storage Account in the Search Box.

step 3




4.  Next, select Storage Account.

step 4



5.  This will open the Storage Account Information Blade.  Click Create.

step 5
6.  The Create storage account blade will then open.  Complete the first half of the blade with these settings:

a:  Storage Account Name:  This must be unique, 3–24 characters, lowercase letters and numbers only.
b:  Deployment Model:  Resource Manager
c:  Account Kind:  Blob Storage
d:  Replication: Read-Access Geo-Redundant Storage

step 6

7.  Next, complete the Create storage account blade using these details:

e:  Access Tier:  Cool
f:  New Resource group name:  ArcticStorageRG
g:  Location:  East US 2

step 7
8.  The Portal will start the deployment, and a tile will be placed on the Dashboard while the Storage Account is deployed.

step 8

After the Cool Storage Account is deployed you can interact with it the same way you have with the Blob endpoint of any Azure Storage Account. 



9.  Notice that Performance/Access Tier is set to Standard/Cool, and note the endpoints for the two locations, East US 2 and Central US.

step 9

Well that’s it!  You did it!  The data in this account will be using the new Azure Cool Blob Storage tier!
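If you prefer PowerShell to portal clicks, the same account can be created with something like the sketch below. This assumes a current AzureRM module; depending on your module version the SKU parameter may be -Type rather than -SkuName, and the storage account name is an example you must replace with a unique one:

```powershell
# Create a Blob storage account in the Cool access tier, matching the portal steps above
New-AzureRmResourceGroup -Name "ArcticStorageRG" -Location "East US 2"
New-AzureRmStorageAccount -ResourceGroupName "ArcticStorageRG" `
    -Name "arcticcoolsa01" `
    -Location "East US 2" `
    -SkuName "Standard_RAGRS" `
    -Kind "BlobStorage" `
    -AccessTier "Cool"
```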




New Course! Azure Infrastructure as a Service with Resource Manager

Hot off the presses! We have a brand new course on Azure Infrastructure as a Service (IaaS) with Resource Manager delivered by our own Steve Ross.

This course is meant for new users of Azure IaaS to learn how to deploy VMs using the Azure Resource Manager deployment model. 

Course Description:
This course provides an in-depth examination of Microsoft Azure Infrastructure Services (IaaS); covering Virtual Machines and Virtual Networks starting from introductory concepts through advanced capabilities of the platform. The student will learn best practices for configuring virtual machines for performance, durability, and availability using features built into the platform. This course is designed using the Azure Resource Manager architecture and does not focus on the Azure Classic Architecture.

Course Modules:

  • Module 1: Azure Virtual Machines
  • Module 2: Virtual Machine Storage
  • Module 3: Virtual Machine Networking
  • Module 4: Implementing Hybrid Connectivity
  • Module 5: Managing Virtual Machines with Resource Manager

Upcoming Virtual Deliveries – Dynamics CRM Online and Azure!

We have several new open enrollment virtual deliveries coming up focused on Azure Developer and CRM Online (Developers, Administrators, and End Users).

If you are interested in a private delivery of any of these courses, or any other training on Microsoft Cloud Technologies, please contact us.

Course                                              | Audience                                       | Dates
Microsoft Azure for the Enterprise – Free Webinar   | Technical Decision Makers and IT Professionals | 4/28/2016 (1 hour)
Developing Enterprise Cloud Solutions with Azure    | .NET Developers                                | 6/13/2016 – 6/16/2016 (4 days)
Dynamics CRM Online Customization and Configuration | Dynamics CRM Online Administrators             | 6/13/2016 – 6/15/2016 (3 days)
Dynamics CRM Online Customization and Configuration | Dynamics CRM Online End Users                  | 6/27/2016 – 6/29/2016 (3 days)
Dynamics CRM Online for Developers                  | .NET and Dynamics CRM Online Developers        | 6/27/2016 – 6/29/2016 (3 days)

New Azure AD Course

We have just launched a new course focused on Azure Active Directory (Azure AD). This is a very hands-on course, with 7 hands-on labs that explore many of the key features of Azure AD and Azure AD Premium. Try it out and let us know what you think!

Course Agenda

Module 1: Introduction to Azure AD

  •  Lab 0: Create a Lab VM for the Course (optional) and download Student Files
  •  Lab 1: Creating an Azure AD Tenant and setting the Default Directory of a Subscription

Module 2: Azure AD integration with Active Directory

  • Lab 2: Integrating Azure AD with an On-Premises Active Directory via AD Connect

Module 3: Simplify User Access to Cloud Applications

  •  Lab 3: Enabling the use of SaaS applications through Password Single Sign-On

Module 4: Empower Users and Protect sensitive data and applications

  • Lab 6: Empowering Users through self-service Password Resets
  • Lab 7: Implement Multi-Factor Authentication to the My Applications Portal

Module 5: Introduction to Preview Features of Azure AD B2C and Domain Services

3 New Online Courses – OMS, EMS, and Azure Storage

We’ve got our subject matter experts hopping when it comes to delivering you training on the latest and greatest from Microsoft Cloud!


Check out our 3 newly published courses:

Course Title                                      | Content                                          | Author
Introduction to Log Analytics with OMS            | 1+ hours of video, 7.5 hours of lab exercises    | Samuel Erskine
Azure Storage for IT Professionals                | 6+ hours of video, almost 10.5 hours of labs     | Robin Shahan
Using Enterprise Mobility Suite in the Real World | 4+ hours of video, 5.5 hours of lab exercises    | Peter De Tender

As always, your comments and feedback are most appreciated!  Get in there and start learning!

Reset the Password on Azure VM – ARM

We’ve all been there.  You have an old VM that you haven’t booted up for a while, and you need to use it for something.  You open the Azure Portal, find it in a Resource Group and Click Start.

Without fail, your trusty VM starts, and then you click the Connect button. But when you try to log in: nothing.  You try over and over…NOTHING!


So, you wonder to yourself, “Can’t I just reset the password on my Azure VM?”  You look at your VM in the portal, and your dreams are coming true!  You see there is a button named “Reset Password”.

So you click the link, and your dreams are dashed…Seriously?!

At this point, it seems the only thing you can do is delete the VM and start over.

Don’t!  You can use some simple PowerShell to reset the Administrator account name and password.  Below are the instructions to make this happen.

Hope you enjoy this!
Renaming the Administrator Account and Resetting the Password on an Azure VM using PowerShell

1.  Open a PowerShell ISE window, and in the Console Pane log in to Azure Resource Manager. (Note:  This doesn’t work on Classic VMs.)


2.  Once logged into Azure, paste this code into the console pane (or download the script here):

$rgName = "RgNameHere"
$vmName = "VmNameHere"
$extName = "MyVMAccessExt"
$userName = "NewUserName"
$password = "!MyNewPasswordHere911"

Set-AzureRmVMAccessExtension -ResourceGroupName $rgName `
    -VMName $vmName `
    -Name $extName `
    -UserName $userName `
    -Password $password

3.  You will need to update the variables with the information for your VM.

  • $rgName – The Resource Group where the VM is located.
  • $vmName – The name of your VM.
  • $userName – The user name that the built-in Administrator account will become.
  • $password – The new password for the above user account.

4.  This is how the script will look in the Console Pane once you have updated all of the variables.


5.  Once you have updated the variables you can go ahead and run the script by pressing the Green Play button.  (Note:  you will be prompted to save the script)


6.  While the script is running your VM’s Status will show as “Updating”.


7.  After the script has completed this should be the return that you receive from Azure.



You can now go back to the portal – Click Connect and use the username and password that you just put into the script and log back into your VM.

Couple of important points:

A:  The script you just used has a password in it, and when you pressed the play button, it prompted you to save that file.  That means you just saved a file on your hard drive with a clear-text password.  Go change the variable to something other than your password and save the file again.

B:  This post only works on Azure Resource Manager VMs.

C:  I didn’t try this on a Domain Joined VM, so not sure how it will react.  I would assume it would change the local administrator password.
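On point A: one way to avoid the clear-text password entirely is to prompt for it at run time with Get-Credential. A sketch, using the same variable names as the script above:

```powershell
# Prompt for the new account name and password instead of hard-coding them
$cred = Get-Credential -Message "Enter the new admin user name and password"

Set-AzureRmVMAccessExtension -ResourceGroupName $rgName `
    -VMName $vmName `
    -Name $extName `
    -UserName $cred.UserName `
    -Password $cred.GetNetworkCredential().Password
```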

Take Care,