Category Archives: Microsoft Azure

Trying out Backup and Site Recovery (OMS): Lessons Learned

Summary of the items I learned:

  • Reminder: Azure changes often
  • Register Resource Providers
  • Chocolatey is a cool tool
  • ARMClient is a cool tool, too

For details keep reading below!

I wanted to take a little time and share some of the things I’ve learned as I have started to utilize the new items in Azure Resource Manager, specifically the Backup and Site Recovery (OMS) feature that is now generally available (GA). The first thing I have learned is that I need to be flexible when looking for an item in ARM, because of Rule #1: Things change rapidly in Azure.

Originally, I had been evaluating this item in ARM as Recovery Services, but it was recently renamed to Backup and Site Recovery (OMS). Okay, that’s maybe not a huge revelation to many of you, but it is a gentle reminder to all of us: Don’t get too used to things in Azure being static. It is a dynamic environment, to be sure!

The first thing I ran into was that, in my MSDN subscription, the Location dropdown was actually blank. What? Why can’t I deploy a Recovery Services vault in ARM with my MSDN subscription? I had another subscription that I use for some of my training courses, and with that one I saw Location populated with all of the regions in which I could create a Recovery Services vault.

After some investigation, trying other subscriptions, and reaching out to peers (Thank you, @mscloud_stever!), it was discovered that the resource provider for Recovery Services was not registered. Even though I had Site Recovery registered, that was not enough to display the Location regions for my Recovery Services vault. So, learn from my discovery: Make sure to register the Recovery Services provider namespace.

To check your Registered Providers, run the following command in PowerShell:

Get-AzureRmResourceProvider -ListAvailable | Where-Object -FilterScript { $PSItem.RegistrationState -eq 'Registered'; }

Here is the sample output of this command:

image_thumb.png

If you look closely, Microsoft.RecoveryServices is not in the list. If this is the case for you, run the following command to register this Provider for your subscription:

Register-AzureRmResourceProvider -ProviderNamespace Microsoft.RecoveryServices -Verbose

Once this completes, run the command to list out the registered providers again, and you should now see it registered like the screen shot below:

2016-05-12_10-56-38

Now that you see this, you may need to refresh the portal, or log out and log back in, for the region options to show up under Location. NOTE: Registration is usually recognized immediately, but in my case it took about an hour to show correctly, so you may need to wait a little while.
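Since registration can lag like this, a poll-until-registered loop beats refreshing the portal by hand. Here is a sketch of that pattern in Python; `get_registration_state` is a stand-in for however you query the state (for example, parsing `Get-AzureRmResourceProvider` output), not a real Azure SDK call:

```python
import time

def wait_for_registration(get_registration_state, timeout_s=3600, interval_s=30):
    """Poll a provider's registration state until it reports 'Registered'.

    get_registration_state: zero-argument callable returning the current
    state string (e.g. 'Registering' or 'Registered').
    Returns True once registered, False if the timeout elapses.
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if get_registration_state() == "Registered":
            return True
        time.sleep(interval_s)
    return False

# Demo with a stubbed state that flips to Registered on the third check:
states = iter(["Registering", "Registering", "Registered"])
print(wait_for_registration(lambda: next(states), timeout_s=10, interval_s=0))
```

In real use you would pass a sensible `interval_s` (say 30 seconds) so you are not hammering the API while waiting.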

After doing this, I could see the regions and deploy a Recovery Services vault without issue from within the ARM portal. Now on to some other stuff I learned.

After registering a Windows 10 client for Files and Folders backup, an Azure VM for backup, a System Center VMM server for protection, and a VMware vCenter server for protection, I performed backups, test failovers, and so on. All of this is great stuff in the ARM portal. Having seen the backups, replicated VMs, and individual test failovers, I started the process of removal and deletion to clean up my environment.

All was great until, in the process of deleting the resource group, the Site Recovery vault would not delete. Nothing I attempted allowed me to remove it. It kept telling me that there were registered servers in the vault that I needed to remove before attempting the deletion of the vault.

In the Azure portal, I could find no servers listed in any area of the vault. I tried many things, but nothing I did would show a server still registered in the vault so that I could delete it. I opened a support incident in Azure and was contacted by Microsoft. They gave me an ARMClient command to run to clean the vault. You might be asking yourself, as I did, “How is this done?” So, read on!

First, if you have not done so, install Chocolatey on your Windows machine. If you’re not familiar with Chocolatey, here is the location for installation instructions: https://chocolatey.org/ – From the website: “Chocolatey NuGet is a Machine Package Manager, somewhat like apt-get, but built with Windows in mind.”

Apt-get for Windows? Cool! So, I browsed to the homepage and found the PowerShell command to download and install via the easy install on the main page. Bingo! Once you install Chocolatey, check out the packages you can install! https://chocolatey.org/packages – You mean I can install Sysinternals tools with just a command line!? C:\> choco install sysinternals – Okay, I’m easily distracted. Back to the topic at hand.

Now that I’ve installed Chocolatey, which I hope you’ll agree is a cool tool, I need to install ARMClient. ARMClient is a simple command-line tool for invoking the Azure Resource Manager API. On the Chocolatey packages page, type ARMClient in the Search Packages box and click the magnifying glass to perform the search. See the screenshot below:

image.png

Once you find the results, click on ARMClient 1.1.1 to read about the package and the commands to both install and upgrade it. NOTE: The version number may change if the package is updated after this article is published.

Once I ran choco install armclient and ARMClient was installed, I typed ARMClient at the command prompt, which showed the initial help screen and some sample commands. See the screenshot below:

image_thumb.png

After I ran ARMClient login, I entered the credentials for my Azure subscription and successfully authenticated. ARMClient enumerated my tenants and the subscriptions associated with those tenants in the resulting output. Now comes the fun part. I had been provided a command to try to remove registered servers in the vault. Some of the information I had to gather and enter was specific to my situation, but here is the command that worked for me:

ARMClient.exe delete subscriptions/<subscriptionId>/resourceGroups/<resourceGroupName>/providers/Microsoft.RecoveryServices/Vaults/<vaultName>/backupContainers/<serverFQDNname>/UnRegisterContainer?api-version=2015-03-15 -verbose

The values in angle brackets are the edits I needed to make, substituting the information that corresponded to my specific subscription, resource group, vault, and server FQDN. After modifying the command with my info, I was able to run it, and after doing so I was able to delete the vault without issue. See example output (with some information redacted for privacy) in the screenshot below:

image.png
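It is easy to mangle that long URI by hand, so here is a small Python sketch that assembles it from the four values. The values in the demo are made-up placeholders, not real identifiers:

```python
def unregister_container_uri(subscription_id, resource_group, vault, server_fqdn,
                             api_version="2015-03-15"):
    """Build the ARMClient delete path for a stale backup container."""
    return (
        f"subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        f"/providers/Microsoft.RecoveryServices/Vaults/{vault}"
        f"/backupContainers/{server_fqdn}"
        f"/UnRegisterContainer?api-version={api_version}"
    )

# Hypothetical values for illustration only:
uri = unregister_container_uri("00000000-0000-0000-0000-000000000000",
                               "MyResourceGroup", "MyVault", "server01.contoso.com")
print(uri)
```

Paste the result after `ARMClient.exe delete ` (plus `-verbose` if you want the chatty output) once you have double-checked each segment against your own subscription.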

So, all in all, this was a great learning exercise for me. Not only did I learn a lot about Recovery Services, Backup, ARM, and resource providers, I learned a few cool new tools as well: Chocolatey and ARMClient. I hope you find this information useful and learn along with me!

@EdwardFBear

Automate finding a unique Azure storage account name (ARM)

When building ARM-based virtual machines in Azure with PowerShell, one important step involves creating a storage account. And because the storage account name must be unique within Azure, you first need to find an unused name. To accomplish this we are still using the Service Management cmdlet, Test-AzureName.

The last time I was working through this process I thought it would be helpful to let PowerShell find a unique name for me, retrying as needed until a usable name was determined. I looked around and to my surprise I could not find something that was already written to accomplish this. I’m sure it’s out there, but I didn’t find it…so I wrote the following. I hope it will be useful for others.

This code just concatenates a string of your choice with a random number between 0 and 9999. This ‘potential’ name is evaluated for uniqueness in Azure. If it is unique it is used. If not, a new random number is generated, concatenated on the end of the string, and another check is made. This continues until an acceptable name is found. Then the storage account is created.

Note, you will first need to authenticate via the Service Management framework so you can run the ‘classic’ command to check for a unique name.

# Login to SM framework to run Test-AzureName
Add-AzureAccount

Next, update the $saPrefix variable to reflect the string you want in front of the random number.

# Storage account names must be between 3 and 24 characters in length and may contain numbers and lowercase letters only.
$saPrefix = "opstrainingsa"
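As a sanity check on the prefix you pick, the naming rule quoted in the comment (3 to 24 characters, lowercase letters and numbers only) can be expressed as a one-line test. This sketch is in Python and is not part of the original script:

```python
import re

def is_valid_storage_account_name(name: str) -> bool:
    """Storage account names: 3-24 chars, lowercase letters and digits only."""
    return re.fullmatch(r"[a-z0-9]{3,24}", name) is not None

print(is_valid_storage_account_name("opstrainingsa1234"))  # True
print(is_valid_storage_account_name("OpsTrainingSA"))      # False: uppercase
```

Note that the prefix must leave room for up to four random digits, so keep it to 20 characters or fewer.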

Next, populate the name of the destination resource group (assumes it already exists) and the $location and $saType variables as desired.

$rgName = "MyResourceGroup"
$location = "South Central US"
$saType = "Standard_LRS"

Finally, login to the Resource Manager framework. As always, be sure you are focused on the correct subscription using Get-AzureRmSubscription to list available subscriptions and Set-AzureRmContext with the -SubscriptionId parameter to target the right subscription.

# Login to Resource Manager framework
Login-AzureRmAccount
# Load your subscriptions
Get-AzureRmSubscription
# Select the correct subscription to work with
Set-AzureRmContext -SubscriptionId <Your Subscription ID Here>

The rest of the code should run without alteration.

$randomnumber = Get-Random -Minimum 0 -Maximum 9999
$tempName = $saPrefix + $randomnumber

$var1 = Test-AzureName -Storage -Name $tempName
If ($var1 -ne $false) {
    Do {
        $randomnumber = Get-Random -Minimum 0 -Maximum 9999
        $tempName = $saPrefix + $randomnumber
        $var1 = Test-AzureName -Storage -Name $tempName
    }
    Until ($var1 -eq $false)
}
$saName = $tempName
New-AzureRmStorageAccount -ResourceGroupName $rgName -Name $saName -Type $saType -Location $location
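Stripped of the Azure-specific cmdlets, the retry logic above is just this loop. The sketch below is in Python, with an `is_taken` predicate standing in for `Test-AzureName`:

```python
import random

def find_unique_name(prefix, is_taken, rng=None):
    """Append a random 0-9999 suffix to prefix until is_taken reports it free."""
    rng = rng or random.Random()
    while True:
        candidate = f"{prefix}{rng.randint(0, 9999)}"
        if not is_taken(candidate):
            return candidate

# Demo with a stubbed 'taken' set standing in for the name-availability check:
taken = {"opstrainingsa1", "opstrainingsa2"}
name = find_unique_name("opstrainingsa", lambda n: n in taken)
print(name.startswith("opstrainingsa"))  # True
```

One caveat the PowerShell version shares: with only 10,000 possible suffixes, a very popular prefix could in principle loop for a while, so in production you might cap the attempts.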

Install the Azure CLI Tool on Ubuntu

Welcome back to the series I’ve been doing on OSS tools in Azure. The Azure CLI tool is extremely powerful and available on both the Windows and Linux platforms. In this blog post I’m going to help you install the Azure CLI tools on the Linux VM that we built in the last post.

These same steps will work on your local Ubuntu machines; I just happen to be installing it on a VM that is already running in Azure.

Let’s do this!

@deltadan


Task 1: Install the Azure CLI tools for Linux using the Advanced Packaging Tool (apt).

After logging into the Ubuntu VM, first do a quick update of the VM’s package lists. To update the machine, run the following command:

sudo apt-get update

1LinuxCLI

Next, you will install the nodejs-legacy package by running:

sudo apt-get install nodejs-legacy

2LinuxCLI

NOTE: You will be prompted to confirm the amount of disk space the installation will use. Simply press ‘Y’.

Then you will see an output of many screens that looks like this:

4LinuxCLI

Next, install the Node Package Manager (npm) with this command:

sudo apt-get install npm

You will see many screens of output such as this during this part of the installation.

NOTE: You will be prompted to confirm the amount of disk space the installation will use. Simply press ‘Y’.
6LinuxCLI

The final step is to use the Node Package Manager (npm) to install the Azure CLI tools:

sudo npm install -g azure-cli

You will see many screens of information as the npm is running the installation.

8LinuxCLI

After a successful installation, run the Azure CLI tool using the azure command:

azure

10LinuxCLI

Now that you have installed the tool let’s get logged into Azure!

Task 2:  Logging into Azure with the CLI Tool

At the Linux command Prompt type the following Command:

azure login

You will see a message asking you to open a web browser, navigate to an Azure web page to authenticate, and enter a code. In my case the code was AUFNCWJD8. When you arrive at the page, you will enter the code and then be asked to log in to your Azure subscription.

12linuxCLI

Once you have successfully logged into your Azure subscription, you will be directed to this web page.

13LinuxCLI

You can then close this page and move back over to your PuTTY session, still connected to your Linux VM.

You will now see that your login was successful and that your Azure subscription was added.

14LinuxCLI

You have now successfully logged into Azure via the CLI tool on your VM running in Azure!

Task 3:  Run some Azure Commands

Here are some commands that you can run to see items in your Azure subscription. Take note that the Azure CLI tool is in Service Management mode by default, but it can see Azure Resource Manager items after you change modes with azure config mode arm.

Get a help with the Azure CLI Tool:

azure --help

16LinuxCLI

Show your Azure account details:

azure account show

17LinuxCLI

List Virtual Machines using this command:

azure vm list

15LinuxCLI

List Virtual Networks using this command:

azure network vnet list

20LinuxCLI

Well I hope this helped you get to know the Azure CLI on Linux.

@deltadan

Azure Dev/Test Labs

Introduction

There is almost universal agreement that Development and Test environments, or “DevTest” as it’s known, should be moved to the Cloud.

DevTest is probably the only workload that doesn’t have corporate issues slowing down its migration. Aside from the security and backup of a company’s source code, there’s no reason for IT managers to balk, nor are there regulatory or compliance rules stopping the march of DevTest to the cloud.

That being said, teams still need to get up and running with a brand-new environment. They will have to build VMs, virtual networks, etc., in order to have a functional environment. This can stretch the abilities of the team, as it requires quite a bit of experience working with infrastructure.

To solve these problems and let developers and testers do what they do best, Microsoft has developed a turnkey solution for DevTest in the cloud: Azure DevTest Labs.

Azure DevTest Labs was designed with one goal in mind: to enable development teams to leverage the strength and flexibility of the cloud without the complication of having to build all of the required infrastructure from scratch.

In this post I’m going to help you get up and running with Azure DevTest Labs. Currently it is in preview, but you should be able to find information about it on its site here.

In my next post I will feature how to build a Linux VM with Docker in your new Lab. 

Enjoy!

@deltadan


Create a DevTest Lab in Azure

Much like anything in Azure, creating a DevTest Lab is fast and easy. Azure takes care of the hard parts, allowing development teams to get up and running almost immediately.

DevTest Labs can be found in the Azure Marketplace and requires only a few inputs prior to creation: the first is creating or choosing the Azure subscription that will be used to create the lab, and the second is the location.

 1dtlabs

Step by Step – Building an Azure DevTest Lab

To create the DevTest Lab, use a Microsoft or organizational account to log in to the Azure preview portal.

Open a browser and navigate to http://portal.azure.com

Enter the account associated with a Microsoft Azure subscription.

2dtlabs

If the account is associated with both an organizational account and a Microsoft account, there may be a prompt to choose which one to authenticate with for Microsoft Azure.

3dtlabs

After logging into the Azure portal, the next step is to create a new DevTest Lab in Azure.

Click the +NEW button in the portal.

4dtlabs

Click See all.

5dtlabs

Click Everything and then type DevTest Labs and press enter.

6dtlabs

 The Search will reveal the DevTest Labs link for you to click.

7dtlabs

 This will bring up the DevTest Labs Page.  Click Create.

8dtlabs

Then you will see the Create a DevTest Lab blade. Name the lab; in this example, the name ContosoDevTestLab was used.

9dtlabs

Next, choose the subscription you wish to use for this DevTest Lab. Please note that owners of the subscription will have full access to everything in the lab.

Choose a location for this DevTest Lab; in this case I have chosen East US.
10dtlabs

Click Auto Shutdown and ensure that the defaults are assigned: Enable set to On and Scheduled Shutdown set to 19:00.

11dtlabs

Then click Create to start the process of building the DevTest Lab.

12dtlabs

After completing these steps Azure will take care of the hard work of creating all of the resources required for the DevTest Labs.

Azure Dev/Test Labs has to be one of the coolest products I’ve seen from the Azure team to date!  It really is a fully functional environment right out of the box.

Make sure you look for the Next post where we get some Open Source Action happening in this lab!

Hope you enjoyed this post – @deltadan

Resources for passing Azure Exam 70-533 from Opsgility

If you are just now getting started with Azure Certification it is useful to have a roadmap on how to learn the technology in order to pass the certification.

Our team of industry-recognized Azure experts has built a lot of Azure training over the years, and in this case we’ve literally written the book on how to pass the exam.

In this post, I want to share some of our online courses that map directly to the objectives for the exam. Each of these courses will cover the objective topic (and then some), and also provides in-depth hands-on lab guides that you can practice in your own Azure subscription to get the hands-on practical experience to really learn the skills being taught.

Each of the following exam objectives maps to one of our online courses:

  • Implement Websites
  • Implement Virtual Machines
  • Implement Cloud Services
  • Implement Storage
  • Implement an Azure Active Directory
  • Implement Virtual Networks

If you haven’t tried us out yet, you can start learning Azure with a free 7-day trial here; you can cancel at any time.

New Features for Enhanced Learning

We have recently launched several new features to enhance your learning experience with Opsgility.

Course Assignments for Individuals

Course assignments allow you to set a goal date for completing a class. In addition to helping you track your learning progress, they are also a handy way to bookmark your top courses.

Course Assignments for Teams

In team mode, an administrator can assign specific courses to their team and track progress toward course completion. Reporting is built in to quickly give you an update on your team’s progress toward the goals you assign.

Certificates Quick View

For each course that you complete (by finishing the labs and knowledge measures), you will receive a certificate of completion. We have added a top-level link off the account menu to quickly view and download all of your certificates from a single location.

Course Player Enhancements

Lab Files

Lab files are now a top-level download instead of being associated with each individual lab.

Take Notes While Learning

You can now add notes while watching a module. Each note captures your location within the module and, when selected, takes you back to the spot where you took the note.

Simplified discussion

Previously, access to Opsgility expert trainers was through Yammer. To simplify interaction, we have added an inline Disqus feature to each course to answer your questions about the course or the labs.

As always, we love to hear feedback on new topics or site enhancements.

New Course: Azure Site Recovery

New Course!

Opsgility has just launched a new online course by Microsoft MVP Peter De Tender, Azure Site Recovery. This course is available online or can be taught onsite at your location.

Course Description

This course provides an introduction to one of the newest cloud services available within Microsoft Azure, Azure Site Recovery (ASR). Students are introduced to the core principles of disaster recovery first, followed by an introduction to Azure Virtual Machines and Virtual Networks. The course then digs deeper into Azure Site Recovery itself: how it works, how to configure it, how to streamline your disaster-recovery failover from on-premises to Azure, and so on. Learn the concepts of protection groups, how to perform planned and unplanned failovers, and more.

Learn Azure Site Recovery Today!

Windows Azure – Disk Cleanup with Virtual Machines

In the latest Windows Azure Portal and PowerShell updates Microsoft has added some great functionality to manage disk cleanup with virtual machines.

Prior to these updates, managing the cleanup of virtual machine disks was fairly painful. You either had to delete each disk one by one from the portal, or use PowerShell code with complex filtering and a polling mechanism to remove them.

Deleting an Individual Virtual Machine and Disks from the Portal

In the portal, when you select an individual virtual machine and click Delete at the bottom of the screen, you are given two new options.

  • Keep the attached disks (doesn’t delete any disks)
  • Delete the attached disks (deletes all attached disks, both OS and data)

delete vm windows azure portal

Deleting an Individual Virtual Machine and Disks from PowerShell

The equivalent functionality for the “Delete the attached disks” option from PowerShell is to append the -DeleteVHD parameter to a call to Remove-AzureVM.

  Remove-AzureVM -ServiceName $serviceName -Name $vmName -DeleteVHD

Deleting all Virtual Machines and Disks in a Cloud Service from the Portal

If you need to remove all of the virtual machines and underlying disks in a specific cloud service, you are covered too.
In the portal, simply click CLOUD SERVICES in the left menu and find the cloud service hosting your virtual machines.

In the portal, select a cloud service that contains virtual machines and click Delete at the bottom of the screen; you are given three options.

  • Delete the cloud service and its deployments (deletes cloud service, all of the virtual machines (in the cloud service) and all disks attached to the virtual machines)
  • Delete all virtual machines (deletes all of the virtual machines (in the cloud service) but retains the disks)
  • Delete all virtual machines and attached disks (deletes all of the virtual machines in the cloud service and all of the disks but does not delete the cloud service)

portal delete cloud service and disks

To accomplish each of the tasks from PowerShell is straightforward.

Delete the cloud service and its deployments – equivalent PowerShell Code

Remove-AzureService -ServiceName $serviceName -DeleteAll

Delete all virtual machines (but not the cloud service or disks)

Remove-AzureDeployment -ServiceName $serviceName -Slot Production

Delete all virtual machines and attached disks (but not the cloud service)

Remove-AzureDeployment -ServiceName $serviceName -Slot Production -DeleteVHD

PowerShell for the rest

Finally, if you need to clean up disks that are no longer attached to virtual machines, the PowerShell cmdlets come to the rescue.

Get-AzureDisk | where { $_.AttachedTo -eq $null } | select diskname, medialink

To delete an individual disk.

Remove-AzureDisk "disk name" -DeleteVHD

If you want to delete all of the disks that are not attached (be careful of this one – ensure you know what you are deleting before executing!).

Get-AzureDisk | where { $_.AttachedTo -eq $null } | Remove-AzureDisk -DeleteVHD
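The filter-then-delete pattern in those one-liners is worth internalizing: list first, inspect, and only then pipe to the destructive command. Here it is sketched generically in Python, with made-up disk records standing in for Get-AzureDisk output:

```python
# Each record mimics the two properties the pipeline uses: AttachedTo and DiskName.
disks = [
    {"DiskName": "vm1-osdisk", "AttachedTo": "vm1"},
    {"DiskName": "orphaned-datadisk", "AttachedTo": None},
    {"DiskName": "vm2-osdisk", "AttachedTo": "vm2"},
]

# Equivalent of: Get-AzureDisk | where { $_.AttachedTo -eq $null }
orphans = [d for d in disks if d["AttachedTo"] is None]
print([d["DiskName"] for d in orphans])  # ['orphaned-datadisk']
```

Reviewing the `orphans` list before deleting anything is the safety net the warning above is asking for.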

Summary

The ease of use of Windows Azure is getting better every day. What used to be a complex task (deleting disks after VM deletion) is now simplified without taking away the power available to the command-line user. The Windows Azure teams are doing an amazingly good job of tackling tasks that were once difficult and making them much more manageable.

Calling the Windows Azure Management API from PowerShell

Most of the time, using the Windows Azure PowerShell cmdlets will accomplish whatever task you need to automate. However, there are a few cases where calling the API directly is a necessity.

In this post I will walk through using the .NET HttpClient object to authenticate and call the Service Management API, along with a real-world example of creating a Dynamic Routing gateway (because it is not supported in the WA PowerShell cmdlets).

To authenticate to Windows Azure you need the Subscription ID and a management certificate. If you are using the Windows Azure PowerShell cmdlets you can use the built in subscription management cmdlets to pull this information.

$sub = Get-AzureSubscription "my subscription"
$certificate = $sub.Certificate
$subscriptionID = $sub.SubscriptionId

For API work my preference is to use the HttpClient class from .NET. So the next step is to create an instance of it and set it up to use the management certificate for authentication.

$handler = New-Object System.Net.Http.WebRequestHandler
 
# Add the management cert to the client certificates collection 
$handler.ClientCertificates.Add($certificate)  
$httpClient = New-Object System.Net.Http.HttpClient($handler)
 
# Set the service management API version 
$httpClient.DefaultRequestHeaders.Add("x-ms-version", "2013-08-01")
 
# WA API only uses XML 
$mediaType = New-Object System.Net.Http.Headers.MediaTypeWithQualityHeaderValue("application/xml")
$httpClient.DefaultRequestHeaders.Accept.Add($mediaType)

Now that the HttpClient object is setup to use the management certificate you need to generate a request.

The simplest request is a GET request because any parameters are just passed in the query string.

# The URI to the API you want to call 
# List Services API: http://msdn.microsoft.com/en-us/library/windowsazure/ee460781.aspx
$listServicesUri = "https://management.core.windows.net/$subscriptionID/services/hostedservices"
 
# Call the API 
$listServicesTask = $httpClient.GetAsync($listServicesUri)
$listServicesTask.Wait()
if ($listServicesTask.Result.IsSuccessStatusCode)
{
    # Get the results from the API as XML 
    [xml] $result = $listServicesTask.Result.Content.ReadAsStringAsync().Result
    foreach($svc in $result.HostedServices.HostedService)
    {
        Write-Host $svc.ServiceName " "  $svc.HostedServiceProperties.Location 
    }
}
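The XML handling in that loop can be seen in isolation with a canned response. The sample document below mirrors the general shape of the List Hosted Services response (element names per the MSDN page linked above; treat the exact shape as an assumption), sketched in Python with the standard library:

```python
import xml.etree.ElementTree as ET

# The Service Management API qualifies every element with this namespace.
NS = "{http://schemas.microsoft.com/windowsazure}"

sample = """<HostedServices xmlns="http://schemas.microsoft.com/windowsazure">
  <HostedService>
    <ServiceName>mysvc1</ServiceName>
    <HostedServiceProperties><Location>West US</Location></HostedServiceProperties>
  </HostedService>
</HostedServices>"""

root = ET.fromstring(sample)
for svc in root.findall(f"{NS}HostedService"):
    name = svc.findtext(f"{NS}ServiceName")
    location = svc.findtext(f"{NS}HostedServiceProperties/{NS}Location")
    print(name, location)  # mysvc1 West US
```

The namespace qualification is the part that trips people up; PowerShell's `[xml]` accelerator hides it, but most other XML libraries make you spell it out.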

However, if you need to do something more complex like creating a resource you can do that as well.

For example, the New-AzureVNETGateway cmdlet will create a new gateway for your virtual network, but it was written prior to the introduction of Dynamic Routing gateways (and has not been updated since…).
If you need to create a new virtual network with a dynamically routed gateway in an automated fashion, calling the API is your only option.

$vnetName = "YOURVNETNAME"
 
# Create Gateway URI
# http://msdn.microsoft.com/en-us/library/windowsazure/jj154119.aspx 
$createGatewayUri = "https://management.core.windows.net/$subscriptionID/services/networking/$vnetName/gateway"
 
# This is the POST payload that describes the gateway resource 
# Note the lowercase g in <gatewayType> - the documentation on MSDN is wrong here
$postBody = @"
<?xml version="1.0" encoding="utf-8"?>
<CreateGatewayParameters xmlns="http://schemas.microsoft.com/windowsazure">
  <gatewayType>DynamicRouting</gatewayType>
</CreateGatewayParameters>
"@
 
Write-Host "Creating Gateway for VNET" -ForegroundColor Green
$content = New-Object System.Net.Http.StringContent($postBody, [System.Text.Encoding]::UTF8, "text/xml")        
# Add the POST payload to the call    
$postGatewayTask = $httpClient.PostAsync($createGatewayUri, $content)
$postGatewayTask.Wait()
 
# Check status for success and do cool things
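The casing warning in the payload comment is easy to verify: XML element names are case-sensitive, so <gatewayType> and <GatewayType> are different elements. Here is a quick Python check using the same payload (namespace and element names are taken from the snippet above):

```python
import xml.etree.ElementTree as ET

post_body = """<?xml version="1.0" encoding="utf-8"?>
<CreateGatewayParameters xmlns="http://schemas.microsoft.com/windowsazure">
  <gatewayType>DynamicRouting</gatewayType>
</CreateGatewayParameters>"""

root = ET.fromstring(post_body)
ns = "{http://schemas.microsoft.com/windowsazure}"
# The lowercase element is found; its capitalized cousin is not.
print(root.findtext(f"{ns}gatewayType"))  # DynamicRouting
print(root.find(f"{ns}GatewayType"))      # None
```

Parsing your POST body locally like this before sending it is a cheap way to catch malformed XML or a miscased element before the API rejects it.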

So there you have it. When the WA PowerShell cmdlets are behind the times, you can quickly unblock yourself with some direct API intervention.