Azure Automation provides credential assets for securely passing credentials between the automation account and a Desired State Configuration (DSC). Credentials can be created directly through the portal or through PowerShell, and they are easily accessed from within the DSC configuration. However, there are a few disadvantages to storing credentials in an automation account rather than in a KeyVault:

  • More fine-grained permissions can be set on a KeyVault – for example, custom access policies can be set for different principals.
  • KeyVault secret values can be viewed in the portal (assuming you have access). Passwords in an automation account credential cannot.
  • In most scenarios, a KeyVault is the “single source of truth”, where all secure assets in a tenant are stored. If you need to access credentials in an ARM template and a DSC configuration, they must be in a KeyVault for use in the ARM template.

In this example, we will walk through a very simple workstation configuration that pulls a username and password from a KeyVault, then passes those parameters into a DSC resource to create a new local user on a target VM.

Prerequisites

  • A Key Vault already provisioned
  • An Automation Account already provisioned
  • The Az.Accounts and Az.KeyVault modules imported into the Automation Account
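
If the Az modules aren't already in the account, they can be imported through the portal's module gallery, or with a quick PowerShell sketch along these lines (the resource group and automation account names here are placeholders for your own values, and the PowerShell Gallery package URL is one common way to source the modules):

# Hypothetical names - substitute your own resource group and automation account.
$rg      = 'my-resource-group'
$account = 'my-automation-account'

# Import Az.Accounts first (Az.KeyVault depends on it), then Az.KeyVault,
# pulling both packages from the PowerShell Gallery.
'Az.Accounts', 'Az.KeyVault' | ForEach-Object {
    New-AzAutomationModule -ResourceGroupName $rg `
        -AutomationAccountName $account `
        -Name $_ `
        -ContentLinkUri "https://www.powershellgallery.com/api/v2/package/$_"
}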

Permissions

The Automation Connection, or more specifically, the service principal of the automation connection, needs to have at least “Get” selected under Secret permissions in the KeyVault access policies.
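
If you'd rather grant that permission from PowerShell than through the portal, a minimal sketch looks something like this (the vault name and application ID below are placeholders for your own values):

# Grant the automation connection's service principal "Get" on secrets.
Set-AzKeyVaultAccessPolicy -VaultName 'my-key-vault' `
    -ServicePrincipalName '00000000-0000-0000-0000-000000000000' `
    -PermissionsToSecrets Get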

Creating the DSC file

This is the minimum set of parameters needed to extract a username and password from a KeyVault:

param
(
    [Parameter(Mandatory)]
    [string] $keyVaultName,

    [Parameter(Mandatory)]
    [string] $usernameSecretName,

    [Parameter(Mandatory)]
    [string] $passwordSecretName,

    [Parameter(Mandatory)]
    [string] $automationConnectionName
)

The purpose of the first three parameters should be self-explanatory. The final parameter, automationConnectionName, is used to establish a connection to Azure. Even though this code executes in the context of an Automation Account, it is not connected to Azure in the same way as if we had connected using Login-AzAccount or Connect-AzAccount. There are special cmdlets available when running in an Automation Account that we can use to establish a “full” connection:

$automationConnection = Get-AutomationConnection -Name $automationConnectionName

Note that we are calling Get-AutomationConnection, NOT Get-AzAutomationConnection. The latter only works once you have already established a connection to Azure. Get-AutomationConnection is one of the special cmdlets that are only available when running inside an Automation Account; conversely, it will not work if the DSC is compiled outside the context of an Automation Account. For more information on connections in Azure Automation, refer to https://docs.microsoft.com/en-us/azure/automation/automation-connections

Get-AutomationConnection returns an object containing all the necessary properties for us to establish a “full” connection to Azure using the Connect-AzAccount cmdlet:

Connect-AzAccount -ServicePrincipal -Tenant $automationConnection.TenantID -ApplicationId $automationConnection.ApplicationID -CertificateThumbprint $automationConnection.CertificateThumbprint

Note that if you aren’t running this in the Azure public cloud (for example, Azure Government or Azure Germany), you’ll also need to add the -Environment switch to point to the correct cloud environment (such as -Environment AzureUSGovernment).

At this point, we can run the Az cmdlets to extract the secrets from the KeyVault:

$username = (Get-AzKeyVaultSecret -VaultName $keyVaultName -Name $usernameSecretName).SecretValueText

$password = (Get-AzKeyVaultSecret -VaultName $keyVaultName -Name $passwordSecretName).SecretValue
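
A quick caveat: the SecretValueText property was removed in later versions of the Az.KeyVault module (2.0.0 and up). If you're running a newer version, you would need to convert the SecureString yourself; a sketch along these lines should work:

# Az.KeyVault 2.x and later: SecretValueText no longer exists, so retrieve the
# SecureString and convert it (newer module versions also offer an -AsPlainText switch).
$usernameSecret = Get-AzKeyVaultSecret -VaultName $keyVaultName -Name $usernameSecretName
$username = [System.Net.NetworkCredential]::new('', $usernameSecret.SecretValue).Password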

Full Example Configuration

configuration Workstation
{
	param
	(
		[Parameter(Mandatory)]
		[string] $keyVaultName,

		[Parameter(Mandatory)]
		[string] $usernameSecretName,

		[Parameter(Mandatory)]
		[string] $passwordSecretName,

		[Parameter(Mandatory)]
		[string] $automationConnectionName
	)

	Import-DscResource -ModuleName PSDesiredStateConfiguration

	$automationConnection = Get-AutomationConnection -Name $automationConnectionName
	Connect-AzAccount -ServicePrincipal -Tenant $automationConnection.TenantID -ApplicationId $automationConnection.ApplicationID -CertificateThumbprint $automationConnection.CertificateThumbprint

	$username = (Get-AzKeyVaultSecret -VaultName $keyVaultName -Name $usernameSecretName).SecretValueText

	$password = (Get-AzKeyVaultSecret -VaultName $keyVaultName -Name $passwordSecretName).SecretValue

	$credentials = New-Object System.Management.Automation.PSCredential ($username, $password)

	Node SampleWorkstation
	{
		User NonAdminUser
		{
			UserName = $username
			Password = $credentials
		}
	}
}
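
Once the configuration has been imported into the Automation Account (for example with Import-AzAutomationDscConfiguration), you can compile it and pass in the required parameters with something along the lines of the sketch below. The resource group, account, vault, secret, and connection names are placeholders; also note that because the compiled MOF contains a credential, the node data typically needs PSDscAllowPlainTextPassword set to $true, and Azure Automation encrypts the MOF it stores:

# Hypothetical names - substitute your own values.
$params = @{
    keyVaultName             = 'my-key-vault'
    usernameSecretName       = 'workstation-username'
    passwordSecretName       = 'workstation-password'
    automationConnectionName = 'AzureRunAsConnection'
}

# Allow the credential to be embedded in the compiled MOF for this node.
$configData = @{
    AllNodes = @(
        @{ NodeName = 'SampleWorkstation'; PSDscAllowPlainTextPassword = $true }
    )
}

Start-AzAutomationDscCompilationJob -ResourceGroupName 'my-resource-group' `
    -AutomationAccountName 'my-automation-account' `
    -ConfigurationName 'Workstation' `
    -Parameters $params `
    -ConfigurationData $configData

Re-running this compilation job is also how you pick up new secret values, as noted in the final thoughts below.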

Final Thoughts

Remember the nature of DSC compilation – all variables are resolved at compile-time and stored in the resulting MOF file that is stored in the automation account. The compiled MOF is what is actually downloaded and executed on the target Node/VM. This means that if you change one of the secret values in the KeyVault, the MOF will still contain the old values until you recompile the DSC.
