Azure Automation provides credential assets for securely passing credentials between the Automation Account and a Desired State Configuration (DSC). Credentials can be created directly through the portal or through PowerShell and are easily accessed in the DSC configuration. However, there are a few disadvantages to storing credentials in an Automation Account versus storing them in a KeyVault:

  • More fine-grained permissions can be set on a KeyVault – for example, custom access policies can be set for different principals.
  • KeyVault secret values can be viewed in the portal (assuming you have access). Passwords in an automation account credential cannot.
  • In most scenarios, a KeyVault is the “single source of truth”, where all secure assets in a tenant are stored. If you need to access credentials in an ARM template and a DSC configuration, they must be in a KeyVault for use in the ARM template.

In this example, we will walk through a very simple workstation configuration that pulls a username and password from a KeyVault, then passes those parameters into a DSC resource to create a new local user on a target VM.

Prerequisites

  • A Key Vault already provisioned
  • An Automation Account already provisioned
  • The Az.Accounts and Az.KeyVault modules imported into the Automation Account

Permissions

The Automation Connection, or more specifically, the service principal of the automation connection, needs to have at least “Get” selected under Secret permissions in the KeyVault access policies.
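
If you prefer to grant that permission from PowerShell rather than the portal, something along these lines should work (a rough sketch; the resource group, account, and vault names are placeholders, and it assumes the standard Run As connection):

$runAsConnection = Get-AzAutomationConnection -ResourceGroupName "MyResourceGroup" -AutomationAccountName "MyAutomationAccount" -Name "AzureRunAsConnection"

# Grant the Run As service principal "Get" on secrets in the vault
Set-AzKeyVaultAccessPolicy -VaultName "MyKeyVault" -ServicePrincipalName $runAsConnection.FieldDefinitionValues.ApplicationId -PermissionsToSecrets Get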

Creating the DSC file

This is the minimum set of parameters necessary to extract a username and password from a KeyVault:

param
(
    [Parameter(Mandatory)]
    [string] $keyVaultName,

    [Parameter(Mandatory)]
    [string] $usernameSecretName,

    [Parameter(Mandatory)]
    [string] $passwordSecretName,

    [Parameter(Mandatory)]
    [string] $automationConnectionName
)

The purpose of the first three parameters should be self-explanatory. The final parameter, automationConnectionName, is used to establish a connection to Azure. Even though this code is executing in the context of an Automation Account, it is not connected to Azure in the same way as if we had connected using Login-AzAccount or Connect-AzAccount. There are special cmdlets available when running in an Automation Account that we can use to establish a “full” connection:

$automationConnection = Get-AutomationConnection -Name $automationConnectionName

Note that we are calling Get-AutomationConnection, NOT Get-AzAutomationConnection. The latter command only works when you have already established a connection to Azure. Get-AutomationConnection is one of those special cmdlets available when running in an Automation Account. Conversely, Get-AutomationConnection will not work if the DSC is executing outside the context of an Automation Account. For more information on connections in Azure Automation, refer to https://docs.microsoft.com/en-us/azure/automation/automation-connections

Get-AutomationConnection returns an object containing all the necessary properties for us to establish a “full” connection to Azure using the Connect-AzAccount cmdlet:

Connect-AzAccount -Tenant $automationConnection.TenantID -ApplicationId $automationConnection.ApplicationID -CertificateThumbprint $automationConnection.CertificateThumbprint 

Note that if you aren’t running this in the Azure public cloud (for example, Azure Government or Azure Germany), you’ll also need to add an environment switch to point to the correct cloud environment (such as -Environment AzureUSGovernment).
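
For example, in Azure Government the connection line might look like this (a sketch only; substitute the environment name for your cloud):

Connect-AzAccount -Environment AzureUSGovernment -Tenant $automationConnection.TenantID -ApplicationId $automationConnection.ApplicationID -CertificateThumbprint $automationConnection.CertificateThumbprint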

At this point, we can run the Az cmdlets to extract the secrets from the KeyVault:

$username = (Get-AzKeyVaultSecret -VaultName $keyVaultName -Name $usernameSecretName).SecretValueText

$password = (Get-AzKeyVaultSecret -VaultName $keyVaultName -Name $passwordSecretName).SecretValue
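
One caveat: the SecretValueText property was removed in newer versions of the Az.KeyVault module. If the module version imported into your Automation Account no longer exposes it, retrieving the username as plain text should look something like this instead (the password can stay a SecureString):

# Newer Az.KeyVault versions: request the secret as plain text rather than reading SecretValueText
$username = Get-AzKeyVaultSecret -VaultName $keyVaultName -Name $usernameSecretName -AsPlainText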

Full Example Configuration

configuration Workstation
{
	param
	(
		[Parameter(Mandatory)]
		[string] $keyVaultName,

		[Parameter(Mandatory)]
		[string] $usernameSecretName,

		[Parameter(Mandatory)]
		[string] $passwordSecretName,

		[Parameter(Mandatory)]
		[string] $automationConnectionName
	)

	Import-DscResource -ModuleName PSDesiredStateConfiguration

	$automationConnection = Get-AutomationConnection -Name $automationConnectionName
	Connect-AzAccount -Tenant $automationConnection.TenantID -ApplicationId $automationConnection.ApplicationID -CertificateThumbprint $automationConnection.CertificateThumbprint

	$username = (Get-AzKeyVaultSecret -VaultName $keyVaultName -Name $usernameSecretName).SecretValueText

	$password = (Get-AzKeyVaultSecret -VaultName $keyVaultName -Name $passwordSecretName).SecretValue

	$credentials = New-Object System.Management.Automation.PSCredential ($username, $password)

	Node SampleWorkstation
	{
		User NonAdminUser
		{
			UserName = $username
			Password = $credentials
		}
	}
}
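
To turn this configuration into a MOF, import it into the Automation Account and start a compilation job, passing our four parameters in a hashtable. A minimal sketch (the resource names, file path, and secret names are placeholders for your own):

Import-AzAutomationDscConfiguration -ResourceGroupName "MyResourceGroup" -AutomationAccountName "MyAutomationAccount" -SourcePath ".\Workstation.ps1" -Published

# Parameter values flow into the configuration at compile time
$parameters = @{
    keyVaultName             = "MyKeyVault"
    usernameSecretName       = "workstation-username"
    passwordSecretName       = "workstation-password"
    automationConnectionName = "AzureRunAsConnection"
}

Start-AzAutomationDscCompilationJob -ResourceGroupName "MyResourceGroup" -AutomationAccountName "MyAutomationAccount" -ConfigurationName "Workstation" -Parameters $parameters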

Final Thoughts

Remember the nature of DSC compilation – all variables are resolved at compile-time and stored in the resulting MOF file that is stored in the automation account. The compiled MOF is what is actually downloaded and executed on the target Node/VM. This means that if you change one of the secret values in the KeyVault, the MOF will still contain the old values until you recompile the DSC.

Whether you are just starting your journey to Office 365 or expanding your usage of the platform, it’s important to stop and define what you hope to accomplish in your project. User research is the most productive activity your team can do to define and shape your project, yet many teams underestimate its value and skip investing time and money in it because they believe they already understand what needs to be built.

The Need to Define the Problem

A common misunderstanding with user research is that it’s intended to help create the solution. While it’s true that user research assists in this, the main purpose of user research is to define the problem you are trying to solve.

Often, in an attempt to save money, companies will reduce user research or jettison it altogether. Yet user research makes it far more likely that your implementation will succeed and be well received and adopted, because end users feel they had a voice in the project and that their unique challenges were considered. And the good news is that it’s not all or nothing: there are ways to do user research that will significantly help your project without breaking the budget.

An important distinction needs to be made: user research is not about asking people what their preferences are. While preferences can yield insights, collecting them is not the goal of user research. Erika Hall, in her book Just Enough Research, says:

“As you start interviewing people involved in business and design decisions, you might hear them refer to what they do or don’t like. ‘Like’ is not a part of the critical thinker’s vocabulary. On some level, we all want the things we do to be liked (particularly on Facebook), so it’s easy to treat likability as a leading success indicator. But the concept of ‘liking’ is as subjective as it is empty. It is a superficial and self-reported mental state unmoored from any particular behavior. This means you can’t get any useful insights from any given individual reporting that they like or hate a particular thing. I like horses, but I’m not going to buy any online.” (pg. 13)

What Can I Expect When Doing User Research?

Many companies that do not have in-house user research experience are unaware of the key steps and activities involved. Project goals and requirements vary, requiring slightly different approaches, but the core concepts are often the same.

The first thing that usually occurs is soliciting input from the project team or stakeholders before engaging end-users. These inputs can come in the form of workshops or interviews, but it is important at this stage to understand how the stakeholders involved in commissioning and running the project view the organization’s needs.

After gathering initial input, end-users need to be identified and interviewed to understand the many aspects of how they currently work, what their needs are, and how the various tools and processes they currently use do and do not satisfy their needs.

Below are some sample questions asked during a user interview for end-users regarding their existing intranet:

  • Is there content on the intranet you looked for and were unable to find?
  • What do you do when you cannot find the information you are looking for? Has this happened with the current intranet?
  • What are other tools and applications you need to do your work?
  • What are the most important things that the organization needs from you and you need from the organization?

The answers to these questions and the insights gleaned can be distilled to define the core issues that a new modern workplace solution needs to solve. From here, the team can work together on what specific solutions will address the issues, goals, and needs of the end-users.

AIS did this recently for the ACA Compliance Group in a project to help them roll out Microsoft Teams and Planner. Through systematic user research, the AIS team was able to identify opportunities to leverage these tools to address ACA’s collaboration and content management needs. Read more about our work with ACA Compliance Group.

Other Benefits of User Research

While the primary benefit of user research is to define the problem and help your team ultimately marry that to the correct technological solution, there are many other benefits of doing user research. Here are a few.

  1. It generates interest inside the organization. When doing research, many people get a chance to be heard, and often those are the very individuals who become some of the biggest supporters as the project moves along.
  2. It helps with change management and ultimately increases adoption of the final solution. Change is hard and bringing users into that process greatly increases the odds that the modern workplace solution they receive will aid them in their work. Nothing will slow down the adoption of a new solution faster than those who receive the solution feeling like their challenges were not taken into consideration.
  3. It helps your organization communicate the value of the new implementation in a way that appeals to people across the organization. It is always more impactful to frame your new investment in terms that will appeal to users.

Start Now and Continue to Iterate

If you take away one thing from this piece, I hope you realize the value of user research and how it can bring unique insights to your project that are otherwise left untapped. User research is one of those activities that truly never finishes because an organization and its people are constantly changing, but the more research is used, the better the end result.

Nielsen-Norman Group, a well-known user experience firm, publishes its list of the best intranets every year, and it is no accident that time after time user research is a core component of these successful projects. In this year’s report, it specifically mentions the value of bringing in outside firms for expertise and perspective. AIS has years of experience helping organizations do great user research. If you are planning your next Office 365 project, please reach out to AIS for a Modern Workplace Assessment and begin your journey to building a successful modern workplace solution!

Implementing a cloud strategy and modernizing legacy systems are two of the core objectives of most IT organizations. Purchasing a COTS product or SaaS offering can speed up modernization and comes with a lot of advantages. COTS products have a proven track record and address specific business needs that would be difficult to justify building yourself, and a COTS product shifts the responsibility for creating features from your organization to the vendor. Finally, COTS products promise a shorter timeframe to implementation. Even so, while you’re purchasing a solution to 80% of your problem, the hardest parts of implementing a COTS product are still your responsibility. Below are some key areas you will still own.

Security and Infrastructure

Security and infrastructure are your responsibility; an off-the-shelf or Software as a Service (SaaS) product won’t address them for you. If it is a SaaS product, how will your hybrid network access it, and how is that access governed? You’ll need to do a risk assessment of the SaaS product that covers how you connect to it, how it stores its data, and even how the vendor builds the software. If it is an off-the-shelf product, how will it be installed in the cloud? Ask whether it is cloud-native or whether it needs to run on virtual machines in the cloud. If it runs on virtual machines, are those hardened, and who has access to them? Cloud virtual machines also complicate networking, since they need to be reached from on-prem and may still need to reach out into the cloud or the internet. That can leave a wide attack surface you’ll need to account for. Security and infrastructure are the biggest concerns, and you’ll need to own them.

Automation

One of the promises of moving to the cloud is gaining business agility, and automation is a key component of reaching that goal. A SaaS product removes the burden of deploying and maintaining the application, but there may still be aspects you need to automate. For example, if the SaaS product must be configured, it might expose both a UI and an API for doing so. It’s in your best interest to bring the SaaS product into your normal development pipeline and to consider infrastructure as code as it applies to the SaaS product. If you purchase a COTS product, be sure you can stand up an entire environment, including installation of the product, with the click of a button. There is no excuse for not automating everything, and there are plenty of tools in Azure DevOps pipelines for integration and automation.
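
As a trivial illustration, a pipeline step that pushes source-controlled configuration to a hypothetical SaaS configuration endpoint might look like this (the URL, payload file, and token variable are made up for the example):

# Apply versioned configuration to the SaaS product as part of the release pipeline
$config = Get-Content -Raw -Path ".\saas-config.json"
Invoke-RestMethod -Method Put -Uri "https://saas.example.com/api/v1/configuration" -ContentType "application/json" -Headers @{ Authorization = "Bearer $env:SAAS_API_TOKEN" } -Body $config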

Integration

A COTS product provides 80% of the functionality you need, but what about the other 20%? The remaining functionality the product doesn’t provide is likely what differentiates your company from the others. There are always edge cases and custom features you’ll want that the vendor either does not provide or would charge heavily to implement. You’ll need to bring that work in-house or hire a systems integrator to hook things together. Whether it’s a SaaS product or a COTS product, integrating it with the applications that provide the rest of your functionality is your responsibility, as is understanding how integration will work before you purchase. A good COTS product will have a solid HTTP REST API. If the product you’re purchasing doesn’t have a simple API, consider a wrapper to make the integration easier; API Management is an Azure service that can do that translation for you, and you might find that you want to reshape the COTS API into one that makes sense to the rest of your systems. Large COTS products should also support messaging of some kind, which helps build loosely coupled components with high cohesion. COTS products might also offer file-based and database-level integration, but these should be avoided. Integration of the COTS product is your responsibility, and the effort can equal the implementation effort of the COTS product itself.
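
For instance, putting API Management in front of a COTS API can be as simple as importing the vendor’s OpenAPI definition and publishing it under a path of your choosing. A rough sketch (the resource group, service name, vendor URL, and path are placeholders):

# Import the COTS product's OpenAPI definition into an existing API Management instance
$apimContext = New-AzApiManagementContext -ResourceGroupName "MyResourceGroup" -ServiceName "my-apim-instance"
Import-AzApiManagementApi -Context $apimContext -SpecificationFormat OpenApi -SpecificationUrl "https://cots.example.com/swagger/v1/swagger.json" -Path "cots" -ApiId "cots-api"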

Conclusion

COTS products can provide great benefits to your company and deliver new functionality quickly. Understand, though, that your IT department will still have to drive the overall architecture, and you are responsible for everything around the COTS product. The bulk of this work will fall under security, infrastructure, automation, and integration. Focus on these concerns and you’ll have a successful implementation.

An issue often faced when writing a client for an API is how best to interact with it programmatically. You could write your own HttpClient calls, use a third-party library like RESTSharp, or hope that someone has already produced an SDK for your target language or framework, in this case .NET. Refit steps in to solve the problem of SDK development. Inspired by Square’s Retrofit library, Refit turns a REST API into a live interface.

In this blog post, I will walk you through creating both a basic SDK and client for a Todo Item API. This example assumes that a company has created the Todo API and SDK, while the client would be built by an external user.

Note: All example code for this post can be found at https://github.com/seanmcgettrick/RefitSdkDemo.

API

Since the purpose of this post is primarily SDK development rather than the API itself, I will only briefly go over the API being used in this example:

API use example

RefitSdkDemo.ApiContracts

This project contains versioned request and response classes as well as a class to hold all our API routes. In a real-world example, this project would exist as a NuGet package. It may seem overkill to do this for a project with one controller and a handful of endpoints, but when it comes time to create the SDK and client, you will see how having the requests and responses in their own project simplifies the process.

RefitSdkDemo.Api

This project is a .NET Core 3.1 Web API with a SQL Server (localdb) database.

SDK

To begin, we will create an empty class library project to house our SDK. Next, remove the default Class1.cs and add a reference to the ApiContracts project, as well as the Refit NuGet package.

Add APIcontracts

Refit NuGet package

Each controller in your API should map to an interface in your SDK. Our API only has one controller, TodosController, with the following endpoints:

TodosController

Let’s create an interface for this controller. First, add a new interface named ITodoApi in the SDK project. The interface will look very similar to our controller above. For example, to create our GetAll SDK call, add the following to ITodoApi:

Todos API

The Get attribute comes from the Refit package and functions in the same manner as the HttpGet attribute in our controller. Please note, however, that for Refit, we have to supply the route as a literal string. Trying to reference the ApiRoutes class will result in an error at runtime when building the client.

GetAllAsync returns a list of TodoResponse objects wrapped in Refit’s ApiResponse class. The ApiResponse class contains the following properties:

API response class

When writing a client using our new SDK, the TodoResponse object will be stored in the Content property, which we will look at in the next section.

To complete our interface, we add the remaining endpoints:

Add remaining end points

We now have developed a basic SDK for our Todo API. All that’s left is to create our client.

SDK Client

Create a .NET Console application and add a project reference to our new SDK:

.NET console application

In Program.cs, update the signature of Main to async since we will be making asynchronous calls to our API. The next step is to create our API client class:

Update signature

Note: You will need to find the port Visual Studio has assigned to your API when specifying the hostUrl. In this case, it is 44311.

Now that the client has been created, we can walk through creating, retrieving, updating, and deleting a todo item. You will notice that our requests and responses all follow a similar pattern.

Create a Todo

Create a Todo

Update a Todo

Update a Todo

Delete a Todo

Delete a Todo

Retrieving all Todos

Retrieving all Todos

Testing the SDK Client

To test the client, first start your Web API project and then run the console application. You can set breakpoints at each step if you like and monitor the database to see the todo item being created, updated, and deleted. Otherwise, you will see output like this:

Testing the SDK Client

As an example, I have also included a project named RefitSdkDemo.PlainClient, which demonstrates how a consumer of the API would have to structure a create-todo-item request without the benefit of our new SDK:

RefitSdkDemo.PlainClient

And since we do not have access to the SDK, the CreateTodoRequest and CreateTodoResponse classes would have to be built by the downstream consumer, and would possibly need to be updated for newer API versions.

Summary

As you can see, Refit makes it quite fast and easy to build a strongly-typed SDK for your Web API. And since Refit targets .NET Standard 2.0, you can use it almost anywhere. It is also worth mentioning that while we did not dive into it in this blog post, Refit does support adding custom headers to requests, including OAuth bearer tokens.

While personnel management is a sub-category of Human Resources (HR) that focuses only on administration, its tasks and responsibilities can stretch beyond the duties of an HR manager. Personnel managers hold an important role by focusing on hiring and developing employees to become more valuable to the company.

A few of these areas of interest include: 

  • Job analyses
  • Strategic personnel planning
  • Performance appraisals
  • Benefit coordination
  • Recruitment
  • Screening
  • New employee orientation
  • Training
  • Wages
  • Dispute resolution
  • Other record-keeping duties

PowerApps and Personnel Management 

Now I bet you’re wondering how this ties in with PowerApps. With the various areas a personnel manager can be involved in, doesn’t it make sense to have one application where everything lives? That way, a busy personnel manager can easily navigate and carry out day-to-day duties, get the job done more efficiently, and have the results readily available for other team members to view and analyze.

How bizarre would it be if I told you we could build this application with little to no code and have it ready to use in less than half the time it would take a developer to code it from scratch? Not bizarre, and very doable. With PowerApps, one can quickly build custom business applications that connect to business data stored either in the underlying data platform, Common Data Service for Apps, or in various online and on-premises data sources like Azure, SharePoint, Excel, Office 365, Dynamics, SQL Server, and so on.

Why PowerApps?

Apps built using PowerApps transform your manual business processes into digital, automated processes. Even more good news: these apps have a responsive design and can run in any browser or on your mobile device. PowerApps can reduce the need to hire expensive custom developers, giving you the power and tools necessary to move your business forward.

Let’s Take a Closer Look

If a personnel manager is doing the following, this is how PowerApps can be integrated:

Personnel Management Duty: Posting job ads, reviewing resumes, conducting interviews and making a final decision with management.

PowerApps Solution: This can be done through a Business Process Flow. As you can see in the example below, you will be able to ensure that users enter data consistently and are taken through the same steps every time they work through this type of process.

Stages in Business Process Flow

Personnel Management Duty: Analyze salary data and reports to determine competitive compensation rates.

PowerApps Solution: Power BI, a modern data visualization tool that can spot trends in real time and support better, more informed decisions based on your specified dataset. The example below depicts the various ways to display data using custom visualizations. Imagine the possibilities!

Sales Dashboard

Personnel Management Duty: Develop and maintain a human resources system that meets the company’s information needs.

PowerApps Solution: Using Dynamics 365, an app within PowerApps. Through the unified interface, your organization will have an application that is easy to use with the flexibility to grow.


Personnel Management Duty: Continually monitor changing laws, legislation movements, arbitration decisions and collective bargaining contracts.

PowerApps Solution: The dashboards that Dynamics 365 offers make it easy to check for recent changes within your system.

Sales Activity Dashboard

Personnel Management Duty: Continually deliver presentations to management and executives regarding current and future human resources policies and practices.

PowerApps Solution: Use the PowerApps Unified Interface to present detailed reports, dashboards, and forms. You’ll be able to demonstrate the versatility of the application on various devices.

 Customizing Applications

PowerApps not only gives you the capability to drive your business growth, but it also eases your mind with the ability to change, update, delete, and customize your application as you see fit. Personnel management is not a simple feat, but using PowerApps can make your mission more manageable while keeping everything in one place.

When agencies decide to move their applications to the commercial cloud, the Defense Information Systems Agency (DISA) mandates specific approval and certification to connect to the Cloud Access Point (CAP). In this blog post, I will describe the process for establishing connectivity from a mission application in an Azure Government IL 5 environment to the DoD DISA CAP. This post is based on the DISA DoD Cloud Connection Process Guide, which covers connecting to the DoD DISA CAP for both cloud providers (AWS, Azure, and others) and cloud-consuming applications (“Mission Applications”). Unfortunately, I found the document to be outdated and hard to follow.

Here are some assumptions about work you may have already done, or are doing, before requesting the CAP connection:

  • Designed the network architecture
  • Identified the components used in the mission application (VM, OS, application, database, data at rest/in transit, vulnerability compliance report)
  • Performed assessment and authorization activities such as:
    • Started applying STIGs to all servers (this process may take time, and you may continue to implement STIGs beyond the CAP connection)
    • Deployed the agent for Host-Based Security Service (HBSS)
    • Applied patches to all VMs
    • Obtained approval for an Interim Authority to Test (IATT) or Authority to Operate (ATO)

Registration Process 

Here is the starting point for the DISA CAP connection journey:  

  1. DISA Systems/Network Approval Process (SNAP) Registration – The SNAP database stores the required documents and provides workflow status. To obtain a SNAP account, go to https://snap.dod.mil for registration (CAC required). The registration will ask for materials such as a DoD 2875 System Authorization Access Request (SAAR) form with justification for access. The diagram below shows the detailed steps for getting a SNAP account.
  2. SNAP Project Registration
    • Register C-ITPS Project
    • Submit Consent to Monitor Agreement
    • Submit a Business Case Analysis
    • Submit an Initial Contact form
  3. Develop SNAP artifact package
    • Input the Ports, Protocols, and Services Management (PPSM) Registration Number – Obtained from the DoD PPSM Registry
    • DoD NIC IP Range – Request the IP address space from DISA Network Information Center (NIC)
    • Interim Authorization to Test (IATT) or an Authorization to Operate (ATO) Memo – Received a formal statement from an Authorizing Official (AO)
    • Authorizing Official (AO) Memo – AO Appointment Letter
    • Cybersecurity Service Provider (CSSP) Agreement
    • Network Diagram of the application
    • Plan of Actions & Milestones (POA&M) report
    • Security Assessment Report (SAR)
    • System Security Plan (SSP)
  4. Obtain Connection Approval
    • Submit the Package within SNAP
    • Receive Cloud Permission to Connect (CPTC) within five business days
    • Acknowledgment of Registration – DISA Cloud Approval Office (CAO) issues a CPTC Memo (or returns package for rework/resubmission)
  5. Complete Secure Cloud Computing Architecture (SCCA) Onboarding
    • Complete SCCA Onboarding Form
    • Complete DISA Firewall Request Sheet
    • Complete Express Route Steps
  6. Technical exchange meeting with DISA Engineering Team
    • Provide the service key and peering location
    • Receive the shared key from DISA

Connect to Authorized CSP-CSO (Azure)  

The steps for connecting via the DISA enterprise BCAP depend on the impact level. The following steps apply to both Level 4 and Level 5.

  1. Obtain IP Addresses – Request the DoD IP address range for the mission application from the DoD NIC
  2. Obtain DNS name with A record (forward lookup) and PTR (reverse DNS) for:  
    • Application
    • Federation server (ADFS)
    • Mail server 
  3. Obtain certificates for the DNS names and a server certificate for each VM
  4. Configure the app to use the Enterprise Email Security Gateway (EEMSG)

Create and Modify Azure ExpressRoute Circuit 

  1. Create ExpressRoute following these steps:
    Create ExpressRoute Circuit
  2. Send the service key and peering location to the DISA BCAP team
    Sending Service Key
  3. Periodically check the status and state of the circuit. When the circuit status changes to Enabled and the provider status changes to Provisioned, the connection is established (see the PowerShell sketch after this list).
  4. Provision the Virtual Network
  5. Change the DNS Servers to Custom and provide the following IP addresses so that traffic can pass through the CAP connection (214.16.26.1, 214.71.0.1, 214.27.166.1)
  6. Link the Virtual Network to the ExpressRoute Circuit
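
If you prefer to script these steps rather than use the portal, the corresponding Az PowerShell calls look roughly like the sketch below (the names, location, provider, peering location, and bandwidth are placeholders; use the values agreed upon with DISA, and the step numbers refer to the list above):

# Step 1 - create the ExpressRoute circuit and capture the service key to send to the DISA BCAP team
$circuit = New-AzExpressRouteCircuit -Name "disa-cap-circuit" -ResourceGroupName "MyResourceGroup" -Location "usgovvirginia" -SkuTier Premium -SkuFamily MeteredData -ServiceProviderName "Equinix" -PeeringLocation "Washington DC" -BandwidthInMbps 200
$circuit.ServiceKey

# Step 3 - check the circuit until CircuitProvisioningState is Enabled and ServiceProviderProvisioningState is Provisioned
Get-AzExpressRouteCircuit -Name "disa-cap-circuit" -ResourceGroupName "MyResourceGroup" | Select-Object CircuitProvisioningState, ServiceProviderProvisioningState

# Step 5 - point the virtual network at the DISA-provided DNS servers
$vnet = Get-AzVirtualNetwork -Name "mission-vnet" -ResourceGroupName "MyResourceGroup"
$vnet.DhcpOptions.DnsServers = [System.Collections.Generic.List[string]]@("214.16.26.1", "214.71.0.1", "214.27.166.1")
$vnet | Set-AzVirtualNetwork

# Step 6 - link the virtual network's ExpressRoute gateway to the circuit
$gateway = Get-AzVirtualNetworkGateway -Name "mission-er-gateway" -ResourceGroupName "MyResourceGroup"
New-AzVirtualNetworkGatewayConnection -Name "disa-cap-connection" -ResourceGroupName "MyResourceGroup" -Location "usgovvirginia" -VirtualNetworkGateway1 $gateway -PeerId $circuit.Id -ConnectionType ExpressRoute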

AIS has a team that specializes in navigating the process of getting an ATO and connecting to the DISA CAP. Contact us today and let’s start the conversation about getting you connected to the DISA CAP.

It’s been a transformational year at AIS. We worked on some incredible projects with great clients, partners, and co-workers. We learned a lot! And we enjoyed telling you all about it here on the AIS Blog.

As we close out the year, here are the top 10 most read and shared posts of 2019*:

  1. Federated Authentication with a SAML Identity Provider, by Selvi Kalaiselvi
  2. Newly Released JSON Function for Power Apps, by Yash Agarwal
  3. So, You Want to Rehost On-Premises Applications in the Cloud? Here’s How., by Nasir Mirza
  4. Highly-Available Oracle DB on Azure IaaS, by Ben Brouse
  5. The New Windows Terminal – Install, Interact, and Customize, by Clint Richardson
  6. Cloud Transformation Can Wait… Get Me to the Cloud Now!, by Vishwas Lele
  7. HOW TO: Create an Exact Age Field in Microsoft PowerApps and Dynamics, by Chris Hettinger
  8. SharePoint Framework (SPFx) Innovation Project Part I, by Nisha Patel, Elaine Krause, and Selvi Kalaiselvi
  9. Azure Sentinel: A Tip of the Microsoft Security Iceberg, by Benyamin Famili
  10. What Is API Management?, by Udaiveer Virk

Happy New Year to all our readers and bloggers! Be sure to follow AIS on Twitter, Facebook or LinkedIn so you’ll never miss an insight. Perhaps you’ll even consider joining our team in 2020?

*We feature each of our bloggers once on the top 10 list, but we had a few top posts we would be remiss not to mention, including another blog from Yash Agarwal on How To Use Galleries in Power Apps and two more posts from Vishwas Lele, Oracle Database as a Service on Azure (Almost!) and Traffic Routing in Kubernetes via Istio and Envoy Proxy. Enjoy!