I recently had the privilege of attending this year’s DEF CON conference, one of the world’s largest and most notable hacker conventions, held annually in Las Vegas. Deciding which talks and sessions to attend can be a logistical nightmare for a conference with anywhere between 20,000 and 30,000 people in attendance, but I pinpointed the ones I felt would be most beneficial for me and AIS.

During the conference, Tanya Janca, a cloud advocate for Microsoft, and Teri Radichel of 2nd Sight Lab gave a presentation on “DIY Azure Security Assessment” that dove into how to verify the security of your Azure environments. More specifically, they went into detail on using Azure Security Center and on setting scope, policies, and threat protection. In this post, I want to share the takeaways from the talk that I found most helpful.

Security in Azure

Security is a huge part of any Azure deployment, and fail-safes need to be in place to stop attacks before they occur. Below, I break down the topics from the talk that can help you better understand and perform your own security assessment in Azure, including how to look for vulnerabilities and gaps.

The first step in securing your Azure environment is to define the scope of what you are trying to assess and protect. This could also include things external to Azure, such as hybrid solutions with on-premises systems. The scope typically covers the following:

  • Data Protection
  • Application Security
  • Network Security
  • Access Controls
  • Cloud Security Controls
  • Cloud Provider Security
  • Governance
  • Architecture

Second is using the tools and features within Azure to accomplish this objective. Tanya and Teri started by listing a few key features that every Azure implementation should use. These include:

  • Turning on Multi-Factor Authentication (MFA)
  • Identity and Access Management (IAM)
    • Roles in Azure AD
    • Policies for access
    • Service accounts
      • Least privilege
    • Account Structure and Governance
      • Management Groups
      • Subscriptions
      • Resource Groups

A key item I took away from this section was granting access at the “least privilege” level using service accounts: only the required permissions should be granted, only when needed, and using accounts that are not meant for administrative use. Along with tightening access, it’s also important to understand at what level to manage this governance. Granting access at the management group level casts a wider and more manageable net, while a more granular level, such as a subscription, can help with segregation of duties; the right choice depends heavily on the current landscape of your groups and subscription model. The scope levels involved are shown in the sketch below.
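To make the scoping discussion concrete, below is a minimal Python sketch of the scope strings Azure role-based access control uses at each governance level; all IDs and names are placeholders. A role assigned at a wider scope is inherited by everything beneath it, so least-privilege service accounts usually get the narrowest scope that still works.

```python
# Minimal sketch: Azure RBAC scope strings at the three governance levels
# discussed above. IDs and names are placeholders; pass the chosen scope to
# whatever role-assignment tooling you use (CLI, ARM templates, or the SDK).

subscription_id = "00000000-0000-0000-0000-000000000000"   # placeholder
management_group = "corp-root"                              # placeholder
resource_group = "rg-app-prod"                              # placeholder

# Widest net: every subscription under the management group inherits the role.
mg_scope = f"/providers/Microsoft.Management/managementGroups/{management_group}"

# Narrower: a single subscription, which helps with segregation of duties.
sub_scope = f"/subscriptions/{subscription_id}"

# Narrowest: a single resource group, a good fit for least-privilege service accounts.
rg_scope = f"/subscriptions/{subscription_id}/resourceGroups/{resource_group}"

for label, scope in [("management group", mg_scope),
                     ("subscription", sub_scope),
                     ("resource group", rg_scope)]:
    print(f"{label:>16}: {scope}")
```

In practice, least privilege usually means assigning narrowly scoped built-in roles (for example, Reader or Contributor on a single resource group) to service accounts rather than Owner at the subscription or management group level.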

The Center for Internet Security (CIS)

So maybe you now understand the scope at which you want to assess the security of your Azure environment, but you don’t know where to start. This is where the Center for Internet Security (CIS) can come into play. CIS is a community-driven organization that publishes security best practices and threat-prevention guidance, with members including corporations, governments, and academic institutions. Its guidance was initially intended for on-premises use; however, as the cloud has grown, so has the need for increased security. CIS can help you decide which best practices to follow based on known threat vectors. These are captured in 20 critical controls, broken into three groups (Basic, Foundational, and Organizational):

Basic Center for Internet Security Controls

Examples of these CIS control practices could be:

  • Inventory and Control of Hardware Assets by utilizing a software inventory tool
  • Controlled Use of Administrative Privileges by setting up alerts and logs

An additional resource is the CIS Benchmarks, which provide best-practice recommendations for various platforms and services, such as Microsoft SQL Server or IIS. Plus, they’re free! Another useful CIS offering lives in the Azure Marketplace: pre-defined system images that are already hardened to these best practices.

CIS offerings in the Azure Marketplace

The figure below shows an example benchmark control practice, which gives the recommendation to “Restrict access to Azure AD administration portal.” It then outputs audit steps showing what needs to be done to comply with that best practice.

Control Practice to Restrict access to Azure AD administration portal

Azure Security Center (ASC)

In this next section, I detail the features of Azure Security Center (ASC) that I took away from the presentation and how to get started using ASC. The figure below shows the dashboard. As you can see, there are a lot of options inside the ASC dashboard, including sections such as Policy & Compliance and Resource Security Hygiene. The settings within them let you drill down into resources, all the way to the VM or application level.

Azure Security Center Dashboard

Turning ASC on should be your first step toward using the features within it. The visuals you get in ASC are very helpful, including subscription coverage and your secure score. ASC also provides policy management, letting you apply pre-defined and custom rules to keep your environment within the desired compliance levels.

Cloud Networking

Your network design in Azure plays a crucial role in defending against incoming attacks, and it involves more than just closing ports. When you build a network with security in mind, you not only limit your attack surface but also make vulnerabilities easier to spot, all while making it harder for attackers to infiltrate your systems. Network Security Groups (NSGs) and routes help by allowing only the required ports (see the sketch after this list), and Network Watcher lets you test the resulting effective security rules. Other best practices include not exposing RDP, SSH, or SQL to the internet. At a higher level, here are some more networking features and options for securing Azure:

  • Azure Firewall
    • Protecting storage accounts
    • Using logging
    • Monitoring
  • VPN/Express Route
    • Encryption between on-premises and Azure
  • Bastion Host
    • Access hosts through a jump-box feature
    • Heavy logging
  • Advanced Threat Protection
    • Alerts on threats at low, medium, and high severity
    • Unusual activities, such as large numbers of storage files being copied
  • Just in Time (JIT)
    • Access hosts only when needed, within a configured time frame
    • Restrict access to selected IP ranges and ports
  • Azure WAF (Web Application Firewall)
    • Layer 7 firewall for applications
    • Utilize logging and monitoring
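
As a concrete illustration of “allow only the required ports” from the list above, here is a hedged Python sketch using the azure-mgmt-network package. It is a sketch under assumptions rather than a tested deployment script: the subscription ID, resource group, and NSG name are placeholders, and the exact parameter shapes can vary between SDK versions.

```python
# Hedged sketch: create an NSG that allows only HTTPS inbound and explicitly
# denies RDP/SSH from the internet. All names/IDs are placeholders, and the
# dict-based parameters may need adjusting for your azure-mgmt-network version.
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient

subscription_id = "00000000-0000-0000-0000-000000000000"   # placeholder
client = NetworkManagementClient(DefaultAzureCredential(), subscription_id)

nsg = client.network_security_groups.begin_create_or_update(
    "rg-app-prod",       # placeholder resource group
    "nsg-web-tier",      # placeholder NSG name
    {
        "location": "eastus",
        "security_rules": [
            {   # allow HTTPS in from anywhere to the web tier
                "name": "allow-https-inbound", "priority": 100,
                "direction": "Inbound", "access": "Allow", "protocol": "Tcp",
                "source_address_prefix": "*", "source_port_range": "*",
                "destination_address_prefix": "*", "destination_port_range": "443",
            },
            {   # best practice from the talk: no RDP/SSH exposed to the internet
                "name": "deny-mgmt-from-internet", "priority": 200,
                "direction": "Inbound", "access": "Deny", "protocol": "Tcp",
                "source_address_prefix": "Internet", "source_port_range": "*",
                "destination_address_prefix": "*",
                "destination_port_ranges": ["22", "3389"],
            },
        ],
    },
).result()
print(f"Created NSG {nsg.name} with {len(nsg.security_rules)} rules")
```

Once the NSG is associated with a subnet or NIC, Network Watcher’s effective security rules view is a quick way to confirm what actually applies.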

An additional design factor to consider is the layout of your network architecture. Dividing your resources into tiers is a great security practice that minimizes risk to each component. An example is a three-tier design, which splits a web application across three tiers (VNets). In the figure below you can see a separate web tier, app tier, and data tier. This is much more secure because the front-end web tier can still reach the app tier but cannot talk directly to the data tier, which minimizes risk to your data.

Three Tier Network Architecture: web tier, app tier, and data tier

Logging and Monitoring

Getting the right data and analytics to properly monitor and log your environment is an important part of assessing it. For those in security roles, liability is an important factor in the “chain of custody”: when handling security incidents, extensive logging is required to ensure you understand the full scope of the incident. This means having logging and monitoring turned on for the following recommended items:

  • IDS/IPS
  • DLP
  • DNS
  • Firewall/WAF
  • Load Balancers
  • CDN

The next way to gather even more analytics is to use a SIEM (Security Information and Event Management) solution like Azure Sentinel. This adds another layer of protection to collect, detect, investigate, and respond to threats from on-premises environments to multiple cloud vendors. An important note: make sure you tune your SIEM so you are detecting threats accurately and not diluting the alerts with false positives.
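
One practical way to check for alert dilution is to summarize recent alerts from the Log Analytics workspace behind Sentinel. Below is a hedged sketch using the azure-monitor-query package; the workspace ID is a placeholder and the query assumes the standard SecurityAlert table schema.

```python
# Hedged sketch: summarize the last week of Sentinel/Log Analytics alerts by
# severity and rule name so noisy detections stand out. Workspace ID is a placeholder.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

workspace_id = "00000000-0000-0000-0000-000000000000"   # placeholder

query = """
SecurityAlert
| where TimeGenerated > ago(7d)
| summarize AlertCount = count() by AlertSeverity, AlertName
| order by AlertCount desc
"""

client = LogsQueryClient(DefaultAzureCredential())
response = client.query_workspace(workspace_id, query, timespan=timedelta(days=7))

for table in response.tables:
    for row in table.rows:
        # e.g. ['Medium', 'Unusual amount of data extracted', 42]
        print(list(row))
```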

Advanced Data Security

The final point I want to dive into is Advanced Data Security. Protecting data should be at the top of any organization’s list of priorities, and classifying that data is an important first step in understanding its sensitivity. This is where Data Discovery & Classification can help by labeling the sensitivity of your data. Next is vulnerability assessment scanning, which helps assess the risk level of your databases and minimize leaks. Overall, these cloud-native tools are another great way to help secure your Azure environment.

Conclusion

In closing, Azure has a plethora of tools at your disposal within Azure Security Center to do your own security assessment and protect yourself, your company, and your clients from future attacks. ASC can become your hub for defining and maintaining a compliant security posture for your enterprise. Tanya and Teri go into great detail on the steps to take and even supply a checklist you can follow to assess an Azure environment yourself.

Checklist

  1. Set scope & only test what’s in scope
  2. Verify account structure, identity, and access control
  3. Set Azure policies
  4. Turn on Azure Security Center for all subs
  5. Use cloud-native security features – threat protection and adaptive controls, file integrity monitoring, JIT, etc.
  6. Follow networking best practices, NSGs, routes, access to compute and storage, network watcher, Azure Firewall, Express Route and Bastion host
  7. Always be on top of alerts and logs for Azure WAF and Sentinel
  8. Run vulnerability assessments (VA) on everything, especially SQL databases
  9. Encryption for your disks and data (in transit and at rest)
  10. Monitor all that can be monitored
  11. Follow the Azure Security Center recommendations
  12. Then call a Penetration Tester

I hope you found this post helpful and that it makes you, your company, and your clients more secure on Azure. For the full presentation, including a demo of Azure Security Center, check out this link.

I was fortunate enough to attend the Microsoft BUILD 2019 Conference in Seattle this year – the company’s annual developer conference. There was a lot of excitement and a TON of great information to consume, from both the scheduled sessions and one-on-one conversations with product team representatives. So I’m wrapping up BUILD 2019 with some of my highlights below.

(Admittedly, these highlights skew toward technologies I’m currently using most frequently – I’ve grouped some of them into related categories. I’m also sure I’ve left out some high points, so I’ll plan to update this post as needed.)

AIS at BUILD 2019

However, before describing the announcements and specific technology updates I noted, my number one high point of the week was the session that Vishwas Lele (AIS CTO and Microsoft Azure MVP) gave on Tuesday: “Architecting Cloud-Native Apps with AKS and Cosmos DB.” This was the first year Microsoft allowed a few select partners to lead sessions at BUILD, so I consider his inclusion recognition of the great work he is doing to advance cloud-native technologies on Azure. His session was packed, and attendees got their money’s worth of content on AKS, Cosmos DB, and strategies for using cloud-native conventions to consume PaaS services and build resilient, globally scalable applications.

AIS Team at Microsoft Build 2019

Kubernetes and AKS

Most of the discussion about compute on Azure included at least one point related to AKS (Azure Kubernetes Service). AKS was everywhere, and one consistent theme was AKS becoming a significant portion of the Azure “compute” offering going forward. There were many exciting K8s-related announcements and demonstrations I had not previously heard; a few that stood out to me:

Azure AI

The company’s vision for its Artificial Intelligence (AI) and Machine Learning offerings is stronger than it’s ever been. This story has been developing for the past few years, and the vision hasn’t always been crystal clear. Over the past two years, I’ve often asked the question, “If I were going to start a new custom machine learning project in Azure, what services would I start with?” Usually, that answer has been “Azure Databricks” by default, but I’m now coming around to the idea that there is a viable alternative, or at least additional tools to consider.

The BUILD 2019 conference included great sessions and content focused on Azure AI, segmented into three high-level areas:

Knowledge Mining: This area is concerned with using Azure services to discover hidden insights in your content, including documents, images, and other media. Sessions and announcements here focused on enhancements to two key services: Azure Search and a new “Form Recognizer” service.

  • Azure (Cognitive) Search is now generally available: This service uses built-in AI capabilities to discover patterns and relationships, understand sentiment, extract key phrases, and more, without requiring specific data science expertise. Additionally, Azure allows consumers to customize results by applying custom-tuned ranking models. (A small query sketch follows this list.)
  • Form Recognizer: A new service announced in public preview. It exposes a REST API that accepts document content (PDFs, images, etc.) and extracts text, key/value pairs, and tables. The idea is that “usable data” can be gleaned from content that has been hard to unlock in the past.
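
For a sense of how little code a query takes now that the service is generally available, here is a hedged sketch using the azure-search-documents Python package; the endpoint, key, index name, and document fields are placeholders for whatever index and skillset you have built.

```python
# Hedged sketch: query an existing Azure Cognitive Search index.
# The endpoint, key, index name, and document fields are all placeholders.
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient

client = SearchClient(
    endpoint="https://<your-search-service>.search.windows.net",  # placeholder
    index_name="docs-index",                                      # placeholder
    credential=AzureKeyCredential("<query-api-key>"),             # placeholder
)

# Full-text search; AI enrichments (key phrases, sentiment, etc.) added at
# indexing time simply show up as fields on the returned documents.
for result in client.search(search_text="cloud security assessment", top=5):
    print(result["id"], result.get("keyPhrases"))                 # placeholder fields
```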

Machine Learning: A set of services that enable building and deploying custom machine learning models. This area represents many capabilities on the Azure platform, and at this year’s conference some great new additions and enhancements were highlighted that help answer that initial “where do I start?” question. Some highlights:

  • AutoML is in public preview: This service picks the “best” machine learning algorithm for a provided data set and desired outcome. It does this by accepting a data set from the user (in preview it accepts files stored in blob storage exclusively), automatically training several different models on that data, comparing their performance, and reporting the results back to the end user. (A hedged sketch follows this list.)
  • Visual Interface for Azure Machine Learning Service is in public preview: This service enables consumers to build ML models using a drag and drop interface, with the ability to drop down into Python code when needed for specific activities. In many ways, this is a reincarnation of the “Azure ML Studio” service of the past, without some of the limitations that held this service back (data size restrictions, etc.).
  • Choose your underlying compute: Choose where your models are trained and run, including the Machine Learning Services managed compute environment, AKS, Virtual Machines, Azure Databricks, HDInsight clusters, or in Azure Data Lake Analytics.
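
For a feel of what driving AutoML from the (v1) azureml SDK looks like, here is a hedged sketch. The workspace config file, the blob-hosted CSV, and the label column are assumptions, and preview-era arguments have shifted between SDK releases, so treat this as an outline rather than a verified script.

```python
# Hedged sketch: submit an AutoML classification experiment with the azureml
# SDK (v1). Assumes a config.json for an existing workspace and a CSV that is
# reachable in blob storage; the label column name is a placeholder.
from azureml.core import Dataset, Experiment, Workspace
from azureml.train.automl import AutoMLConfig

ws = Workspace.from_config()                       # assumes ./config.json exists

training_data = Dataset.Tabular.from_delimited_files(
    path="https://<account>.blob.core.windows.net/data/train.csv"  # placeholder
)

automl_config = AutoMLConfig(
    task="classification",
    training_data=training_data,
    label_column_name="Churn",                     # placeholder label column
    primary_metric="AUC_weighted",
    iterations=10,                                 # keep the sketch small
)

run = Experiment(ws, "automl-sketch").submit(automl_config)
run.wait_for_completion(show_output=True)          # prints per-model performance
```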

AI apps and agents: This area includes Azure Cognitive Services and Azure Bot Service. Azure Cognitive Services is a set of APIs that let developers call pre-built AI models to enhance their applications in the areas of computer vision, speech-to-text, and language. A few data points that stuck out to me (a small example follows the list):

  • A new Cognitive Services category – “Decision”: This category will initially include three services: 1) Content Moderator, 2) Anomaly Detector (currently in preview), and 3) Personalizer (also currently in Preview). Personalizer is a service to help promote relevant content and experiences for users.
  • “Conversation Transcription”: An advanced speech to text capability.
  • Container Support Expansion: The portfolio of Cognitive Services that can be run locally in a Docker container now includes Anomaly Detector, Speech-to-Text, and Text-to-Speech, in addition to the existing text analytics and vision containers.
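
To show how little code it takes to call one of these pre-built models, here is a hedged sketch of a sentiment call using the azure-ai-textanalytics package (a newer SDK than what existed at BUILD 2019); the endpoint and key are placeholders.

```python
# Hedged sketch: call the pre-built sentiment model in Cognitive Services.
# The endpoint and key are placeholders for your own Language/Text Analytics resource.
from azure.ai.textanalytics import TextAnalyticsClient
from azure.core.credentials import AzureKeyCredential

client = TextAnalyticsClient(
    endpoint="https://<your-resource>.cognitiveservices.azure.com/",  # placeholder
    credential=AzureKeyCredential("<api-key>"),                       # placeholder
)

documents = [
    "The keynote demos were fantastic.",
    "The conference Wi-Fi was unusable.",
]

for doc, result in zip(documents, client.analyze_sentiment(documents)):
    print(f"{result.sentiment:>8}  {doc}")   # e.g. "positive" / "negative"
```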

.NET Platform

It’s amazing for me to consider that .NET is now 17 years old – the official release of .NET 1.0 was in February 2002! And although .NET is now on the “mature” end of the spectrum compared to many other active programming frameworks, it’s also true that many new .NET developers are still adding C#, VB.NET, F#, or other CLR-based languages to their repertoire. In fact, at BUILD 2019 the company noted that “a million new active .NET developers” were added last year alone.

One of the reasons for this is that the .NET team continues to innovate with offerings like .NET Core, first announced in 2014. .NET Core is the cross-platform development stack that runs across operating systems and has been the “future” of .NET for some time.

One of the major announcements affecting .NET developers is that the next release of .NET Core will be “.NET 5”. Yes, this means there will be one unified platform that brings together legacy .NET Framework components, .NET Core, and Mono. After the .NET 5 release in 2020, there will be one annual release of .NET.

.NET Schedule

A few other .NET related data points that stuck out to me as items to investigate in more detail:

  • “Blazor” got a lot of session time and seems to be a real project now. For some people, the idea of running C# in the browser can devolve into a philosophical debate. However, it’s clear that Microsoft sees enough upside that it has moved the technology beyond an “experimental” phase into a fully-supported preview.
  • .NET for Apache Spark was released (open source), aimed at giving .NET developers access to Apache Spark.
  • Frequent mentions of gRPC support in .NET Core. gRPC is the language-agnostic remote procedure call framework published by Google.
  • ML.NET 1.0: A cross-platform (.NET Core) framework for creating custom ML models using C# or F# – without having to leave the .NET ecosystem.

Cosmos DB

BUILD 2019 also had a few great sessions and announcements related to Cosmos DB, Microsoft’s fully managed, globally distributed, multi-model database service. My highlights:

  • Best practices for Azure Cosmos DB: Data modeling, Partitioning, and RUs: A great session given by Deborah Chen and Thomas Weiss (program managers on the Cosmos DB team), with practical, actionable examples of how to partition, how to minimize request units (RUs) for common database calls, and more. (See the RU sketch after this list.)
  • Etcd API: In Kubernetes, etcd is used to store the state and configuration of clusters. Ensuring the availability, reliability, and performance of etcd is crucial to the overall health, scalability, elasticity, availability, and performance of a Kubernetes cluster. The etcd API in Azure Cosmos DB allows you to use Azure Cosmos DB as the backend store for Azure Kubernetes Service (AKS).
  • Spark API: New (preview) native support for Spark through the Cosmos DB Spark API. This one is interesting to me because it has the potential to enable a “serverless experience for Apache Spark” – where the “cluster” is Cosmos DB.  I would pay close attention to the consumed RUs though!
  • Cosmos DB will support multi-model access in the future: Cosmos DB is a multi-model database, meaning you can access the data using many different APIs. Until now, however, this has been a choice made up front when the database is created. In his “Inside Datacenter Architecture” session, Mark Russinovich announced that, in the future, Cosmos DB would support multi-model access to the same data.
  • Jupyter notebooks running inside Azure Cosmos DB: Announced in preview. A native notebook experience that supports all the Cosmos DB APIs and is accessed directly in the Azure Portal.
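
In the spirit of that best-practices session, here is a hedged sketch with the azure-cosmos Python SDK that contrasts a point read against a cross-partition query and inspects the RU charge of each; the account URL, key, database and container names, and the partition key are all placeholders.

```python
# Hedged sketch: compare RU charges for a point read vs. a cross-partition
# query using the azure-cosmos SDK. All names, keys, and IDs are placeholders.
from azure.cosmos import CosmosClient

client = CosmosClient("https://<account>.documents.azure.com:443/", "<key>")
container = client.get_database_client("shop").get_container_client("orders")

def last_charge() -> float:
    # RU cost of the most recent operation, as reported by the service.
    return float(container.client_connection.last_response_headers["x-ms-request-charge"])

# Point read (id + partition key): the cheapest way to fetch a single document.
container.read_item(item="order-123", partition_key="customer-42")
print("point read RUs:", last_charge())

# Cross-partition query: convenient, but the RU charge grows with the fan-out.
list(container.query_items(
    query="SELECT * FROM c WHERE c.status = 'shipped'",
    enable_cross_partition_query=True,
))
print("cross-partition query RUs:", last_charge())
```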

Other Announcements

Below are some other BUILD 2019 announcements, highlights, and data points I’m investigating in the coming weeks:

If you have any questions, feel free to reach out to me on Twitter at @Bwodicka or contact the AIS team online.

I just returned from Microsoft BUILD 2019, where I presented a session on Azure Kubernetes Service (AKS) and Cosmos DB. Thanks to everyone who attended. We had excellent attendance – the room was full! I like to think that the audience was there for the speaker 😊, but I’m sure the interest is really a reflection of how popular AKS and Cosmos DB are becoming.

For those looking for a 2-minute overview, here it is:

In a nutshell, the focus was to discuss combining a cloud-native service (like AKS) with a managed database (like Cosmos DB).

Slide from the session deck: Architecting Cloud-Native Apps with AKS and Cosmos DB

We started with a discussion of cloud-native apps, along with a quick introduction to AKS and Cosmos DB. We quickly transitioned into stateful app considerations and talked about new stateful capabilities in Kubernetes, including PVs, PVCs, StatefulSets, CSI, and Operators. While these capabilities represent significant progress, they don’t yet match what external services like Cosmos DB provide.


Slide from the session deck: Cloud-Native Tooling

One option is to use the Open Service Broker: it allows Kubernetes-hosted services to talk to external services using cloud-native tooling like svcat (Service Catalog).

Slide from the session deck: svcat (Service Catalog)

Slide from the session deck: SRE

External services like Cosmos DB can go beyond cluster SRE and offer what is essentially “turn-key” SRE – specifically, geo-replication, API-based scaling, and even multi-master writes (eliminating the need to fail over).

Slide from the session deck: Multi-Master Support

Slide from the session deck: Configure Regions

Slide from the session deck: Portability

Since the Open Service Broker is an open specification, your app remains mostly portable even when you move from one cloud provider to another. The Open Service Broker does not deal with syntactic differences, such as connection-string prefix differences between cloud providers. One way to handle these differences is to use Helm.
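
The pattern is to keep the provider-specific pieces (such as a connection-string prefix) in per-environment Helm values that flow into the pod as environment variables, while the application code stays provider-neutral. A minimal sketch, assuming a hypothetical DB_CONNECTION_STRING variable set by the chart:

```python
# Minimal sketch: the app reads a provider-neutral setting that Helm values
# populate per cloud/environment. The variable name is hypothetical.
import os

# values-azure.yaml vs. values-othercloud.yaml set this env var differently in
# the deployment template; the application code itself does not change.
connection_string = os.environ["DB_CONNECTION_STRING"]

# Only the prefix tends to differ between providers (e.g. "AccountEndpoint=..."
# for Cosmos DB); keep that knowledge in the chart, not in the code.
print("connection string length:", len(connection_string))
```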

Learn more about my BUILD session:

Here you can find the complete recording of the session and slide deck: https://mybuild.techcommunity.microsoft.com/sessions/77138?source=sessions#top-anchor

Additionally, you can find the code for the sample I used here: https://github.com/vlele/build2019 


In support of National Cybersecurity Awareness Month, you’re invited to join us for a very special edition of the Microsoft Azure Government Meetup!

On October 24th, the Women Leading Government Cybersecurity Meetup will feature an exciting panel of experienced government cybersecurity professionals. ALL are welcome to attend this FREE event for networking, refreshments and fascinating insights and discussions on:

  • Today’s cybersecurity landscape in government and top priorities
  • Best practices in cybersecurity along with challenges and lessons learned
  • Future of cybersecurity in government and the cyber workforce

The evening will wrap up with a security demo along with Q&A.

This Meetup is presented in partnership with the Women in Technology D.C. Chapter of the International Association of Microsoft Channel Partners, a community of local professionals that believe in making it easier for women to imagine, begin and develop a career in IT. We’re very excited about this one, so check out the full agenda, speaker bios and RSVP here!

If you’re not in the D.C. metro area, you can join us via livestream on Oct. 24 starting at 6:35 p.m. at aka.ms/azuregovmeetuplive. And be sure to join our conversation on Twitter using #AzureGovMeetup.


Calling all SharePoint users and Office 365 developers! AIS is hosting this month’s Meetup for the Triangle SharePoint User Group in Morrisville, North Carolina. The Meetup is this Thursday at AIS’ North Carolina office. There are still a few spots left, so be sure to RSVP today.

About the Session:

In this session, we’ll walk through building a client-side web part with the SharePoint Framework. By using generic components, we can build web parts that can be reused across an entire organization or multiple clients. Time permitting, we will walk through several examples and possibly some framework extensions.

Event Agenda:

5:45 p.m.  Doors Open
5:45 to 6:15 p.m.  Networking & Dinner
6:15 p.m.  Announcements & Introductions
6:20 to 7:40 p.m. Presentation

The TriSPUG Meetups are a fantastic way for developers, IT, and business users to learn, share, and grow their knowledge in Microsoft SharePoint and Office 365. Attendance is always free and informal. All interest levels and experience levels are welcome!

RSVP Here!

The President’s Management Agenda (PMA) called on all government agencies to accelerate their IT modernization efforts with a continued focus on security. So…now what?

At this month’s #AzureGov meetup, our panel of speakers discussed exactly how agencies can navigate the world of automated ATOs, revamped TIC compliance, and beyond, while at the same time fully realizing the benefits of the cloud and achieving greater agility and a stronger security posture.

Last night’s speakers included:

• Mark Cohn, CTO, Unisys Federal
• Greg Elin, CEO, GovReady & Former Chief Data Officer, Federal Communications Commission
• Nate Johnson, Cloud Security & Compliance Director, Microsoft
• Scott Thompson, Cloud Solution Strategist, Microsoft

For a replay of the full Meetup, click here. For past Meetups, visit the Azure Government Meetup YouTube channel here.

The next Meetup is set for Wednesday, September 26. RSVP today to claim your spot and join us for great networking and presentations. We hope to see you there!

One of the biggest roadblocks to government digital transformation is the lack of effective IT governance. Unresolved concerns, including privacy, security, and organizational silos that limit data sharing and analysis, continue to pose hurdles for agencies.

Last night’s Azure Government Meetup in Washington, D.C. featured a stellar lineup of industry-leading experts who shared insights and strategies on achieving effective IT governance in areas including identity, portfolio and records management.

If you missed it, you can catch the replay here.

Last night’s #AzureGovMeetup challenged government agencies to view the cloud as more than just a technology and software choice, but also as a business strategy to create greater impact for employees and citizens. With its agility, speed and low cost, the cloud is the key to helping agencies test and innovate new solutions faster than ever before.

First up, we heard from Kevin Jackson, a globally-recognized cloud computing expert and founder/author of the award-winning blog, Cloud Musings. Kevin shared his experience and expertise in cloud innovation and the keys to developing a successful cloud business strategy.

Brett & Jim presenting: Moving a high-profile application to production

Next up was a very special cloud innovation showcase featuring the latest cloud solutions currently advancing agency missions, including:

• AIS’ own Brett Goldsmith and Jim Mullennix presented on a high-profile application they recently helped move to production in Azure Gov.
• Carlton Reeves, Forward Deployed Solutions Leader, C3 IoT, shared insights on how to innovate in the cloud using Azure.

Keep an eye on the official Azure Government Meetup YouTube channel, where videos of the full presentations will be posted shortly. In the meantime, why not RSVP for the July Meetup and follow the AIS Team on Twitter?

For those who are familiar with Microsoft’s popular Dashboard in a Day workshops for Power BI, App in a Day (AIAD) is very similar. The program is an all-day training designed to accelerate attendees’ PowerApps, Microsoft Flow, and CDS for Apps experience. The comprehensive hands-on workshop is led by a certified Microsoft Partner, in this case, your friends at Applied Information Sciences.

The training provides practical experience through a full day of instructor-led app creation workshops. You learn how to build custom apps that run on mobile devices and can be shared securely inside your organization.

Inside the AIAD Workshop

Full house at the AIAD at Microsoft’s Reston MTC

Trainees at App in a Day


AIS was pleased to present our work with the United States Air Force at the annual Armed Forces Communications and Electronics Association (AFCEA) Montgomery IT Summit on May 23.

The Air Force awarded AIS a contract to build out an Azure core infrastructure in the multi-cloud Common Computing Environment (CCE). The infrastructure is capable of supporting the migration of thousands of on-premises and legacy applications to the cloud.  At the same time, the AIS team engaged in a parallel effort to migrate the “lead horse” application into the environment and celebrated the distinction of being the first company to successfully field a production application.