Late last Friday, the news of the Joint Enterprise Defense Infrastructure (JEDI) contract award to Microsoft Azure sent seismic waves through the software industry, government, and commercial IT circles alike.

Even as the dust settles on this contract award, including the inevitable requests for reconsideration and protest, DoD’s objectives from the solicitation are apparent.

DOD’s JEDI Objectives

Public Cloud is the Future DoD IT Backbone

A quick look at the JEDI statement of objectives illustrates the government’s comprehensive enterprise expectations with this procurement:

  • Fix fragmented, largely on-premises computing and storage solutions – This fragmentation makes it impossible to make data-driven decisions at “mission-speed,” negatively impacting outcomes. Moreover, the rising level of cyber-attacks demands a comprehensive, repeatable, verifiable, and measurable security posture.
  • Commercial parity with cloud offerings for all classification levels – A cordoned-off, dedicated government cloud that lags in features is no longer acceptable. Conversely, it is now acceptable for unclassified data to be hosted in data center locations that are not dedicated exclusively to a government cloud.
  • Globally accessible and highly available, resilient infrastructure – The need for infrastructure that is reliable, durable, and can continue to operate despite catastrophic failure of pieces of infrastructure is crucial. The infrastructure must be capable of supporting geographically dispersed users at all classification levels, including in closed-loop networks.
  • Centralized management and distributed control – Apply security policies; monitor security compliance and service usage across the network; and accredit standardized service configurations.
  • Fortified Security that enables enhanced cyber defenses from the root level – These cyber defenses are enabled through the application layer and down to the data layer with improved capabilities including continuous monitoring, auditing, and automated threat identification.
  • Edge computing and storage capabilities – These capabilities must be able to function totally disconnected, including provisioning IaaS and PaaS services and running containerized applications, data analytics, and processing data locally. These capabilities must also provide for automated bidirectional synchronization of data storage with the cloud environment when a connection is re-established.
  • Advanced data analytics – An environment that securely enables timely, data-driven decision making and supports advanced data analytics capabilities such as machine learning and artificial intelligence.

Key Considerations: Agility and Faster Time to Market

From its inception – the September 2017 memo announcing the formation of the Cloud Executive Steering Group – through the release of the RFP in July 2018, DoD was clear: it wanted a single cloud contract, deeming a multi-cloud approach too slow and costly. The Pentagon’s Chief Management Officer defended the single-cloud approach by suggesting that a multi-cloud contract “could prevent DoD from rapidly delivering new capabilities and improved effectiveness to the warfighter that enterprise-level cloud computing can enable,” resulting in “additional costs and technical complexity on the Department in adopting enterprise-scale cloud technologies under a multiple-award contract. Requiring multiple vendors to provide cloud capabilities to the global tactical edge would require investment from each vendor to scale up their capabilities, adding expense without commensurate increase in capabilities.”

A Single, Unified Cloud Platform Was Required

The JEDI solicitation expected a unified cloud platform that supports a broad set of workloads, with detailed requirements for scale and long-term price projections.

  1. Unclassified webserver with a peak load of 400,000 requests per minute
  2. High volume ERP system – ~30,000 active users
  3. IoT + Tactical Edge – A set of sensors that captures 12 GB of High Definition Audio and Video data per hour
  4. Large data set analysis – 200 GB of storage per day, 4.5 TB of online result data, 4.5 TB of nearline result data, and 72 TB of offline result data
  5. Small form-factor data center – 100 PB of storage with 2,000 cores, deliverable within 30 days of request and able to fit inside a U.S. military cargo aircraft

Massive Validation for the Azure Platform

The fact that the Azure platform is the “last cloud standing” at the end of the long and arduous selection process is massive validation from our perspective.

As other bidders have discovered, much to their chagrin, the capabilities described above cannot be developed overnight. Meeting them is a testament to Microsoft’s sustained commitment to the wide-ranging requirements of the JEDI solicitation.

Lately, almost every major cloud provider has invested in bringing the latest innovations in compute (GPUs, FPGAs, ASICs), storage (very high IOPS, HPC), and network (VMs with 25 Gbps bandwidth) to their respective platforms. In the end, what I believe differentiates Azure is a long-standing focus on understanding and investing in enterprise IT needs. Here are a few examples:

  • Investments in Azure Stack started in 2010 with the announcement of the Azure Appliance. It took over seven years of learning to finally run Azure completely in an isolated mode. Since then, investments in Data Box Edge and Azure Sphere, and a commitment to hybrid solutions, have been a key differentiator for Azure.
  • With 54 Azure regions worldwide (available in 140 countries), including dedicated Azure Government regions – US DoD Central, US DoD East, US Gov Arizona, US Gov Iowa, US Gov Texas, US Gov Virginia, US Sec East, US Sec West – the Azure team has accorded the highest priority to establishing a global footprint. Additionally, having a common team that builds, manages, and secures Azure’s cloud infrastructure has meant that even the public Azure services carry DoD CC SRG IL 2, FedRAMP Moderate, and FedRAMP High designations.
  • Whether it is embracing Linux or Docker, providing the highest number of contributions to GitHub projects, or open-sourcing the majority of Azure SDKs and services, Microsoft has demonstrated a leading commitment to open source solutions.
  • Decades of investment in Microsoft Research, including the core Microsoft Research Labs and Microsoft Research AI, have meant that Microsoft has the most well-rounded story for advanced data analytics and AI.
  • Documentation and ease of use have been accorded the highest engineering priority. Case in point: Azure docs were rebuilt entirely on GitHub, enabling an open feedback mechanism powered by GitHub issues.

As developers, we spend a lot of time developing APIs, whether to expose data that we’ve transformed or to ingest data from other sources. Meanwhile, more and more companies are jumping into the realm of API management: Microsoft, Google, MuleSoft, and Kong all now have products that provide this functionality. With this much investment from the big players in the tech industry, API management is clearly a priority. So why would anyone want to use an API management tool?

The answer is simple: It allows you to create an API Gateway that you can load all your APIs into, providing a single source to query and curate. API Management makes life as an admin, a developer, and a consumer easier by providing everything for you in one package.
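To make the single-entry-point idea concrete, here is a minimal sketch of what calling an API through an API gateway looks like from a consumer’s point of view. The gateway URL, subscription key, and API path are placeholders, not real endpoints; the `Ocp-Apim-Subscription-Key` header is how Azure API Management identifies callers.

```python
import urllib.request

# Placeholder gateway URL and subscription key -- substitute your own.
GATEWAY_URL = "https://contoso-apim.azure-api.net"
SUBSCRIPTION_KEY = "<your-subscription-key>"

def gateway_request(path: str) -> urllib.request.Request:
    """Build a request routed through the API gateway.

    Every API loaded into the gateway is reached through one base URL;
    the gateway authenticates the caller via a subscription key header.
    """
    return urllib.request.Request(
        GATEWAY_URL + path,
        headers={"Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY},
    )

# A hypothetical API ingested into the gateway.
req = gateway_request("/billing/invoices")
print(req.full_url)
```

The point is that the consumer never needs to know where the backend actually lives (on-prem, another cloud, or Azure); the gateway is the single source to query.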

Azure API Management

What does Azure API Management provide? Azure API Management (APIM) is a cloud-based PaaS offering available in both commercial Azure and Azure Government. APIM provides a one-stop shop for API governance, with the ability to create products, enforce policies, and utilize a robust developer portal.

Not only can API Management integrate seamlessly with your existing Azure infrastructure, but it can also manage APIs that exist on-prem and in other clouds. APIM is also available in both the IL4 and IL5 environments in Azure Government, which allows for extensibility and management for those working in the public sector.

APIM leverages a few key concepts to provide its functionality to you as a developer, including:

  • Products
  • Policies
  • Developer Portal

From security to rate-limiting and abstraction, Azure API Management does it all for API consolidation and governance in Azure. Any API can be ingested, and it gets even easier when APIs follow the OpenAPI format.

What Are Products?

Products are a layer of abstraction provided inside APIM. Products allow you to create subsets of APIs that are already ingested into the solution—allowing you to overlap the use of APIs while restricting the use of individual collections of APIs. This level of compartmentalization allows you to not only separate your APIs into logical buckets but also enforce rules on these products separately, providing one more layer of control.

Product access is very similar to Azure RBAC, with different groups created inside the APIM instance. These groups are yet another way for APIM admins to encapsulate and protect their APIs, allowing them to add users already associated with the APIM instance into separate subsets. Users can also be members of multiple groups, so admins can make sure the right people have access to the right APIs stored in their APIM instance.

What Are Policies?

Policies are APIM’s way of enforcing restrictions and providing a more granular level of control. An entire breadth of policies is available in APIM, ranging from simply disallowing use of an API after it has been called five times, to authentication, logging, caching, and transformation of requests or responses from JSON to XML and vice versa. Policies are perhaps the most powerful feature of APIM and drive the control that everyone wants and needs. Policies are written in XML and can be easily edited within the APIM XML editor. Policies can also leverage C# 7 syntax, which brings the power of the .NET Framework to your APIM governance.
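As an illustration, here is a minimal sketch of a policy document combining two of the capabilities mentioned above: rejecting a caller after five calls and converting JSON request bodies to XML. The limit values are illustrative, and a real policy document would be scoped to a specific product, API, or operation.

```xml
<policies>
    <inbound>
        <base />
        <!-- Reject callers after five calls per 60-second window -->
        <rate-limit calls="5" renewal-period="60" />
        <!-- Convert inbound JSON payloads to XML before they reach the backend -->
        <json-to-xml apply="always" consider-accept-header="false" />
    </inbound>
    <backend>
        <base />
    </backend>
    <outbound>
        <base />
    </outbound>
    <on-error>
        <base />
    </on-error>
</policies>
```

The `<base />` element pulls in whatever policies are defined at the enclosing scope, which is how global, product, and API-level rules compose.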

What Is the Developer Portal?

The Azure API Management Developer Portal is an improved version of the Swagger documentation that’s generated when you use the OpenAPI spec. The Developer Portal provides an area for developers to readily see APIs, products, and associated applications. The Portal also provides sample request bodies (no more guessing API request structures!) and responses, along with code samples in many different languages.

Finally, the portal also allows you to try API calls with customized request bodies and headers, so you have the ability to see exactly what kind of call you want to make. Along with all that functionality, you can also download your own copy of the OpenAPI Spec for your API after it’s been ingested into your instance.

Why Should I Use APIM?

Every business should be using some form of API management. Deploying an API gateway adds an extra layer of abstraction that gives you a level of control over your APIs previously unavailable. Once an API has been ingested, APIM provides many additional capabilities.

First, you can match APIs to products, providing a greater level of compartmentalization. Second, you can add different groups to each product, with groups being subsets of users (e.g., back-end devs, billing devs). Third, you automatically generate a robust developer portal, which provides all of the functionality of the Swagger portal with added features, such as code snippets. Finally, APIM has complete integration with Application Insights in commercial Azure, providing access to a world-class logging and visualization tool.

Azure API Management brings power to the user, and no API should be left out.

One of the biggest roadblocks to government digital transformation is the lack of effective IT governance. Unresolved concerns including privacy, security and organizational silos that limit data sharing and analysis continue to pose hurdles for agencies.

Last night’s Azure Government Meetup in Washington, D.C. featured a stellar lineup of industry-leading experts who shared insights and strategies on achieving effective IT governance in areas including identity, portfolio and records management.

If you missed it, you can catch the replay here.

The Microsoft Government Tech Summit – a free, technical learning event for IT professionals and developers – is coming to Washington D.C., March 5-6, 2018! This two-day event will be packed with technical content, and this year Microsoft is showcasing Azure Government and Microsoft 365 for US Government.

Our Cloud Application Development Director, Brent Wodicka, is presenting this year on “A PaaS-First Approach to DoD Mission Apps” on March 5th at 1 p.m. He will be co-presenting with Microsoft’s Derek Strausbaugh, showcasing how Azure simplifies and re-imagines legacy mission applications. Registration is now open, and we hope you can join us!

As the expectations of citizens increase, the need for technology innovation in government intensifies. Learn how cloud innovation can help meet the needs of the nation. Whether you’re interested in learning about security approaches or attracting and retaining talent with a more flexible and modern workstyle, Microsoft Government Tech Summit can help you evolve your skills and deepen your expertise to lead your agency through digital transformation.

What to expect:

  • Connect with experts from Microsoft and the community, and learn how to get the most from the cloud. Ask your toughest questions, learn best practices, and share strategies.
  • Choose from a variety of learning opportunities to deepen your cloud expertise, from keynotes and breakout sessions, to hands-on labs and a hackathon.
  • Customize your learning – whether you’re already cloud-savvy or just getting started – Microsoft Government Tech Summit has something for everyone.
  • Discover the latest trends, tools, and product roadmaps at more than 60 sessions covering a range of topics, including over 40 sessions focused on the needs of government agencies.

The cloud is changing expectations – and transforming the way we live and work. Join us at the Microsoft Government Tech Summit and learn how Microsoft’s cloud platform can help you lead your agency through digital transformation – and make the cloud part of your mission success.

REGISTER NOW

Last night’s AzureGov Meetup was an exceptional one! The team challenged the Azure Government DC user community to create and share 12-minute demos showcasing cool tech that can help accelerate your cloud implementation.

We received a terrific response to the challenge and it resulted in a rock-star lineup of speakers, demos and an inside scoop on what’s moving the needle in government technology today. The presenters included:

  • Patrick Curran, Director, Federal Group, Planet Technologies
  • Mark Joscelyne, Head of Technical Operations, Public Sector, Frame
  • Deepak Mallya, Chief Cloud Architect, Cloudwave, Inc.
  • John Osborne, Principal OpenShift Solutions Architect, Red Hat
  • Steve Michelotti, Lead Dev Evangelist, Microsoft Azure Government

You can watch ALL the demos here at the archived livestream. (It’s available for a limited time, so check it out today!)

About this time last month, Microsoft announced that its Azure Government cloud platform received Authority to Operate (ATO) designations from both the U.S. Air Force and the U.S. Immigration and Customs Enforcement (ICE). The Air Force gave Azure Government the Defense Department’s Impact Level 4 ATO, while ICE issued a FedRAMP High ATO.

In other words, the DoD Impact Level 4 ATO confirms that Azure Government complies with the security standards required to host “controlled unclassified data for development, test and production environments within CCE.” Under the ATO, the Air Force has already started building a cloud infrastructure, along with a shared application platform and hosting environment.

The FedRAMP High ATO authorizes Azure Government to handle ICE’s most sensitive unclassified data, including data that supports the agency’s core functions and protects against loss of life. The agency is currently implementing transformative technologies for homeland security and public safety, and the High ATO designation for Azure will allow them to innovate even faster.

This is great news for both the agencies and for Azure Government. We’ve helped large federal agencies make the move to the cloud using Azure tools, and while these migrations are always quite complex, we’ve streamlined the process down to five crucial steps: compliance, envisioning, onboarding, deployment, and sustainment.

AIS’ five-step DoD Cloud Adoption Framework is built on lessons we’ve learned from countless successful commercial and DoD secure cloud migrations and results in an expedited yet fully compliant process. We’re looking forward to helping many more agencies head to the cloud as a trusted Microsoft (and government) partner.

FREE HALF DAY SESSION: APP MODERNIZATION APPROACHES & BEST PRACTICES
Transform your business into a modern enterprise that engages customers, supports innovation, and has a competitive advantage, all while cutting costs with cloud-based app modernization.

While cloud is fast becoming the “new normal” for government, agencies are still challenged with the daunting task of IT modernization and developing a cohesive cloud migration strategy. Oftentimes, what’s holding back progress is that there simply isn’t a one-size-fits-all cloud playbook. That, combined with agency culture, hinders many agencies from making the move to cloud.

The November #AzureGov Meetup this week brought in both a packed house and a great lineup of government and industry experts, who shared their best practices on critical components for cloud success, including stakeholder engagement, evaluation, planning, implementation, and outcomes, as well as the cultural changes you need to ensure a smooth transition.

We also celebrated the two year anniversary of the #AzureGov Meetup!



We took Rimma Nehme’s excellent demo from BUILD 2017 and recreated it for AzureGov.

In a nutshell, we took the Marvel Universe Social Database and loaded it in Azure Cosmos DB as a graph database. Then we built a simple web page that invoked Gremlin queries against Cosmos DB.

The key theme of this demo is the ease with which you can create a globally distributed database that supports low-latency queries against the Marvel Universe graph. In the context of AzureGov, we can seamlessly replicate the data across the three AzureGov regions simply by selecting those regions within the Azure portal.
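To give a flavor of the queries the web page issues, here is a sketch of a Gremlin traversal over such a graph. The vertex label, edge label, and property names are hypothetical, not the actual schema of the Marvel Universe Social Database.

```
// Hypothetical traversal: characters connected to SPIDER-MAN
// through shared comic-book appearances.
g.V().has('character', 'name', 'SPIDER-MAN')  // start at one character vertex
     .out('appeared-in')                      // comics the character appears in
     .in('appeared-in')                       // every character in those comics
     .dedup()
     .values('name')
     .limit(10)
```

Cosmos DB executes traversals like this against whichever replica is closest to the caller, which is what keeps the queries low-latency once the data is replicated across regions.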

Here’s a quick look at the demo:

Earlier this week, my colleagues and I attended the 2017 Microsoft Government Cloud Forum at the Ronald Reagan building in D.C. This invitation-only event discussed topics such as IT modernization, cybersecurity, mobility, shared services, citizen engagement, and workforce management, all of which are top-of-mind these days for government employees.

AIS spent the day with Microsoft, government leaders and other partners, as we collaborated on how to both innovate and deliver more efficiently and effectively.

Lots of exciting news came out of the event, and we wanted to take a quick second to go over some of the bigger announcements.

For government, cybersecurity isn’t just a challenge—it can also be a roadblock that stands in the path of a long-awaited digital transformation.

Agencies are tasked with balancing the highest security measures with innovation, as today’s employees and citizens alike insist that their data be conveniently available anytime, anywhere. Where do you even start?

Last night’s AzureGov Meetup had a BIG audience hoping to get an answer to that question!
