Blazor is coming! In fact, it’s coming sooner rather than later. Check out ASP.NET’s blog post from April 18, 2019 announcing the official preview.

What is Blazor?

by Vivek Gunnala 

Blazor is a new interactive .NET web framework that is part of the open-source .NET platform. Blazor uses C#, HTML, CSS, and Razor components instead of JavaScript. It’s built on open web standards without the need for any plugins or code transpilation, and it works on all modern web browsers; for this reason it is often called “.NET in the browser,” because the C# code runs directly in the browser using WebAssembly. Both client-side and server-side code are written in C#, which allows you to reuse code and libraries between both sides, such as validations, models, etc.

Apps built in Blazor can use existing .NET libraries by leveraging .NET Standard, allowing the same code to be used across platforms. Blazor began as an experimental project and is evolving rapidly, with over 60,000 contributors.

About WebAssembly

At a high level, WebAssembly is explained on the official site as “a binary instruction format for a stack-based virtual machine. It is designed as a portable target for compilation of high-level languages, enabling deployment on the web for client and server applications.”

Should I Use Blazor For My Next Project?

by Ash Tewari 

Blazor’s development status has been promoted from an “Experimental” project to a committed product. This is great news. Blazor is available now as an official preview. Let’s review the factors you should consider when making decisions about adopting Client-Side Blazor for your next production project.

Mono.wasm (the .NET runtime compiled to WebAssembly, which executes your .NET assemblies in the browser) does not interact with the DOM directly. It goes through JS interop, which is expensive. The areas where .NET code gets the most net benefit are the Model and Business Logic, not DOM manipulation. If your application is very chatty with the DOM, you should carefully assess whether you are getting the expected performance boost from WebAssembly execution of your .NET assemblies. [https://webassemblycode.com/webassembly-cant-access-dom/]
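To make that interop cost concrete, even a trivial DOM update from .NET has to be marshalled through a JavaScript call. Here is a minimal sketch; IJSRuntime is Blazor’s real interop service, but the appFunctions.setTitle JavaScript function is hypothetical and would have to be registered in the host page yourself.

@using Microsoft.JSInterop
@inject IJSRuntime JSRuntime

@functions {
    // Mono.wasm cannot touch the DOM directly, so even setting document.title
    // goes out through JS interop and back.
    private async Task SetPageTitleAsync(string title)
    {
        // "appFunctions.setTitle" is a hypothetical JS function defined in index.html.
        await JSRuntime.InvokeAsync<object>("appFunctions.setTitle", title);
    }
}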

Currently, only the mono runtime is compiled to WebAssembly; your .NET assemblies are executed as-is on top of it. This means your .NET code is essentially going through two interpreters, and that has a noticeable performance impact. There is work being done to compile .NET assemblies to WASM ahead of time. That, along with other related improvements in linking and compiling, is expected to improve performance. The fact that Microsoft has decided to commit to Blazor as a product indicates confidence that these performance improvements are likely to become a reality.
[https://www.mono-project.com/news/2018/01/16/mono-static-webassembly-compilation/, https://github.com/WebAssembly/reference-types/blob/master/proposals/reference-types/Overview.md]

In the client-side hosting model, your code is still running in the browser sandbox, so you don’t have any access to the file system or other OS libraries. This limitation applies to JavaScript as well. In fact, WebAssembly is executed by the JavaScript runtime: yes, the same runtime that executes the JavaScript in your web application.
[https://medium.com/coinmonks/webassembly-whats-the-big-deal-662396ff1cd6]

Well, if WebAssembly is executed by the same JavaScript runtime, then where do the performance gains everyone is touting come from? The answer is that the gains come from skipping the parsing and optimizing-compilation steps: WebAssembly is decoded and JIT-compiled, rather than parsed and compiled before the JIT step. However, work is still ongoing to make .NET IL interpretation reach the performance levels required to fulfill that promise.
[https://hacks.mozilla.org/2017/02/what-makes-webassembly-fast/]

Remember that your Blazor code executes in the UI thread of the browser, which can create a bottleneck if your application is CPU-bound. Ironically, CPU- and computationally-intensive applications are also among the most compelling use cases for Blazor. You may need to look into running Blazor components in a Web Worker. We will cover this in a separate blog post dedicated to that technique.

Server-Side Blazor

by Sean McGettrick 

Server-side Blazor, previously referred to as Razor Components, gives developers the same freedom to create UI components using C# instead of JavaScript that client-side Blazor does. The primary difference is that the code is hosted on the server instead of in the browser. Blazor components and application logic written to run client-side can also be used server-side.

Razor Components support all the functionality a front-end developer would expect in a modern library including:

  • Parameterization
  • Event handling
  • 2-way data binding
  • Routing
  • Dependency injection
  • Layouts
  • Templating
  • CSS cascading

Razor Components can be nested and reused, similar to React.
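To make those features concrete, here is a minimal sketch of a small component that uses parameterization and event handling, written in the same preview-era syntax used later in this post (the component and property names are hypothetical):

<!-- GreetingBadge.razor (hypothetical component) -->
<div>
    <span>Hello, @Name!</span>
    <button class="btn" onclick="@SayHello">Greet</button>
</div>

@functions {
    // Set by the parent component, which is also how nesting works:
    // <GreetingBadge Name="Ada" />
    [Parameter] string Name { get; set; }

    void SayHello()
    {
        Console.WriteLine($"Hello, {Name}");
    }
}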

Differences from Client-Side

With server-side Blazor, all components are hosted on and served from an ASP.NET Core server instead of being run in the browser via WASM. Communication between client and server is handled via SignalR.
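For orientation, here is roughly how the server-side hosting model gets wired up in an ASP.NET Core Startup class. The method names shifted across previews; this sketch uses the names that eventually stabilized (AddServerSideBlazor, MapBlazorHub), so treat it as illustrative rather than exact for the preview bits.

public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        services.AddRazorPages();          // hosts the _Host page that bootstraps the app
        services.AddServerSideBlazor();    // services backing the SignalR circuit
    }

    public void Configure(IApplicationBuilder app)
    {
        app.UseStaticFiles();
        app.UseRouting();
        app.UseEndpoints(endpoints =>
        {
            // The SignalR hub that carries UI events to the server and rendered diffs back.
            endpoints.MapBlazorHub();
            endpoints.MapFallbackToPage("/_Host");
        });
    }
}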

Further differences between client and server-side Blazor will be outlined in the next two sections.

Advantages

Server-side Blazor offers a number of advantages over its client-side counterpart. These include:

  • No WASM dependencies. Older desktop browsers and some current mobile browsers lack support for WASM. Since server-side Blazor only requires the browser to support JavaScript, it can run on more platforms.
  • Building on the last point, since the components and application logic sit server-side, the application is not restricted to the capabilities of the browser.
  • Developing the application on an entirely server-based platform allows you access to more mature .NET runtime and tooling support.
  • Razor components have access to any .NET Core compatible API.
  • Application load times in the browser are faster due to a smaller footprint. Only the SignalR Javascript code required to run the application is downloaded to the client.

Disadvantages

There are, however, some disadvantages to using server-side Blazor:

  • There is higher application latency due to user interactions requiring a network round-trip between the browser and the server.
  • Since the application is entirely hosted on the server, there is no offline support. If the server goes down, the application will not function which breaks one of the core tenets of building a Progressive Web Application (“Connectivity independent: Service workers allow work offline, or on low-quality networks”).
  • Because the server is responsible for maintaining client state and connections, scaling the application can be difficult; the server is doing all the work.
  • The application must be hosted on an ASP.NET Core server.

Server-Side Blazor Code Re-Use, Razor Pages to Blazor using an MVVM approach

by Adam Vincent 

What is MVVM?

In a nutshell, MVVM is a design pattern derived from the Model-View-Presenter (MVP) pattern. The Model-View-Controller (MVC) pattern is also derived from MVP, but where MVC is suited to sit on top of a stateless HTTP protocol, MVVM is suited for user interface (UI) platforms with state and two-way data binding. MVVM is commonly implemented in Desktop (WPF / UWP), Web (Silverlight), and Mobile (Xamarin.Forms) applications. Like the other frameworks, Blazor acts much like a Single Page Application (SPA) that has two-way data binding and can benefit from the MVVM pattern. So whether you have existing MVVM code in the form of a WPF or mobile application, or are starting green with new code, you can leverage MVVM to re-use your existing code in Blazor, or share your code with other platforms.

You can find more information on MVVM on Wikipedia.

Example Presentation Layer

BindableBase 

At the heart of MVVM is the INotifyPropertyChanged interface, which notifies clients that a property has changed. It is this interface that turns a user interaction into a call into your code. Usually, all ViewModels, and some Models, implement INotifyPropertyChanged; therefore, it is common either to use a library (Prism, MVVM Light, Caliburn) or to create your own base class. What follows is a minimal implementation of INotifyPropertyChanged.

using System.Collections.Generic;
using System.ComponentModel;
using System.Runtime.CompilerServices;

public abstract class BindableBase : INotifyPropertyChanged
{
    // Sets the backing field and raises PropertyChanged only when the value actually changes.
    protected bool SetField<T>(ref T field, T value, [CallerMemberName] string propertyName = null)
    {
        if (EqualityComparer<T>.Default.Equals(field, value)) return false;
        field = value;
        OnPropertyChanged(propertyName);
        return true;
    }

    public event PropertyChangedEventHandler PropertyChanged;

    protected void OnPropertyChanged([CallerMemberName] string propertyName = null)
    {
        PropertyChanged?.Invoke(this, new PropertyChangedEventArgs(propertyName));
    }
}

In this simplified model class, which derives from BindableBase, we have a NewCustomerModel with a single property, FirstName. In this context, a customer is filling out a form on a website and must enter their first name. That input is bound to an instance of NewCustomerModel on the ViewModel. Since we are in a two-way data binding scenario, each time the customer enters or removes a character in the form’s input box, SetField() is called and causes the PropertyChanged event to fire.

public class NewCustomerModel : BindableBase
{
    private string firstName;
    
    public string FirstName
    {
        get => firstName;
        set
        {
            SetField(ref firstName, value);
        }
    }
}

Learn More: If you need to know more about INotifyPropertyChanged the Microsoft Docs cover this topic very well.

Model

With INotifyPropertyChanged out of the way, here is the entire presentation model.

public class NewCustomerModel : BindableBase
{
    [Display(Name = "Customer Number")]
    public string CustomerNumber { get; set; }
 
    [Display(Name = "Full Name")]
    public string FullName => $"{FirstName} {LastName}";
 
    private string firstName;
    [Required]
    [Display(Name = "First Name")]
    public string FirstName
    {
        get => firstName;
        set
        {
            SetField(ref firstName, value);
            OnPropertyChanged(nameof(FullName));
        }
    }
 
    private string lastName;
    [Required]
    [Display(Name = "Last Name")]
    public string LastName
    {
        get => lastName;
        set
        {
            SetField(ref lastName, value);
            OnPropertyChanged(nameof(FullName));
        }
    }
 
    [Display(Name = "Address")]
    public string Address => $"{Street}, {City}, {State} {PostalCode}";
 
    private string street;
 
    [Required]
    [Display(Name = "Street Address")]
    public string Street
    {
        get => street;
        set
        {
            SetField(ref street, value);
            OnPropertyChanged(nameof(Address));
        }
    }
    private string city;
 
    [Required]
    [Display(Name = "City")]
    public string City
    {
        get => city;
        set
        {
            SetField(ref city, value);
            OnPropertyChanged(nameof(Address));
        }
    }
    private string state;
 
    [Required]
    [Display(Name = "State")]
    public string State
    {
        get => state;
        set
        {
            SetField(ref state, value);
            OnPropertyChanged(nameof(Address));
        }
    }
    private string postalCode;
 
    [Required]
    [Display(Name = "Zip Code")]
    public string PostalCode
    {
        get => postalCode;
        set
        {
            SetField(ref postalCode, value);
            OnPropertyChanged(nameof(Address));
        }
    }
}

There are a few things to point out in this presentation model. First, note the use of the Data Annotation attributes, such as [Required]. You can decorate your properties to provide rich form-validation feedback to your users. When the customer fills out the form and misses a required field, the model will not pass validation; this prevents the form from being submitted and provides an error message if one is configured. We will cover this more in the View section.

The next thing to point out: I covered SetField() in the INotifyPropertyChanged section, but there is an additional bit of complexity here.

[Display(Name = "Full Name")]
public string FullName => $"{FirstName} {LastName}";

Note that the FullName property is a { get; }-only concatenation of the customer’s first and last names. Since the customer fills out the first and last name in separate form fields, changing either one causes FullName to change, and we want anything bound to FullName to be informed of that change.

private string firstName;
[Required]
[Display(Name = "First Name")]
public string FirstName
{
    get => firstName;
    set
    {
        SetField(ref firstName, value);
        OnPropertyChanged(nameof(FullName));
    }
}

After SetField() is invoked in the base class, there is an additional call to OnPropertyChanged(), which lets any subscriber know that, in addition to FirstName, FullName has also changed.

Example ViewModel Interface

The example ViewModel below will expand on the model above. We’ll be using a simplified user story of “Creating a New Customer.”

Blazor supports .NET Core’s dependency injection out of the box, which makes injecting a ViewModel very simple (the container registration itself is shown after the implementation below). In the following ViewModel interface, we’ll need our concrete class to expose an instance of NewCustomerModel as well as a method that knows how to create a new customer.

public interface ICustomerCreateViewModel
{
    NewCustomerModel NewCustomer { get; set; }
    void Create();
}

And the concrete implementation of ICustomerCreateViewModel:

public class CustomerCreateViewModel : ICustomerCreateViewModel
{
    private readonly ICustomerService _customerService;
 
    public CustomerCreateViewModel(ICustomerService customerService)
    {
        _customerService = customerService;
    }
 
    public NewCustomerModel NewCustomer { get; set; } = new NewCustomerModel();
 
    public void Create()
    {
        //map presentation model to the data layer entity
        var customer = new NewCustomer()
        {
            CustomerNumber = Guid.NewGuid().ToString().Split('-')[0],
            FullName = $"{NewCustomer.FirstName} {NewCustomer.LastName}",
            Address = $"{NewCustomer.Street}, {NewCustomer.City}, {NewCustomer.State} {NewCustomer.PostalCode}"
        };
 
        //create
        _customerService.AddNewCustomer(customer);
    }
}
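Because Blazor resolves the ViewModel through .NET Core dependency injection, both the ViewModel and the ICustomerService it depends on have to be registered with the container, typically in Startup.ConfigureServices. A minimal sketch, assuming a hypothetical CustomerService implementation and scoped lifetimes:

public void ConfigureServices(IServiceCollection services)
{
    // CustomerService is a hypothetical data-layer implementation of ICustomerService.
    services.AddScoped<ICustomerService, CustomerService>();
    services.AddScoped<ICustomerCreateViewModel, CustomerCreateViewModel>();
}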

ViewModel Deep-Dive

In the constructor, we’re getting an instance of our ICustomerService which knows how to create new customers when provided the data layer entity called NewCustomer.
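The service itself isn’t shown in this post; a minimal shape for it, inferred from how the ViewModel uses it, might look like this:

// Hypothetical minimal contract for the data-layer service the ViewModel depends on.
public interface ICustomerService
{
    // Persists the data entity; the real implementation (EF Core, a REST call, etc.) lives elsewhere.
    void AddNewCustomer(NewCustomer customer);
}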

I need to point out that NewCustomer and NewCustomerModel serve two different purposes. NewCustomer, a simple class object, is the data entity used to persist the item. NewCustomerModel is the presentation model. In this example, we save the customer’s full name as a single column in a database (and is a single property in NewCustomer), but on the form backed by the NewCustomerModel presentation model, we want the customer to fill out multiple properties, ‘First Name’ and ‘Last Name’.

In the ViewModel, the Create() method shows how a NewCustomerModel is mapped to a NewCustomer. There are some tools that are very good at doing this type of mapping (like AutoMapper), but for this example the amount of code to map between the types is trivial. For reference, what follows is the data entity.

public class NewCustomer
{
    public string CustomerNumber { get; set; }
    public string FullName { get; set; }
    public string Address { get; set; }
}

Opinionated Note: Presentation models and data entities should be separated into their respective layers. It is possible to create a single CustomerModel and use it for both presentation and data layers to reduce code duplication, but I highly discourage this practice.
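If the mapping ever grows beyond a handful of properties, a library like AutoMapper can take over. A minimal sketch of what that might look like for these two types (the profile name is hypothetical, and the hand-written mapping in Create() above is perfectly adequate for this example):

using AutoMapper;

// FullName and Address exist on both types with the same names, so they map by
// convention; CustomerNumber is generated in the ViewModel, so it is ignored here.
public class CustomerMappingProfile : Profile
{
    public CustomerMappingProfile()
    {
        CreateMap<NewCustomerModel, NewCustomer>()
            .ForMember(dest => dest.CustomerNumber, opt => opt.Ignore());
    }
}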

View

The final piece of the MVVM pattern is the View. In the context of Blazor, the View is either a Page or a Component: a .razor or .cshtml file containing Razor code, which is a mix of C# and HTML markup. In the context of this article, our view is a customer form that can be filled out, along with a button that calls the ViewModel’s Create() method when the form has been filled out properly according to the validation rules.

@page "/customer/create"
@using HappyStorage.Common.Ui.Customers
@using HappyStorage.BlazorWeb.Components
@inject Microsoft.AspNetCore.Components.IUriHelper UriHelper
@inject HappyStorage.Common.Ui.Customers.ICustomerCreateViewModel viewModel
 
<h1>Create Customer</h1>
 
<EditForm Model="@viewModel.NewCustomer" OnValidSubmit="@HandleValidSubmit">
    <DataAnnotationsValidator />
    <ValidationSummary />
    <div class="form-group">
        <h3>Name</h3>
        <LabelComponent labelFor="@(() => viewModel.NewCustomer.FirstName)" />
        <InputText class="form-control" bind-Value="@viewModel.NewCustomer.FirstName" />
 
        <LabelComponent labelFor="(() => viewModel.NewCustomer.LastName)" />
        <InputText class="form-control" bind-Value="@viewModel.NewCustomer.LastName" />
    </div>
    <div class="form-group">
        <h3>Address</h3>
 
        <LabelComponent labelFor="@(() => viewModel.NewCustomer.Street)" />
        <InputText class="form-control" bind-Value="@viewModel.NewCustomer.Street" />
 
        <LabelComponent labelFor="@(() => viewModel.NewCustomer.City)" />
        <InputText class="form-control" bind-Value="@viewModel.NewCustomer.City" />
 
        <LabelComponent labelFor="@(() => viewModel.NewCustomer.State)" />
        <InputText class="form-control" bind-Value="@viewModel.NewCustomer.State" />
 
        <LabelComponent labelFor="@(() => viewModel.NewCustomer.PostalCode)" />
        <InputText class="form-control" bind-Value="@viewModel.NewCustomer.PostalCode" />
    </div>
    <br />
    <button class="btn btn-primary" type="submit">Submit</button>
    <button class="btn" type="button" onclick="@ReturnToList">Cancel</button>
</EditForm>

The first thing to note is at the top of the code. This is how we use dependency injection to get an instance of our ViewModel.

@inject HappyStorage.Common.Ui.Customers.ICustomerCreateViewModel viewModel

Easy! Next, we need to create the form. The EditForm component needs an instance of a model to bind to (our ViewModel’s NewCustomer property) and a method to call when the user submits a valid form.

<EditForm Model="@viewModel.NewCustomer" OnValidSubmit="@HandleValidSubmit">
...
</EditForm>

Next, we bind each property to its respective input field. Blazor has some built-in input components which help you accomplish the binding. They are still under development, and you may find some features lacking at the time of writing. Please refer to the docs in the note below for more up-to-date info.

Note: The LabelComponent is something I’ve created as a replacement for the asp-for tag helper; it retrieves the DisplayAttribute from the presentation model classes. That code is available in the GitHub repository listed at the top.

<LabelComponent labelFor="@(() => viewModel.NewCustomer.FirstName)" />
<InputText class="form-control" bind-Value="@viewModel.NewCustomer.FirstName" />
 
<LabelComponent labelFor="(() => viewModel.NewCustomer.LastName)" />
<InputText class="form-control" bind-Value="@viewModel.NewCustomer.LastName" />

The magic here is bind-Value, which binds our InputText text box to the value of the ViewModel’s instance of the NewCustomerModel presentation model.

Note: You can view full documentation on Blazor Forms and Validation here.

Last but not least, we’ll need some code to call our ViewModel’s Create() method when the form is submitted and valid. We’ll also need the ReturnToList() method I’ve wired to the Cancel button’s onclick.

@functions {
    private void HandleValidSubmit()
    {
        viewModel.Create();
        ReturnToList();
    }
 
    private void ReturnToList()
    {
        UriHelper.NavigateTo("/customers");
    }
}

Conclusion

That’s it! In summary, I’ve covered what MVVM is, how Blazor can benefit from it, and an in-depth look at a simple example of how to create a form with validation and rich feedback to the user. It is also important to reiterate that this example works not only in Blazor but can also be used in Windows Presentation Foundation (WPF) desktop applications as well as on other platforms. Please check out the GitHub repository as I continue to develop and expand on this concept.

Developer Gotchas

by Morgan Baker 

Working with a new framework like Blazor always has its learning experiences. The goal of this section is to help alleviate headaches by providing common problems and solutions we encountered with Blazor.

  • My localization isn’t working!
    For this problem, check your route parameters. Depending on the parameter type, routes are matched using the invariant culture by default, so URLs are not localized. This can be solved by accepting the parameter as a plain string and then parsing and validating it in C# code before using it (see the sketch after this list).
  • I can’t debug my C# code!
    Debugging the Blazor code running in the browser isn’t fully supported yet, but you’ll still be able to debug the rest of the application normally (assuming your server side is using ASP.NET Core).
  • I can’t see my C# in the browser!
    C# code in Blazor is compiled and shipped to the browser to run on WebAssembly, so the original C# source isn’t visible in the browser’s developer tools. However, you can still see the code in Chrome through remote debugging. Follow these steps.
  • Why isn’t my new route working?
    Most of the time you’ll need to rebuild the application to pick up new routes during development. Other causes might be naming problems or a problem with the route parameter types.
  • Everything seems to be loading slowly
    This can be caused by multiple issues, some of which are not Blazor-specific. For the Blazor-specific ones, it differs between server and client: every interaction in a server-side Blazor app requires a network call to the server, which costs performance, while a client-side Blazor app has a long initial load time and then performs better afterward.
  • I’m seeing a blank page and I set everything up correctly!
    This is a specific one that I ran into when first using the templates in Visual Studio 2019. The solution was making sure I had the right .NET Core SDK installed. You can have the wrong version and still create a Blazor website with no errors, at least until the app starts running. You can install the latest version of the .NET Core SDK here.
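As a sketch of the route-parameter workaround mentioned above (the route, component, and property names are hypothetical), accept the raw value as a string and parse it with the culture you actually want:

@page "/orders/{OrderDateText}"
@using System.Globalization

<p>Orders for @(parsedDate?.ToShortDateString() ?? "an unrecognized date")</p>

@functions {
    // Route values are matched with the invariant culture, so take the value as a string...
    [Parameter] string OrderDateText { get; set; }

    DateTime? parsedDate;

    protected override void OnParametersSet()
    {
        // ...and validate/parse it yourself with the culture you care about.
        if (DateTime.TryParse(OrderDateText, CultureInfo.CurrentCulture,
                              DateTimeStyles.None, out var value))
        {
            parsedDate = value;
        }
    }
}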

Online Resources

by JP Roberts III 

As of the writing of this blog post, Blazor is still a new framework, and as such, is still changing rapidly. Pluralsight doesn’t have any courses covering Blazor, Udemy only has a couple of short videos, and Microsoft’s Learning site has no specific courses dedicated to Blazor.

However, there are several websites that have a good deal of information and samples for developers.

YouTube also has several informative videos on Blazor, such as a two-part series on the Microsoft Visual Studio channel: Blazor – Part 1 and Blazor – Part 2.

In this episode of the Azure Government video series, Steve Michelotti sits down with AIS’ very own Vishwas Lele to discuss migrating and modernizing with Kubernetes on Azure Government. You’ll learn about the traditional approaches for migrating workloads to the cloud, including:

1. Rehost
2. Refactor
3. Reimagine

You will also learn how Kubernetes provides an opportunity to fundamentally rethink these traditional approaches to cloud migration by leveraging Kubernetes in order to get the “best of all worlds” in the migration journey. If you’re looking to migrate your existing legacy workloads to the cloud, while minimizing code changes and taking advantage of innovative cloud-native technologies, this is the video you should watch!

WORK WITH THE BRIGHTEST LEADERS IN SOFTWARE DEVELOPMENT

Meet some of the AIS Recruiting Team – They’re going to talk you through some of their top recommended job interview tips.

(Transcript)

My name is Francesca Hawk. My name is Rana Shahbazi. My name is Kathleen McGurk. My name is Jenny Wan. My name is Denise Kim.

Tip #1: Be Open, Transparent & Direct

I think it’s important for candidates to be authentic and transparent throughout the entire interview process.

Keeping the line of communication open through the interview process is really important for both sides. If you have other opportunities on the table, say that. The recruiters are your advocates and, in essence, kind of your best friend. Be direct and give us enough feedback: if you are not interested, if the commute is an issue, if you want more money, or if your clearance was an issue, just let us know.

Tip #2: Know What You Want

So before even searching for opportunities, you have to figure out what you’re looking for in a company. And then once you figure that out, whether it’s the culture of the company or the location of the company, definitely ask the recruiter questions prior to the interview, so while you’re at the interview you have a little bit of that information.

Tip #3: Be On Time & Be Prepared

You always want to make sure you’re on time. Generally, you want to arrive about 15 minutes before your interview. Know where you’re going to park, and make sure that you look up directions ahead of time. And just be prepared in general.

Preparation is extremely underrated in the interview process, so really do your research and get familiar with the company and the culture there. Go online. Check out, you know, the general website, check out the job description. Make sure you’re aware of the skills and qualifications and what they’re really looking for. Glassdoor always provides really good reviews from current employees. I think the company website and certainly LinkedIn are a huge aspect – social media in general.

Tip #4: Ask Questions

Ask questions, or have questions ready to ask us. Ask about the process, ask about the expectations, who you’ll potentially be meeting with, and what the potential duration could be. The company can’t provide information unless you ask for it.

You also have to make sure that you are interviewing the company just as much as they’re interviewing you. Ask the interviewers about the culture, because you’re going to get a different response from everybody, but if they all seem to check out or are the same, then that means the culture is pretty good.

Just make sure that you feel comfortable with the environments that, you know, you’re going to be working in.

Tip #5: Make Sure You Understand the Role

Really use the opportunity to understand the position and then to sell your strengths and also kind of tie it back into your accomplishments.

Make sure that you talk about what you were individually able to accomplish in a project. So, what you were personally able to bring to the table, and not necessarily what the team accomplished as a whole.

Tip #6: Show Your Interest

I think your presentation and the way you present yourself to the interviewers, and to anybody that you interact with in the interview process, is extremely important.

So not just what you say, but how you say it. Eye contact and body language say a lot about your interest in the position and the company as a whole.

Showing your interest makes a recruiter feel that you’re confident and that you can certainly do the role, and also that you are excited about this opportunity.

I think you should be excited about interviewing with a company that you’re interested in. That sounds silly, but going in excited matters, and I think that’s why body language and eye contact are all very important aspects.

Tip #7: Listen

People are so busy thinking about what they’re going to say next that they don’t actually pay attention to the questions being asked.

So making sure that you’re hearing what they’re saying and then taking the time to respond is really important.

Tip #8: Follow Up

Certainly, you know, asking about next steps is very helpful, and it is also another way of expressing your interest. Definitely be responsive; I would say the general rule of thumb is a turnaround time of within 12 hours. If you’re not interested, or AIS or this opportunity is not your number one, that’s okay; we like to know that as well.

You definitely want to send a thank you note – it goes a long way and it shows you’re very interested in the company and it always leaves a great impression.

We’re Hiring!

AIS is always looking to connect with talented technologists that are passionate about learning and growing to staff exciting new projects for our commercial and federal clients. If you’re interested in working at AIS, check out our current career openings.

We’re proud to announce that AIS has successfully renewed all six of our Microsoft Gold Partner competencies for 2019. AIS has been consistently recognized as a Microsoft Gold Partner for many years now, and we’re currently distinguished at the Gold level for:

    • DevOps
    • Cloud Platform
    • Cloud Productivity
    • Application Development
    • Application Integration
    • Collaboration and Content

Microsoft Gold Partner Logo

The Microsoft Partner Program: Defining the Levels of Excellence

Each of these achievements is an important benchmark in the competitive world of Microsoft technology partners. Every year, Microsoft evaluates our staff, our project history, and our customer references. A single Gold competency requires multiple employees holding Microsoft Certified Professional (MCP) certifications, five in-depth customer references, numerous developer exams passed, and other objectives met.

We’re proud that over 70% of our staff maintains relevant certifications, validating our knowledge and expertise and allowing us to reach the Gold level across so many areas of our business. Congrats to the entire AIS team for once again bringing home the Gold!

Interested in learning more about our involvement as a certified Microsoft Gold Partner? Click here to get in touch with a solutions executive or give us a call today at 703-860-7800.

Calling all developers, tech professionals, and IT and business leaders! February 4-5, 2019, Microsoft is hosting the Ignite the Tour DC event in Washington, D.C. at the Walter E. Washington Convention Center.

This event is government-focused, delivering 100+ deep-dive sessions and workshops from over 350 professionals to help you meet your mission. The event is FREE, but you will need a ticket. (Note, this is currently sold out, but you can join the waitlist here.)

About the Session: Migrate and Modernize with Kubernetes in Azure Government

CTO Vishwas Lele will be joined by Microsoft’s Steve Michelotti to present on the topic “Migrate and Modernize with Kubernetes in Azure Government” on Tuesday, February 5, 2019, from 12:50 PM to 1:50 PM.

If you are overwhelmed by the daunting prospects of migrating your on-premises workloads to the cloud, confused by what approaches to take, or torn between doing a lift-and-shift to the cloud versus modernizing your architectures — this session is for you!

During this session, we’ll show you how you can use cloud-native technologies to migrate your workloads to Azure and realize significant cost savings with minimal code changes, moving your organization a step closer to modernization. The presentation will be demo-heavy, giving you an inside look at using Kubernetes to migrate your workloads to Azure Government.

Stop by Booth #58 to See Us

We hope to see you there! You can find us at Booth #58, the closest one to the “Fun Lounge.” Already attending? Let us know you’re coming — we can schedule some time to talk.

Last week we laid out some basics of what we call the “Full PaaS” approach to legacy app modernization. While it might not make sense in every situation, we recently completed a modernization effort using the Full PaaS approach. Here’s some background and the steps we took…

Stop Playing Legacy App “Whack-a-Mole” 

Our enterprise customer developed and owned a budgeting application. The application was over five years old and built on tech that — while modern at the time — had become “stale” over the years.  Usage patterns for the application included huge spikes in demand during specific times of the year, and the need to meet these demands prompted the team to “reactively” invest in servers with more memory, better networking equipment, and other fixes. Problems were only addressed as they cropped up, with no time for long-term planning.

Yet despite throwing money and equipment at those problems, the issues with the platform continued, while customers were demanding more functionality. Unfortunately, since most of the application team’s time was spent reacting to operating issues, that simply couldn’t happen. Additionally, the application team’s O&M budget shrank over time, leaving a smaller staff responsible for the application.

After analyzing the application, we determined that three tiers of the application (compute, cache, database) could be moved to the “as-a-service” model with a reasonable amount of refactoring, for two main reasons:

  • This model would address the seasonal demand challenges by leveraging “auto-scaling” capabilities built directly into the services the application would now be consuming.
  • As an added benefit, these services allowed scripted deployments, automated monitoring, and easier provisioning of “test” environments to get new features in the hands of users more quickly.

The 7 Steps to “Full-PaaS” Modernization

Once we chose the Full PaaS approach, we completed the modernization effort by following these seven (high level) steps:

  1. Analyze application dependencies: This includes compute and data tiers, software architecture, reliance on existing resources.
  2. Identify services to replace legacy components: Not everything will port directly, so map out your replacements ahead of time.
  3. Establish candidate PaaS architecture: Choose your cloud platform, specific services to be used, and architect the flow of communication between the services.
  4. Validate with internal stakeholders: We talk to everyone from operations staff to security to business users of the application.
  5. Refactor your application code: Target the PaaS services included in the candidate architecture.
  6. Automate your application delivery and integrate CI/CD: This is one of the biggest benefits to this modernization approach, so take full advantage of it!
  7. Establish a living roadmap for ongoing improvement: We’re looking for both ongoing improvements to delivery automation and for any additional applications that can also adopt the model.

FREE HALF DAY SESSION: APP MODERNIZATION APPROACHES & BEST PRACTICES
Transform your business into a modern enterprise that engages customers, supports innovation, and has a competitive advantage, all while cutting costs with cloud-based app modernization.

If you’re looking to modernize a legacy application, there are quite a few paths and approaches to choose from. Today, let’s look at “Full PaaS” modernization, which re-architects legacy applications to target cloud-native “serverless” technologies wherever possible.

This can solve many of the most common challenges organizations face when dealing with legacy applications:

  • The need to provide modern capabilities and innovate faster with limited resources
  • Your existing infrastructure is expensive and difficult to provision, maintain, scale, and secure
  • Your existing staff could improve efficiency by focusing on their strengths
  • Your customers expect evolving, innovative functionality that relies on expensive, complex underlying technology

Why “Full PaaS” Modernization Is Different

Going the “Full PaaS” route allows your company to take advantage of the “best of the best” that the public cloud has to offer, including:

  • The responsibility for delivering platform quality shifts from IT staff to industry experts (uptime, security, etc.)
  • Managed offerings for all application components provide peace of mind and zero day-to-day maintenance.
  • You can immediately and automatically leverage the innovation and improvements being applied (almost constantly) to the underlying cloud platform.

Once your application is migrated, you’ll enjoy vastly improved speed, flexibility and agility. Modern PaaS platforms offer opportunities to automate and extend delivery processes as a core feature of the service, which creates a lower barrier to entry to incorporate modern, innovative technology for improvement of your software products.

And of course, we can’t talk about moving applications to the cloud without mentioning the cost savings and lower total cost of ownership. You’ll eliminate the significant effort required to build, maintain, and evolve the “foundation” of your application’s infrastructure (servers, networking equipment, monitoring stack, data backup and disaster recovery, etc.). This, in turn, lets you focus your development resources on core competencies and avoid inefficiencies related to effort not directly focused on improving the quality of your software products.

Sounds Great! Any Drawbacks? 

Well, yes. There are a few things to consider before taking this approach:

  • A more significant application refactoring or re-architecture could be required, which can mean a more significant up-front investment.
  • More potential for vendor lock-in to a specific cloud provider than with other modernization approaches.
  • Existing staff may need to invest in modernizing existing skillsets and deeply ingrained ideas about “the way things are done” – instead emphasizing “software-defined” principles (networking, automation, monitoring).

This approach targets the highest level of maturity for cloud adoption, where you’re consuming cloud native features as a service to provide all of the building blocks for your application.  This won’t fit right away for all situations, as organizations balance an application’s internal dependencies with the need to modernize.  However, this approach can provide significant benefit for legacy application teams given the right circumstances.


So Is “Full PaaS” the Right Approach For Me?

AIS looks for the following characteristics when evaluating this approach:

  • The team is willing to invest a bit more time and money up front to modernize in return for the benefits listed above.
  • The team has significant challenges managing infrastructure which aren’t fully addressed by more basic lift-and-shift or “containerization” app modernization approaches. In many cases this comes from a slow erosion of operations and maintenance (O&M) staff over time, leaving developers responsible for all portions of development and delivery.
  • The team is interested in providing evolving features, but is constrained by the lack of innovation on the current platform.
  • The team releases updates/features too infrequently and is under pressure to improve the “cycle time.”

Next week, we’ll take a look at the specific steps involved in the “Full PaaS” modernization approach, and share an example of a successful legacy app modernization project the AIS Team recently completed.

In support of National Cybersecurity Awareness Month, you’re invited to join us for a very special edition of the Microsoft Azure Government Meetup!

On October 24th, the Women Leading Government Cybersecurity Meetup will feature an exciting panel of experienced government cybersecurity professionals. ALL are welcome to attend this FREE event for networking, refreshments and fascinating insights and discussions on:

  • Today’s cybersecurity landscape in government and top priorities
  • Best practices in cybersecurity along with challenges and lessons learned
  • Future of cybersecurity in government and the cyber workforce

The evening will wrap up with a security demo along with Q&A.

This Meetup is presented in partnership with the Women in Technology D.C. Chapter of the International Association of Microsoft Channel Partners, a community of local professionals that believe in making it easier for women to imagine, begin and develop a career in IT. We’re very excited about this one, so check out the full agenda, speaker bios and RSVP here!

If you’re not in the D.C. metro area, you can join us via livestream on Oct. 24 starting at 6:35 p.m. at aka.ms/azuregovmeetuplive. And be sure to join our conversation on Twitter using #AzureGovMeetup.

FREE ONE-HOUR RISK CONSULTATION
If disaster strikes, would your organization survive? Let AIS help you plan for the unknown.

AIS’ work with the NFL Players Association (NFLPA) was showcased as a Microsoft Featured Case Study. This customer success story was our most recent project with NFLPA, as they’ve sought our help to modernize multiple IT systems and applications over the years. We were proud to tackle the latest challenge: Creating a single, shared player management system, using Dynamics 365, for the NFLPA and all its sister organizations.

The Challenge

As the nonprofit union for NFL players, the NFLPA constantly looks for ways to better serve its members—current and former NFL players—during and after their football careers. But multiple player management systems across the associated support organizations resulted in poor customer service and missed opportunities for NFLPA members. Valuable data captured by one department wasn’t accessible to another, causing headaches and delays when licensing opportunities arose and limiting the organization’s ability to be proactive about the challenges members face after retirement.

The Solution: A Single Source

We used Microsoft Dynamics 365 to create a single, shared player management system, called PA.NET, for all the NFLPA organizations. We customized Dynamics 365 extensively to meet the unique needs of the NFLPA and integrated it with the organization’s Office 365 applications.

At the same time, we shifted all legacy IT systems (websites, financial applications, and others) to Microsoft Azure, giving NFLPA an entirely cloud-based business.

The Results: More Opportunities, More Time, Fewer Costs

With one master set of player data and powerful reporting tools that employees use to find answers to their own questions, the NFLPA can uncover marketing and licensing opportunities for more players and identify other ways to help its members.

Because PA.NET automates so many previously manual processes, it frees up hours of drudge work each week for NFLPA employees, which they convert to creative problem solving for members. And its IT staff has freed up 30 percent more time by not having to babysit infrastructure, time it uses to come up with new technology innovations.

By moving its business systems to the cloud, the NFLPA can scale its infrastructure instantly when traffic spikes—such as when football season ends and licensing offers heat up. No more over-provisioning servers to meet worst-case needs. In fact, no more servers, period. With cloud-based systems, the NFLPA no longer has to refresh six-figure server and storage systems every few years.

Read the full Microsoft Featured Case Study here to learn more about our work and more about great work the NFLPA does on behalf of its members.

SCORE LIKE NFLPA. WORK WITH AIS. Transformation is on the horizon for your organization. All it takes is the right partner. With the experience, talent, and best practices to lead you to success, AIS is the right partner for you.