The business intelligence, automation, and enterprise application landscape is changing dramatically.

In the previous incarnation of enterprise technology, line-of-business owners were forced to choose between pre-baked commercial off-the-shelf (COTS) software, which was difficult to customize and often did not truly meet the business’s unique needs, and custom solutions that (though flexible and often tailor-made to the business needs of the moment) cost more and were far riskier to develop and deploy.

Furthermore, certain classes of applications do not have a COTS answer, nor do they justify the cost of custom software development. In the chasm between the two arose a generation of quasi-apps: the homegrown Excel spreadsheets, Access databases, Google docs, and all manner of other back-of-the-napkin “systems.” End users developed these quasi-apps to fill the gaps between the big software IT provided and what users actually needed to do their jobs.

We’ve all been there: the massive spreadsheet that tracked a decade’s worth of employee travel but was always one accidental click away from oblivion; the quirky asset management database living on your officemate’s desktop (and still named after an employee who left the company five years ago); the SharePoint site full of sensitive HR data; the shared network drive that had long been “shared” a bit too liberally. A generation of do-it-yourself workers grew up living on the edge of catastrophe with their quasi-apps.

Thankfully, three trends have converged to shatter this paradigm in 2019, fundamentally changing the relationship between business users, technologists, and their technology.

Connectivity of Everything

The new generation of business applications is hyper-connected, allowing connections between business functions previously considered siloed, unrelated, or simply impractical to link: travel plans set in motion by human resources decisions, medical procedures scheduled based on a combination of lab results and provider availability, and employee recruiting driven by sales and contracts.

Citizens’ Uprising

Business users long settled for spreadsheets and SharePoint, but new “low-code/no-code” tools empower these “citizen creators” to build professional-grade apps on their own. Airport baggage screeners can develop mobile apps that cut down on paperwork, trainers and facilitators can put interactive tools in the hands of their students, and analysts and researchers are no longer dependent on developers to “pull data” and create stunning visualizations.

New Ways of Looking at the World (& Your Data)

This isn’t just about business intelligence (BI) and data visualization tools far outpacing anything else that was recently available. It’s not even just about business users’ ability to harness and extend those tools. This is about the ability of tools like Microsoft Power BI to splice together, beautifully visualize, and help users interpret data that their organizations already own — data to which you’ve connected using one of the hundreds of native connectors to third-party services, and data generated every second of every minute of every day from the connected devices that enable the organization’s work.

It’s an exciting time. I explore these trends further, along with how Microsoft’s Power Platform has become the go-to platform for organizations mastering the new landscape, in my whitepaper, Microsoft’s Power Platform and the Future of Business Applications. We’re way past CRM. I hope you’ll read it and share your thoughts with me!

Accurately identifying and authenticating users is an essential requirement for any modern application. As modern applications continue to migrate beyond the physical boundaries of the data center and into the cloud, balancing the ability to leverage trusted identity stores with the need for enhanced flexibility to support this migration can be tricky. Additionally, evolving requirements like allowing multiple partners, authenticating across devices, or supporting new identity sources push application teams to embrace modern authentication protocols.

Microsoft states that federated identity is the ability to “Delegate authentication to an external identity provider. This can simplify development, minimize the requirement for user administration, and improve the user experience of the application.”

As organizations expand their user base to allow authentication of multiple users, partners, and collaborators in their systems, federated identity becomes imperative.

The Benefits of Federated Authentication

Federated authentication allows organizations to reliably outsource their authentication mechanism, letting them focus on actually providing their service instead of spending time and effort on authentication infrastructure. An organization or service that provides authentication to other systems is called an identity provider; it provides federated identity authentication to the service provider/relying party. By using a common identity provider, relying applications can easily access other applications and web sites using single sign-on (SSO).

SSO gives users quick access to multiple web sites without the need to manage individual passwords. The relying-party application (service provider) communicates with the identity provider to get user claims (claims authentication).

For example, an application registered in Azure Active Directory (AAD) relies on AAD as its identity provider. Users accessing the application are prompted for their credentials, and upon successful authentication by AAD, access tokens are sent to the application. The validated claims token authenticates the user, and the application performs any further authorization. The application needs no additional authentication mechanism of its own, thanks to the federated authentication from AAD. The authentication process can be combined with multi-factor authentication (MFA) as well.
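
To make this concrete, here is a minimal sketch of how an ASP.NET application might read the validated claims after federated sign-in. The claim type "name" is illustrative; the actual claim types depend on the tokens your identity provider issues:

using System.Security.Claims;

public static class ClaimsInspector
{
    // After federated sign-in, the middleware has already validated the token;
    // the application only needs to read claims from the authenticated principal.
    public static string GetDisplayName(ClaimsPrincipal user)
    {
        if (user == null || user.Identity == null || !user.Identity.IsAuthenticated)
        {
            return null; // not signed in yet; the middleware will redirect to the IdP
        }

        // "name" is an illustrative claim type; inspect the token for the real ones
        Claim nameClaim = user.FindFirst("name");
        return nameClaim != null ? nameClaim.Value : user.Identity.Name;
    }
}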

Glossary

  • STS: Security Token Service
  • IdP: Identity Provider
  • SP: Service Provider
  • POC: Proof of Concept
  • SAML: Security Assertion Markup Language
  • RP: Relying Party (same as service provider); calls the identity provider to get tokens
  • AAD: Azure Active Directory
  • ADDS: Active Directory Domain Services
  • ADFS: Active Directory Federation Services
  • OWIN: Open Web Interface for .NET
  • SSO: Single Sign-On
  • MFA: Multi-Factor Authentication

OpenID Connect/OAuth 2.0 & SAML

SAML and OpenID Connect/OAuth are the two main protocols that modern applications implement and consume as a service to authenticate their users. Both provide a framework for implementing SSO/federated authentication. OpenID Connect is an open standard for authentication and combines with OAuth 2.0 for authorization; SAML is also an open standard and provides both authentication and authorization. OpenID Connect tokens are JSON-based, and OAuth 2.0 tokens can be either JSON or SAML2, whereas SAML is XML-based. OpenID Connect/OAuth are best suited for consumer applications like mobile apps, while SAML is preferred for enterprise-wide SSO implementations.

Microsoft Azure Cloud Identity Providers

The Microsoft Azure cloud provides numerous authentication methods for cloud-hosted and “hybrid” on-premises applications, with options for either OpenID Connect/OAuth or SAML authentication. The identity solutions include Azure Active Directory (AAD), Azure AD B2C, Azure AD B2B, Azure AD pass-through authentication, Active Directory Federation Services (ADFS), migration of on-premises ADFS applications to Azure, and Azure AD Connect with federation and SAML as the IdP.

Widely used identity providers that implement the SAML 2.0 standard include Azure Active Directory (AAD), Okta, OneLogin, PingOne, and Shibboleth.

A Deep Dive Implementation

This blog post walks through an example I recently worked on using federated authentication with the SAML protocol. I was able to dive deep into identity and authentication through an assigned proof of concept (POC): creating a claims-aware application within an ASP.NET Azure Web Application using federated authentication and the SAML protocol. I used OWIN middleware to connect to the identity provider.

The scope of the POC was not to develop an Identity Provider/Security Token Service (STS), but to develop a Service Provider/Relying Party (RP) that sends a SAML request and receives SAML tokens/assertions. The calling application then uses the SAML tokens to authorize the user into the application.

Given that scope, I used a stub identity provider so that the authentication implementation could later be plugged into a production application and communicate with other enterprise SAML identity providers.

The Approach

For an application to be claims-aware, it needs to obtain a claims token from an identity provider. The claims contained in the token are then used for additional authorization in the application. Claims tokens are issued by an identity provider after it authenticates the user. The login page for the application (where the user signs in) can be the service provider (relying party) itself, or just an ASP.NET UI application that communicates with the service provider via a separate implementation.

Figure 1: Overall architecture – Identity Provider Implementation


The Implementation

An ASP.NET MVC application was implemented as the SAML service provider, with OWIN middleware used to initiate the connection with the SAML identity provider.

First, communication is initiated with a SAML request from the service provider. The identity provider validates the SAML request, verifies and authenticates the user, and sends back the SAML tokens/assertions. The claims returned to the service provider are then sent back to the client application. Finally, the client application can authorize the user based on roles or other more refined permissions after reviewing the claims returned from the SAML identity provider.

Sustainsys.Saml2 is an open-source solution whose SAML2 libraries add SAML2P support to ASP.NET web sites and serve as the SAML2 service provider (SP). For the proof-of-concept effort, I used the Sustainsys stub SAML identity provider to test the SAML service provider. Sustainsys also provides sample service provider implementations that work against the stub.

Implementation steps:

  • Start with an ASP.NET MVC application.
  • Add NuGet packages for the OWIN middleware and the Sustainsys SAML2 libraries to the project (Figure 2).
  • Modify Startup.cs (partial classes) to build the SAML request and set all authentication types, such as cookies, default sign-in, and Saml2 (Listings 1 and 2).
  • In the CreateSaml2Options and CreateSPOptions methods, the SAML request is built with the private and public certificates, the federation SAML identity provider URL, etc.
  • The service provider establishes the connection to the identity provider on startup and is then ready to listen for client requests.
  • Cookie authentication is enabled, the default authentication type is set to “Application,” and the SAML authentication request is configured by forming the SAML request.
  • Once the SAML request options are set, the identity provider is instantiated with its URL and options, and federation is set to true. The service provider is instantiated with the SAML request options pointing at the SAML identity provider. When the user signs in, the OWIN middleware issues a challenge to the identity provider and receives the SAML response (claims/assertions) back at the service provider.
  • The OWIN middleware issues the challenge to the SAML identity provider with a callback method (ExternalLoginCallback(…)). The identity provider redirects to that callback method after authenticating the user (Listing 3).
  • AuthenticateAsync returns the claims from the identity provider, and the user is authenticated at this point. The application can use the claims to authorize the user.
  • No additional web configuration is needed for communication with the SAML identity provider, but the application’s config values can be persisted in web.config (Listing 4).

Figure 2: OWIN Middleware NuGet Packages


Listing 1: Startup.cs (Partial)

using Microsoft.Owin;
using Owin;

[assembly: OwinStartup(typeof(Claims_MVC_SAML_OWIN_SustainSys.Startup))]

namespace Claims_MVC_SAML_OWIN_SustainSys
{
    public partial class Startup
    {
        public void Configuration(IAppBuilder app)
        {
            ConfigureAuth(app);
        }
    }
}

Listing 2: Startup.cs (Partial)

using Microsoft.Owin;
using Microsoft.Owin.Security;
using Microsoft.Owin.Security.Cookies;
using Owin;
using Sustainsys.Saml2;
using Sustainsys.Saml2.Configuration;
using Sustainsys.Saml2.Metadata;
using Sustainsys.Saml2.Owin;
using Sustainsys.Saml2.WebSso;
using System;
using System.Configuration;
using System.Globalization;
using System.IdentityModel.Metadata;
using System.Security.Cryptography.X509Certificates;
using System.Web.Hosting;

namespace Claims_MVC_SAML_OWIN_SustainSys
{
    public partial class Startup
    {
        public void ConfigureAuth(IAppBuilder app)
        {            
            // Enable Application Sign In Cookie
            var cookieOptions = new CookieAuthenticationOptions
            {
                LoginPath = new PathString("/Account/Login"),
                AuthenticationType = "Application",
                AuthenticationMode = AuthenticationMode.Passive
            };

            app.UseCookieAuthentication(cookieOptions);

            app.SetDefaultSignInAsAuthenticationType(cookieOptions.AuthenticationType);

            app.UseSaml2Authentication(CreateSaml2Options());
        }

        private static Saml2AuthenticationOptions CreateSaml2Options()
        {
            string samlIdpUrl = ConfigurationManager.AppSettings["SAML_IDP_URL"];
            string x509FileNamePath = ConfigurationManager.AppSettings["x509_File_Path"];

            var spOptions = CreateSPOptions();
            var Saml2Options = new Saml2AuthenticationOptions(false)
            {
                SPOptions = spOptions
            };

            var idp = new IdentityProvider(new EntityId(samlIdpUrl + "Metadata"), spOptions)
            {
                AllowUnsolicitedAuthnResponse = true,
                Binding = Saml2BindingType.HttpRedirect,
                SingleSignOnServiceUrl = new Uri(samlIdpUrl)
            };

            idp.SigningKeys.AddConfiguredKey(
                new X509Certificate2(HostingEnvironment.MapPath(x509FileNamePath)));

            Saml2Options.IdentityProviders.Add(idp);
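            // Per the Sustainsys API, the Federation constructor registers itself with
            // Saml2Options, so the new instance does not need to be assigned to a variable.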
            new Federation(samlIdpUrl + "Federation", true, Saml2Options);

            return Saml2Options;
        }

        private static SPOptions CreateSPOptions()
        {
            string entityID = ConfigurationManager.AppSettings["Entity_ID"];
            string serviceProviderReturnUrl = ConfigurationManager.AppSettings["ServiceProvider_Return_URL"];
            string pfxFilePath = ConfigurationManager.AppSettings["Private_Key_File_Path"];
            string samlIdpOrgName = ConfigurationManager.AppSettings["SAML_IDP_Org_Name"];
            string samlIdpOrgDisplayName = ConfigurationManager.AppSettings["SAML_IDP_Org_Display_Name"];

            var swedish = CultureInfo.GetCultureInfo("sv-se");
            var organization = new Organization();
            organization.Names.Add(new LocalizedName(samlIdpOrgName, swedish));
            organization.DisplayNames.Add(new LocalizedName(samlIdpOrgDisplayName, swedish));
            organization.Urls.Add(new LocalizedUri(new Uri("http://www.Sustainsys.se"), swedish));

            var spOptions = new SPOptions
            {
                EntityId = new EntityId(entityID),
                ReturnUrl = new Uri(serviceProviderReturnUrl),
                Organization = organization
            };
        
            var attributeConsumingService = new AttributeConsumingService("Saml2")
            {
                IsDefault = true,
            };

            attributeConsumingService.RequestedAttributes.Add(
                new RequestedAttribute("urn:someName")
                {
                    FriendlyName = "Some Name",
                    IsRequired = true,
                    NameFormat = RequestedAttribute.AttributeNameFormatUri
                });

            attributeConsumingService.RequestedAttributes.Add(
                new RequestedAttribute("Minimal"));

            spOptions.AttributeConsumingServices.Add(attributeConsumingService);

            spOptions.ServiceCertificates.Add(new X509Certificate2(
                AppDomain.CurrentDomain.SetupInformation.ApplicationBase + pfxFilePath));

            return spOptions;
        }
    }
}

Listing 3: AccountController.cs

using Claims_MVC_SAML_OWIN_SustainSys.Models;
using Microsoft.Owin.Security;
using System.Security.Claims;
using System.Text;
using System.Web;
using System.Web.Mvc;

namespace Claims_MVC_SAML_OWIN_SustainSys.Controllers
{
    [Authorize]
    public class AccountController : Controller
    {
        public AccountController()
        {
        }

        [AllowAnonymous]
        public ActionResult Login(string returnUrl)
        {
            ViewBag.ReturnUrl = returnUrl;
            return View();
        }

        //
        // POST: /Account/ExternalLogin
        [HttpPost]
        [AllowAnonymous]
        [ValidateAntiForgeryToken]
        public ActionResult ExternalLogin(string provider, string returnUrl)
        {
            // Request a redirect to the external login provider
            return new ChallengeResult(provider, Url.Action("ExternalLoginCallback", "Account", new { ReturnUrl = returnUrl }));
        }

        // GET: /Account/ExternalLoginCallback
        [AllowAnonymous]
        public ActionResult ExternalLoginCallback(string returnUrl)
        {
            // AuthenticateAsync returns the identity (with claims) issued by the identity provider
            var loginInfo = AuthenticationManager.AuthenticateAsync("Application").Result;
            if (loginInfo == null)
            {
                return RedirectToAction("Login");
            }

            //Loop through to get claims for logged in user
            StringBuilder sb = new StringBuilder();
            foreach (Claim cl in loginInfo.Identity.Claims)
            {
                sb.AppendLine("Issuer: " + cl.Issuer);
                sb.AppendLine("Subject: " + cl.Subject.Name);
                sb.AppendLine("Type: " + cl.Type);
                sb.AppendLine("Value: " + cl.Value);
                sb.AppendLine();
            }
            ViewBag.CurrentUserClaims = sb.ToString();
            
            //Note: Thread.CurrentPrincipal is not populated with the SAML claims here;
            //use the Identity returned from AuthenticateAsync above instead.
            //var identity = (ClaimsPrincipal)Thread.CurrentPrincipal;
            //var claims = identity.Claims;
            //string nameClaimValue = User.Identity.Name;
            //IEnumerable<Claim> claimss = ClaimsPrincipal.Current.Claims;
          
            return View("Login", new ExternalLoginConfirmationViewModel { Email = loginInfo.Identity.Name });
        }

        // Used for XSRF protection when adding external logins
        private const string XsrfKey = "XsrfId";

        private IAuthenticationManager AuthenticationManager
        {
            get
            {
                return HttpContext.GetOwinContext().Authentication;
            }
        }
        internal class ChallengeResult : HttpUnauthorizedResult
        {
            public ChallengeResult(string provider, string redirectUri)
                : this(provider, redirectUri, null)
            {
            }

            public ChallengeResult(string provider, string redirectUri, string userId)
            {
                LoginProvider = provider;
                RedirectUri = redirectUri;
                UserId = userId;
            }

            public string LoginProvider { get; set; }
            public string RedirectUri { get; set; }
            public string UserId { get; set; }

            public override void ExecuteResult(ControllerContext context)
            {
                var properties = new AuthenticationProperties { RedirectUri = RedirectUri };
                if (UserId != null)
                {
                    properties.Dictionary[XsrfKey] = UserId;
                }
                context.HttpContext.GetOwinContext().Authentication.Challenge(properties, LoginProvider);
            }
        }
    }
}

Listing 4: Web.Config

<?xml version="1.0" encoding="utf-8"?>
<!--
  For more information on how to configure your ASP.NET application, please visit
  https://go.microsoft.com/fwlink/?LinkId=301880
  -->
<configuration>
  <appSettings>
    <add key="webpages:Version" value="3.0.0.0" />
    <add key="webpages:Enabled" value="false" />
    <add key="ClientValidationEnabled" value="true" />
    <add key="UnobtrusiveJavaScriptEnabled" value="true" />
    <add key="SAML_IDP_URL" value="http://localhost:52071/" />
    <add key="x509_File_Path" value="~/App_Data/stubidp.sustainsys.com.cer"/>
    <add key="Private_Key_File_Path" value="/App_Data/Sustainsys.Saml2.Tests.pfx"/>
    <add key="Entity_ID" value="http://localhost:57234/Saml2"/>
    <add key="ServiceProvider_Return_URL" value="http://localhost:57234/Account/ExternalLoginCallback"/>
    <add key="SAML_IDP_Org_Name" value="Sustainsys"/>
    <add key="SAML_IDP_Org_Display_Name" value="Sustainsys AB"/>
  </appSettings>
</configuration>

[Screenshot: Claims returned from the identity provider to the service provider]


Azure Web Apps Background

I’ve been working with Azure Web Apps for a long time. Before the launch of Azure Web Apps for Containers (or even Azure Web App on Linux), these web apps ran on Windows virtual machines managed by Microsoft. This meant that any workload running behind IIS (e.g., ASP.NET) would run without hiccups, but that was not the case with workloads that preferred Linux over Windows (e.g., Drupal).

Furthermore, the Azure Web Apps that ran on Windows were not customizable. This meant that if your website required a custom tool to work properly, chances are it was not going to work on an Azure Web App, and you’d need to deploy a full-blown IaaS virtual machine. There was also a strict lockdown on tools and language runtime versions that you couldn’t change. So if you wanted the latest bleeding-edge language runtime, you weren’t going to get it.

Azure Web Apps for Containers: Drum Roll

Last year, Microsoft released the Azure Web Apps for Containers (Linux App Service plan) offering to the public. This meant we could build a custom Docker image containing all the binaries and files, and then deploy it on the PaaS offering. After working with the product for some time, I was sold.

The product was excellent, and it was clear that it had potential. Some of the benefits:

  • Ability to use a custom Docker image to run the Web App
  • Zero headaches from managing Docker containers
  • The benefits of Azure Web App on Windows like Backups, Kudu, Deployment Slots, Autoscaling (Scale up & Scale out), etc.

Suddenly, running workloads that preferred Linux or required custom binaries became extremely easy.

The Architecture

Compared to Azure Web App on Windows, the architecture implemented in Azure Web App for Containers is different.

[Diagram: Azure Web App on Windows architecture]

Each of the above web apps is strictly locked down, with minimal possibility of modification. Furthermore, the backend storage is based on network file shares, which means that even if you don’t need any storage (as in cases where your app simply reads data from a database and displays it back), the app would still perform slowly.

[Diagram: Azure Web App for Containers architecture]

The major difference is that the Kudu/SCM site runs in a separate container from the actual web app. Both containers are connected to each other with a private network. In this case, each App Service Plan is deployed on a separate Virtual Machine and all the plumbing is managed by Microsoft. The benefits of this approach are:

  • Better isolation. If your Kudu is experiencing issues, it reduces the chance of taking down your actual website.
  • Ability to customize the actual web app container running the website.
  • Better resource utilization

Stay tuned for the next part, in which I’ll discuss the various storage options available in Azure Web App for Containers and their trade-offs.

Happy holidays!

Did you know you can build an intelligent Twitter bot and run it for just pennies a month using Azure Logic Apps and Function Apps, coupled with Microsoft’s Language Understanding Intelligent Service (LUIS)? LUIS can “read” a tweet and determine its sentiment with a little help from you. Run selected tweets through your LUIS app to determine their meaning, then use that meaning to create a personalized tweet back at the original.

Here’s how…

Step One: Select a Twitter Query

Use Twitter’s advanced search tools to craft a query that narrows your selection of tweets down to the specific messages you want your bot to respond to. Your Azure charges will be usage-based, so you want this query to be specific enough to pick up only the kinds of messages your LUIS app will know how to respond to.
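
For example, a hypothetical query that picks up tweets directed at your bot while filtering out retweets might look like this (YourBotHandle is a placeholder for your bot’s account):

to:YourBotHandle -filter:retweets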

Step Two: Create an App with LUIS

If you don’t already have a LUIS app to use, follow the steps here to create your new LUIS app. For your utterances, I recommend using a sampling of tweets returned by the Twitter query you created. Copy as many tweets from your query as possible into the LUIS test tool and assign them to the correct intent as needed. Train and publish your app before continuing.

Step Three: Create a Function App

Use the steps here to create a new Function App with an HTTP trigger.

Once you have the app and trigger created, download the function by clicking “Download app content.”

Screenshot with Download App content highlighted

Unzip your app and open it in Visual Studio. Add classes for the LUIS Prediction:

using System.Collections.Generic;
using Newtonsoft.Json;

public class Prediction
{
    [JsonProperty(PropertyName = "query")]
    public string Query { get; set; }

    [JsonProperty(PropertyName = "topScoringIntent")]
    public Intent TopScoringIntent { get; set; }

    [JsonProperty(PropertyName = "intents")]
    public List<Intent> Intents { get; set; }

    [JsonProperty(PropertyName = "entities")]
    public List<Entity> Entities { get; set; }

    [JsonProperty(PropertyName = "luisPrediction")]
    public string LuisPrediction { get; set; }

    [JsonProperty(PropertyName = "desiredIntent")]
    public string DesiredIntent { get; set; }

    [JsonProperty(PropertyName = "isDesiredIntent")]
    public bool IsDesiredIntent { get; set; }
}

public class Intent
{
    [JsonProperty(PropertyName = "intent")]
    public string IntentValue { get; set; }

    [JsonProperty(PropertyName = "score")]
    public decimal Score { get; set; }
}

public class Entity
{
    [JsonProperty(PropertyName = "entity")]
    public string EntityValue { get; set; }

    [JsonProperty(PropertyName = "type")]
    public string Type { get; set; }

    [JsonProperty(PropertyName = "startIndex")]
    public int StartIndex { get; set; }

    [JsonProperty(PropertyName = "endIndex")]
    public int EndIndex { get; set; }

    [JsonProperty(PropertyName = "score")]
    public decimal Score { get; set; }
}

Then modify your HttpTrigger to parse the prediction:

[FunctionName("HttpTrigger")]
        public static async Task Run([HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = null)]HttpRequestMessage req, TraceWriter log)
        {
            log.Info("C# HTTP trigger function processed a request.");

            dynamic data = await req.Content.ReadAsAsync          
            var prediction = ((JObject)data).ToObject();

            var message = GetTweetMessage(prediction);

            if (!string.IsNullOrEmpty(message))
            {
                return req.CreateResponse(HttpStatusCode.OK, message);
            }

            return req.CreateResponse(HttpStatusCode.NotFound);
        }

Replace “GetTweetMessage” with your own code to interpret the intent and entities (if defined/provided) and generate your tweet message, then send the message string back in the response. Deploy your changes back to Azure (right-click the project in Visual Studio, select “Publish,” and follow the instructions).
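
As a starting point, here is a minimal sketch of what GetTweetMessage might look like. The intent names (“Praise,” “Complaint”) and the 0.5 confidence threshold are hypothetical; substitute the intents defined in your own LUIS app:

private static string GetTweetMessage(Prediction prediction)
{
    // Ignore predictions where LUIS has low confidence in the top intent (hypothetical threshold)
    if (prediction == null || prediction.TopScoringIntent == null || prediction.TopScoringIntent.Score < 0.5m)
    {
        return null;
    }

    // Map each LUIS intent to a canned reply (intent names are hypothetical)
    switch (prediction.TopScoringIntent.IntentValue)
    {
        case "Praise":
            return "Thanks for the kind words!";
        case "Complaint":
            return "Sorry to hear that. We're on it!";
        default:
            return null; // unrecognized intent: no reply is sent
    }
}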

Note: In order to use a free dev service plan for your function, you must turn its AlwaysOn setting to Off. You can only do this if you are using an HTTP trigger; a timer trigger won’t fire if you turn off AlwaysOn.

Do this by going to Application settings:

Screenshot with Application Settings highlighted
Toggle AlwaysOn and save the changes. You may now go to Platform features:

Screenshot with Platform features highlighted.

Then All Settings:

All Settings is highlighted.

Then scroll down to App Service Plan and choose Change App Service Plan:

All settings is highlighted.

Change the app service plan to your devtest (free) service plan.

Step Four: Create a Logic App

General information on creating new logic apps can be found here.

Once you’ve created your logic app, go to the Logic app designer:

Logic app designer is highlighted.
Create your first workflow item: a Twitter search tweets trigger. Use your search query from above and change the interval as needed:

Screenshot of query

Create your next workflow item by clicking the plus button at the bottom of your Twitter search tweets trigger. Add a new LUIS Get prediction action. (You will be prompted for your LUIS connection and key; you can find these in your LUIS app.) The connection value is your LUIS endpoint. Select your LUIS-connected app for the App Id and then click on the Utterance Text field. A flyout list of dynamic options will appear; choose Tweet text under the Twitter options. Leave Desired Intent blank.

Screenshot of Get Prediction input

Add a new flow item under Control -> Condition:

Screenshot of new flow item input.

This workflow checks for the Top Scoring Intent Name from LUIS. We don’t want to continue passing this message to our Azure function if LUIS did not recognize its intent, so we only continue if Top Scoring Intent is not equal to None.

The condition adds two boxes below it: one for If True, the other for If False. Leave If False blank; the workflow will stop there if LUIS has not returned a usable intent. In the If True box, add a new action for Azure Functions and select the function you created above.

In the Request Body field of your function trigger, put the LUIS Body parameter. Then add another Twitter action to Post a tweet. Use the Function’s body to post the resulting message. Include a link back to the original tweet to make the tweet appear as a quoted retweet:

Screenshot of If True input

Your overall logic should look something like this: (You can see this bot in action at @LeksBot.)

Twitter bot logic is pictured.

Step Five: Train & Improve Your Bot

Open your logic app and scroll down to Runs history. You can see each time your bot has triggered. If you see tweets that weren’t responded to properly, you can open up each run and inspect the flow. You can see the run’s parameters and make adjustments. Paste the tweet into your LUIS app and train it on the correct intent. Each time you do this your app will become “smarter” and make fewer mistakes.

After you have retrained LUIS (make sure you click Publish!) or made any adjustments to your flow, you can resubmit the same run (tweet) and make sure it’s processed correctly. Retrain and adjust as needed to improve your bot’s experience.

Two years ago, the Texas Workforce Commission (TWC) came to AIS with an outdated online budgeting tool for middle and high school students called the “Texas Reality Check.” The application, designed to give students a clear sense of how much their desired future lifestyle will cost and what education and career choices will support it, was plagued by performance and accessibility issues…and its young target demographic was simply tuning it out.

AIS modernized the site for teen sensibilities, streamlined the underlying information architecture for easier use, overhauled the content strategy and user experience, and made it fully compliant with the latest accessibility guidelines. You can read more about our work on this project here.

The new and improved Texas Reality Check has since gone on to become the most popular application of the Labor Market and Career Information Department (LMCI) of the Texas Workforce Commission. And now it’s been honored with a 2018 “Best of Texas” Award for Best Application Serving the Public. The awards highlight the Texas state government’s top creative tech implementations of the year, for both internal improvements and public-facing services like TRC.

“Governmental and educational leaders in Texas are leveraging technology to improve cybersecurity, enhance citizen service and advance emergency response, among many other things,” said Teri Takai, executive director of the Center for Digital Government. “Congratulations to this year’s Best of Texas winners for the vital role they are playing in advancing information technology in Texas.”

We’re really proud of our work on this project and thrilled that school students all across Texas have responded to the site in such a positive and engaged way. We hope the application continues to inspire them to dream big…while also equipping them with the knowledge and tools they need to achieve their goals.

I am pleased to announce my latest Pluralsight course on PowerApps. (Well…such is the nature of change in the cloud that there has already been a name change since I submitted this course for publication only a few weeks back: the aspect of PowerApps covered in my course is now referred to as Canvas Apps.)

This course is designed for developers (both citizen and professional developers) interested in a low-code approach for building mobile applications.

Here’s some background on PowerApps, if you haven’t had a chance to play with it yet:

PowerApps is a productive low-code development platform. It allows you to very quickly build business applications that can run inside a web browser, on a phone, or on a tablet. PowerApps includes a web-based IDE (PowerApps Studio), a set of built-in cross-platform controls, an Excel-like expression language that also includes imperative constructs like variables and loops, and over 130 connectors to talk to any number of data sources, including SQL Server, Office 365, Salesforce, Twitter, etc. You can also use custom connectors to talk to your domain-specific data source.

Beyond the controls, expression language, and connectors, PowerApps provides ALM support in the form of app versioning, app publication to various app stores, swim lanes for development environments, authentication and authorization (via Azure AD), RBAC controls, and security policies like data loss prevention (DLP). All in all, the PowerApps service seeks to significantly lower the bar for building and distributing cross-platform mobile applications within your enterprise.

For a concrete example of our use of PowerApps, please read how we built a cross-platform event app in less than a week. Also please check out a recent episode of DotNetRocks where we talk about PowerApps.

Finally, as part of the latest spring update, PowerApps is combining with Dynamics 365 for Sales, Marketing, and Talent applications to offer an enterprise high-productivity application platform as a service (known as Microsoft Business Applications platform). What this means for PowerApps developers is that:

  1. They can now take advantage of server-side logic
  2. They have access to a data-centric way of building declarative apps, known as model-driven apps (in contrast to canvas apps, which are built by dragging and dropping controls to a canvas).

For more information on the spring update, please refer to this blog post by Frank Weigel.

I hope you will find this course useful. Please reach out to me via this blog or Twitter if you have any questions or comments.

If you need managed services to maintain peak IT network operations, consider us here at Applied Information Sciences. We’ll manage all your IT services for a predictable cost so you can focus on more strategic investments. AIS’ Managed Services Practice provides ongoing responsibility for monitoring, patching and problem resolution for specific IT systems on your company’s behalf.

Capabilities

  • Patching
  • Monitoring
  • Alerting
  • Backup and Restore
  • Incident Response

AIS’ Managed Services Practice has up to 24×7 coverage for initial responses to incidents through a combination of dedicated part-time and full-time staff, both onshore and offshore. AIS prides itself on being on the leading edge of managed services support. Our collaborative, disciplined approach is committed to quality, value, time, and budget. Read More…

 

AIS recently completed work on a complete revamp of the Texas Workforce Commission’s “Texas Reality Check” website. The new Texas Reality Check is an Internet-available, fully accessible, responsive, mobile-first, and browser-agnostic design. The website was tested for accessibility, performance, vulnerabilities, and usability.

Background

Texas Reality Check (TRC) is targeted at students on a statewide basis, ranging from middle school to high school (with some colleges and universities making use of the tool for “life skills” classes). The goal is to inspire students to think about occupations and prepare for educational requirements so they can achieve the income level that meets their lifestyle expectations.

This tool walks students through different areas of life on a step-by-step basis, identifying budgets associated with living essentials such as housing, transportation, food, clothing, etc. Students make selections, and the tool then calculates the corresponding monthly income needed to afford those selections. From there, students are directed to another page and connected to a database of careers and associated salaries.

However, the existing site was dated and in need of improvements in three core areas: UX, Accessibility, and overall performance. Here’s how AIS delivered:

Read More…

AIS recently worked with the General Services Administration (GSA) Technology Transformation Services division, better known as 18F. The engagement involved working with 18F to digitize the Department of Labor’s Section 14(c) certification application process (part of the Fair Labor Standards Act). This is currently a paper-based process that 18F hoped to modernize as an intuitive online application…and to do it using agile methodologies.

AIS was tasked with building the first version of the digital form within a 60-day period of performance, much shorter than typical federal contracts. AIS pulled together a multi-disciplinary team of user researchers, designers, and front- and back-end web developers to work closely with 18F and the Department of Labor (DOL) product owner. The team built the entire form with complex validation, along with registration and login and an administrative section to process the form applications. They performed multiple usability tests with actual end users and followed 18F’s principle of working in the open using a public GitHub repository. All user stories and discussion threads were thoroughly documented in that repository’s issues list.

AIS worked together with many divisions inside DOL to make this happen. We addressed security concerns raised by the Chief Information Security Officer (CISO) and worked with the CIO’s office to coordinate delivery of the application, along with a testing and staging environment for deployment. We also set up a Continuous Integration/Continuous Deployment process so that multiple DOL stakeholders could stay abreast of what was happening and exercise the current state of the application. We were even able to address legal concerns about testing with external citizens by obtaining signed consent forms for testing and recording the sessions.

The collaboration was so successful that our client wrote their own blog post on the project, detailing exactly “how government and private industry can work together using agile methodologies to produce great results.” You can read it here. 

These types of successful, agile engagements break down the myths that software development for the government needs to take months (or even years). Government can and will move faster, and after every small win like this project, the traditional methods of building software and procuring software development are changing across the industry.  This bodes well not just for the citizens who need to interact with these digital services… but also for saving our tax dollars.


Continuous Integration (CI) and Continuous Delivery (CD) are rapidly becoming an integral part of software development. These disciplines can play a significant role in building stable release processes that help ensure project milestones are met. And in addition to simply performing compilation tasks, CI systems can be extended to execute unit testing, functional testing, UI testing, and many other tasks. This walkthrough demonstrates the creation of a simple CI/CD deployment pipeline with an integrated unit test.
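
To ground that, here is the kind of unit test such a pipeline might execute. This is a minimal sketch using NUnit; the Calculator class and the test itself are hypothetical stand-ins for your application’s real tests:

using NUnit.Framework;

// Hypothetical class under test
public class Calculator
{
    public int Add(int a, int b)
    {
        return a + b;
    }
}

[TestFixture]
public class CalculatorTests
{
    [Test]
    public void Add_ReturnsSumOfOperands()
    {
        var calculator = new Calculator();
        Assert.AreEqual(5, calculator.Add(2, 3));
    }
}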

There are many ways of implementing CI/CD, but for this blog I will use Jenkins and GitHub to deploy the simple CI/CD pipeline. A Docker container will be used to host the application. The GitHub repository hosts the application, including a Dockerfile for creating an application node. Jenkins is configured with the GitHub and Docker plugins. Read More…