AIS is the 2019 MSUS Partner Award Winner – Business Applications – Dynamics 365 for Sales. This is our vision for the Power Platform era.

I am incredibly excited to share that AIS has been announced as the 2019 MSUS Partner Award Winner – Business Applications – Dynamics 365 for Sales at #MSInspire!

Some background on how we won:

When the National Football League Players Association (NFLPA) needed to score a big win for its members, they brought in the AIS team to build a single, shared player management system, called PA.NET. AIS extensively customized Dynamics 365 for Sales to meet the unique needs of NFLPA, integrated it with Office 365… and then took it all to the cloud with Microsoft Azure.

Using Dynamics 365 for Sales, PA.NET provides one master set of player data and powerful reporting tools. Now employees across the organization can turn to the same system to answer questions, uncover marketing and licensing opportunities, and identify other ways to help members. When a specific licensing request comes in, they can find the right person, or people, in minutes.

So where do we go from here? From Dynamics to Power Platform.

Our Business Applications & Automation Practice is investing heavily in Dynamics and the Power Platform. We recognize that an organization’s adoption of the Power Platform should be thought of as a journey, not a one-off “app of the moment” solution. By focusing on enterprise management and leveraging the Common Data Service (CDS) as much as possible, we help clients like NFLPA scale their adoption as they migrate workloads and make use of PowerApps, Power BI, Flow, and Dynamics 365.

Power Platform Technologies

Earlier this year, we worked with friends in the business applications community around the world to launch our Power Platform Adoption Framework. Mature organizations realize that rigor, discipline, and best practices are needed to adopt the platform at scale.

The Power Platform Adoption Framework is the start-to-finish approach for adopting the platform at scale.

It helps enterprise organizations:

  • Get to value quickly
  • Educate, train, and grow their community of developers and power users
  • Create durable partnerships between business, IT, and the user community
  • Continuously improve ROI on the platform by identifying and migrating new workloads
  • Blend agile, rapid app development with rigorous, disciplined enterprise management

I hope that the framework will continue to become a worldwide standard for enterprise-grade adoption of the Power Platform. I’ve been lucky to collaborate with Power Platform experts and users around the world to create it, and I’m proud to say that AIS is fully behind the framework, sharing it with the community, and committed to its future development as best practices for scaled adoption evolve. We’re sharing it openly because we believe a vibrant, thriving community around this technology is good for everyone who uses it.

Please join me in congratulating the AIS team, and please join us on this journey to scale the Power Platform to meet the challenges of the years to come.

To benefit the most from this post, you should understand the following concepts:

  • Dynamics CRM Workflows
  • DocuSign Integration with Dynamics CRM
  • DocuSign merge fields and merge-back

The Problem

I recently experimented with integrating DocuSign with Dynamics 365 — specifically, merging data into a DocuSign form and then writing the data back into Dynamics. After reading the DocuSign for Dynamics 365 CRM – 6.1 documentation, I found that DocuSign Drop Down and Radio Button controls are not supported for Dynamics merges and write-backs. So I started work on a solution that would use a Checkbox field in DocuSign and a Two Options field in Dynamics. I had all my text fields working correctly and assumed the checkbox would be straightforward, as they were both Boolean fields.

I was disappointed to find out that the solution would not merge. After researching online and trying a few suggestions, I finally decided to add a temporary text field to my DocuSign form and see what Dynamics was putting into it, and found that the value was “Yes.” Then I looked at the form data in DocuSign…and it had the value “X.” I tried replacing the values for “Yes” and “No” in the Dynamics Two Options field with “X” and “O”, but that didn’t work either.

The Solution

I finally decided to change the “Yes” and “No” values to “true” and “false.”

This time, when the data was merged, the checkbox was checked!
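As a side note, if you have more than a couple of Two Options fields to relabel, the same change can be scripted with the Dynamics SDK instead of editing each field in the customization UI. The following is only a rough sketch under assumptions (you supply the entity and field names; normally the customization UI is all you need):

using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Messages;
using Microsoft.Xrm.Sdk.Metadata;
using Microsoft.Crm.Sdk.Messages;

public static class TwoOptionsLabelFix
{
    // Renames a Two Options field's option labels from Yes/No to true/false so
    // a DocuSign Checkbox merge field recognizes the merged value.
    public static void RelabelToTrueFalse(IOrganizationService service, string entityName, string fieldName)
    {
        var retrieve = new RetrieveAttributeRequest
        {
            EntityLogicalName = entityName,
            LogicalName = fieldName,
            RetrieveAsIfPublished = true
        };

        var attribute = (BooleanAttributeMetadata)((RetrieveAttributeResponse)service.Execute(retrieve)).AttributeMetadata;

        // 1033 = English (United States)
        attribute.OptionSet.TrueOption.Label = new Label("true", 1033);
        attribute.OptionSet.FalseOption.Label = new Label("false", 1033);

        service.Execute(new UpdateAttributeRequest
        {
            EntityName = entityName,
            Attribute = attribute,
            MergeLabels = false
        });

        // Publish so the new labels take effect before the next DocuSign merge.
        service.Execute(new PublishAllXmlRequest());
    }
}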

And once the client receives the email, fills out the form, and the PDF files are sent… this is when the ‘X’ we saw in the form data comes into play:

Finally, I verified it worked end-to-end by unchecking the box in Dynamics and saving the record:

After firing off the workflow to merge data in DocuSign form, the box is unchecked now:

Send the email off to be filled, check the box and add a new allergy:

Now, wait for the Dynamics envelope status workflow to complete, then check the record; it will have updated successfully in Dynamics.

Conclusion

This is a small issue, but I was surprised not to find it documented anywhere. So if you’ve come across this issue working with DocuSign Checkbox fields and Dynamics 365, I hope this post saves you some time!

I’m working on a project with a straightforward requirement that’s typically solved with out-of-the-box PowerApps platform features. The customer needs to capture employment application data as part of their online hiring process. In addition to entering standard employment data, applicants enter a typical month’s worth of budgeted expenses to paint a clear financial picture, which serves as one factor in their suitability for employment.

The Requirements

The solution consists of an Application entity and a related Expense entity to track multiple expenses per Application. The Application entity needs to capture aggregate expense values based on expense categories. The Expense entity stores the institution or business receiving the payment, the Expense Type (lookup), the monthly payment in dollars, and the balance owed in dollars. Expense Types are associated with “Expense Categories.” An Expense Category is associated with one or more Expense Types. The Application entity tracks 15-20 aggregate expense values that need to be calculated from the individual expense entries.

For example, the expense types “Life Insurance” and “Auto Insurance” are both associated with the expense category “Insurance.” The Application entity has an aggregate field called “Total Insurance Expenses” to capture the sum of all monthly insurance-related expenses.

The basic entity model is shown below:

Diagram of entity model

To summarize all that detail (which is probably hard to follow in paragraph form), the essential requirements are:

  • Capture standard employment data for applicants
  • Capture monthly budgeted expenses associated with an expense type and category
  • Calculate 15-20 aggregate expenses based on expense category
  • The solution must accommodate the addition of new expense categories

The aggregate fields on the Application entity fall into one of four categories: 1) a monthly payment total by expense category, 2) an outstanding balance total by expense category, 3) a monthly payment total across all categories, and 4) a total outstanding balance across all categories.

The breakdown for each of the expense categories and their associated aggregations on the Application entity can be depicted as such:

Expense Category | Category Payment Rollup      | Category Balance Rollup         | All Payments Rollup    | All Balances Rollup
Automobile       | Total Automobile Expenses    | Total Auto Balances Owed        | Total Monthly Expenses | Total Outstanding Debt
Credit Card      | Total Credit Card Expenses   | Total Credit Card Balances Owed | Total Monthly Expenses | Total Outstanding Debt
Food/Clothing    | Total Food/Clothing Expenses | n/a                             | Total Monthly Expenses | n/a
Housing          | Total Housing Expenses       | Total Housing Balances Owed     | Total Monthly Expenses | Total Outstanding Debt
Insurance        | Total Insurance Expenses     | n/a                             | Total Monthly Expenses | n/a
Medical/Dental   | Total Medical Expenses       | n/a                             | Total Monthly Expenses | n/a
Other Debt       | Total Other Debt Expenses    | Total Other Debt Balances Owed  | Total Monthly Expenses | Total Outstanding Debt
Utilities        | Total Utility Expenses       | n/a                             | Total Monthly Expenses | n/a

The table shows that some expense categories require a total of four aggregate calculations, whereas others only require two aggregate calculations. The calculations should occur when an Expense is created, the values for a monthly payment or balance owed change, an Expense is deactivated/activated (state change), or when an Expense is deleted. “Total Monthly Expenses” is calculated for all expense entries. Only four categories require the category balance and total outstanding balance calculations.

Platform Limitations

Maximum Number of Rollup Fields

Dynamics 365 only allows for a maximum of 10 rollup fields on any given entity — these are fields that take advantage of the “Rollup” field type customized in the solution, the values for which are automatically calculated by the platform on a predetermined interval, e.g., every 12 hours.

One option to overcome this limitation — only available in on-premises Dynamics 365 implementations — is to modify the maximum number of rollup fields per entity in the MSCRM_CONFIG database. There are rare circumstances in which modifying table values in this database is beneficial. However, given the possibility of a disaster recovery situation in which Dynamics 365 needs to be reinstalled and/or recovered, any modifications made to the MSCRM_CONFIG database could be lost. Even if an organization has well-documented disaster recovery plans that account for these modifications, there’s always a chance the documented procedures will not be followed, or that steps will be skipped.

Another consideration is the potential to move to the cloud. If the customer intends to move their Dynamics 365 application to the cloud, they’ll want to ensure their solution remains on a supported path, and eliminate the need to re-engineer the solution if that day comes.

Rollup Calculation Filters

Rollup fields in Dynamics 365 are indeed a powerful feature, but they do come with limitations that prevent their use in complex circumstances. Rollup fields only support simple filters when defining the rollup field aggregation criteria.

To keep this in the context of our requirements, note above that the Expense Type and Expense Category are lookup values in our solution. If we need to calculate the sum of all credit card expenses entered by an applicant, this is not possible given our current design, because the Expense Type is a lookup value on the expense entry. You’ll notice that when I try to use the Expense Type field in the filter criteria for the rollup field, I’m only given the choices “Does not contain data” and “Contains data.” Not only can I not filter on actual values of the Expense Type, but I also can’t drill down to the related Expense Category to include it in my aggregation filter.

Screenshot of test rollup screen

Alternatives

The limitations above could be overcome by redesigning our solution, for example, by choosing to configure both the Expense Type and Expense Category fields as Option Sets instead of lookups, along with some sophisticated Business Rules that appropriately set the Expense Category based on the selected Expense Type. That’s one option worth considering, depending on the business requirements with which you’re dealing. We could also choose to develop a code-heavy solution by writing plugin code to do all these calculations, thus side-stepping the limitation on rollup fields and accommodating the entity model I’ve described.

The Solution

Ultimately, however, the customer wants a solution that allows them to update their expense tracking requirements without needing developers to get the job done. For example, the organization may decide they no longer want to track a certain expense category or may want to add a new one. Choosing to create entities to store the necessary lookup values will afford them that kind of flexibility. However, that still leaves us the challenge of calculating the aggregate expense values on the Application entity.

The final solution may still require the involvement of their IT department for some of the configuration steps but ideally will not require code modifications.

Lookup Configuration

The first step toward our solution is to add four additional fields to the Expense Category entity. These four fields represent the four aggregation categories described above: 1) Category Payment Rollup, 2) Category Balance Rollup, 3) Payments Rollup (for all categories), and 4) Balances Rollup (for all categories).

These fields allow users to define how aggregate values are calculated for each Expense Category by identifying the target field on the Application entity for each aggregation. For example, the Insurance category’s Category Payment Rollup would point to the Application’s “Total Insurance Expenses” field.

Screenshot of Expense Category in Dynamics 365

Custom Workflow Activity

The next step is to write a custom workflow activity to perform the aggregate calculations described above. Custom workflow activities offer several benefits to the customer, primarily the ease of configuration within the Dynamics 365 UI itself: they run asynchronously, can be run on demand, and/or can run when specific events occur on the target record type (e.g., create, update, delete, or state change). Custom workflow activities can also accept user-defined parameters configured in the workflow definition.

This means that — as you might have guessed — the custom workflow activity can be written in such a way to allow users to add new Expense Categories so that the aggregate calculations “just work” without requiring code modifications or changes to the workflow configuration in the solution.

Here’s the custom workflow activity class that runs the calculations, followed by the workflow definition. As you can see below, I’ve included the Application and Expense Category fields as required input parameters for the workflow activity. (I’ll likely refactor this solution to accept the four fields as inputs, but for now, this gets the job done. Thanks to my good friend, Matt Noel, for that suggestion.) Further down, you’ll notice that for each aggregate field, we run a custom FetchXML aggregate query configured to perform the required calculation.

using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Client;
using Microsoft.Xrm.Sdk.Query;
using Microsoft.Xrm.Sdk.Workflow;
using System;
using System.Activities;

namespace Project.Workflow.Activities
{
    public class CalculateExpenses : CodeActivity
    {
        #region Input Parameters

        [RequiredArgument]
        [Input("Application")]
        [ReferenceTarget("new_application")]
        public InArgument<EntityReference> Application { get; set; }

        [RequiredArgument]
        [Input("Expense Category")]
        [ReferenceTarget("new_expensecategory")]
        public InArgument<EntityReference> ExpenseCategory { get; set; }

        private readonly string _categoryPaymentRollupField = "new_categorypaymentrollup";
        private readonly string _allPaymentsRollupField = "new_paymentsrollup";
        private readonly string _categoryBalanceRollupField = "new_categorybalancerollup";
        private readonly string _allBalancesRollupField = "new_balancesrollup";

        #endregion

        protected override void Execute(CodeActivityContext executionContext)
        {
            // Workflow services: tracing and an organization service running as
            // the SYSTEM user (null user id). The workflow context is available
            // if the activity later needs the triggering record or user.
            var tracer = executionContext.GetExtension<ITracingService>();
            var context = executionContext.GetExtension<IWorkflowContext>();
            var serviceFactory = executionContext.GetExtension<IOrganizationServiceFactory>();
            var orgService = serviceFactory.CreateOrganizationService(null);

            string[] _rollupFields =
            {
                _categoryPaymentRollupField,
                _allPaymentsRollupField,
                _categoryBalanceRollupField,
                _allBalancesRollupField
            };

            var expenseCategory = GetEntity(orgService, ExpenseCategory.Get(executionContext), _rollupFields);
            var applicationRef = Application.Get(executionContext);
            var application = new Entity("new_application")
            {
                Id = applicationRef.Id
            };

            // Set the Category Payment Rollup
            if (expenseCategory.GetAttributeValue<string>(_categoryPaymentRollupField) != null)
            {
                var paymentRollup = expenseCategory.GetAttributeValue<string>(_categoryPaymentRollupField);
                application[paymentRollup] = new Money(GetExpenseAggregate(orgService, application.Id, expenseCategory.Id, false, true));
            }

            // Set the rollup for all Monthly Payments
            if (expenseCategory.GetAttributeValue<string>(_allPaymentsRollupField) != null)
            {
                var allPaymentsRollup = expenseCategory.GetAttributeValue<string>(_allPaymentsRollupField);
                application[allPaymentsRollup] = new Money(GetExpenseAggregate(orgService, application.Id, expenseCategory.Id, false, false));
            }

            // Set the rollup for Category Balances
            if (expenseCategory.GetAttributeValue<string>(_categoryBalanceRollupField) != null)
            {
                var categoryBalanceRollup = expenseCategory.GetAttributeValue<string>(_categoryBalanceRollupField);
                application[categoryBalanceRollup] = new Money(GetExpenseAggregate(orgService, application.Id, expenseCategory.Id, true, true));
            }

            // Set the rollup for all Category Balances 
            if (expenseCategory.GetAttributeValue<string>(_allBalancesRollupField) != null)
            {
                var allBalancesRollup = expenseCategory.GetAttributeValue<string>(_allBalancesRollupField);
                application[allBalancesRollup] = new Money(GetExpenseAggregate(orgService, application.Id, expenseCategory.Id, true, false));
            }

            // Execute the update on the Application rollup fields 
            try
            {
                tracer.Trace("Updating Application {0} with calculated rollup values.", application.Id);
                orgService.Update(application);
            }
            catch (Exception e)
            {
                tracer.Trace("Update failed: {0}", e.Message);
                throw new InvalidPluginExecutionException("Error updating Application: " + e.Message);
            }
        }

        private static Entity GetEntity(IOrganizationService service, EntityReference e, params String[] fields)
        {
            return service.Retrieve(e.LogicalName, e.Id, new ColumnSet(fields));
        }

        private decimal GetExpenseAggregate(IOrganizationService service, Guid applicationId, Guid expenseCategoryId, bool balanceRollup, bool includeCategory)
        {
            var sum = 0m;

            var aggregateField = balanceRollup ? "new_balanceowed" : "new_monthlypayment";

            var fetchXML = @"<fetch distinct='false' mapping='logical' aggregate='true' >" +
                  "<entity name='new_expense' >" +
                    "<attribute name='" + aggregateField + "' aggregate='sum' alias='sum' />" +
                    "<filter type='and' >" +
                      "<condition attribute='statecode' operator='eq' value='0' />" +
                      "<condition attribute='new_applicationid' operator='eq' value='" + applicationId + "' />" +
                    "</filter>";

            var endFetch = "</entity></fetch>";

            if (includeCategory)
            {
                var categoryFetch = @"<link-entity name='new_expensetype' from='new_expensetypeid' to='new_expensetypeid' link-type='inner' alias='ExpenseType' >" +
                      "<link-entity name='new_expensecategory' from='new_expensecategoryid' to='new_categoryid' link-type='inner' alias='ExpenseCategory' >" +
                        "<filter type='and' >" +
                          "<condition attribute='new_expensecategoryid' operator='eq' value='" + expenseCategoryId + "' />" +
                        "</filter>" +
                      "</link-entity>" +
                    "</link-entity>";

                fetchXML += categoryFetch + endFetch;
            }
            else
            {
                fetchXML += endFetch;
            }

            FetchExpression fetch = new FetchExpression(fetchXML);

            try
            {
                EntityCollection aggregate = service.RetrieveMultiple(fetch);

                foreach (var c in aggregate.Entities)
                {
                    // The aliased "sum" value can be missing or null when no expense rows match the filter.
                    if (c.Contains("sum") && ((AliasedValue)c["sum"]).Value is Money)
                    {
                        sum = ((Money)((AliasedValue)c["sum"]).Value).Value;
                    }
                }
            }
            catch (Exception e)
            {
                throw new InvalidPluginExecutionException("Error returning aggregate value for " + aggregateField + ": " + e.Message);
            }

            return sum;
        }
    }
}

Workflow definition:

Powerapps workflow screenshot

Configuration of custom inputs:

Configuring custom inputs

Testing

After building and registering my workflow assembly, I created expense entries for all expense types ensuring that all expense categories were represented. The following images depict the successful aggregation of payments and balances:

Screenshot of expense output

Screenshot of debt output

Conclusion

Custom workflow activities are a powerful tool for balancing the need for a highly maintainable solution after deployment against complex requirements that call for code. The design gives end users the flexibility to adapt their data collection needs over time as their requirements change, and they can do so with little or no involvement from IT.

As I mentioned above, an alternative approach to this requirement could involve writing a plugin to perform the calculations. That approach would still require some entity-based configuration for flexibility, but it would leave end users with far less ability to reconfigure the solution if or when requirements change. I can also update the custom workflow activity to accept the four aggregate fields as optional arguments to the workflow. Doing so would enable users to run separate workflow processes for each expense type/category, giving them additional configuration control over these automated calculations.
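To sketch what that refactoring might look like (parameter names are hypothetical; omitting [RequiredArgument] is what makes each input optional), the additional input declarations on the activity could be as simple as the fragment below, shown as a standalone partial class purely for illustration:

using System.Activities;
using Microsoft.Xrm.Sdk.Workflow;

public partial class CalculateExpenses
{
    // Optional overrides: when supplied in the workflow designer, these schema
    // names could take precedence over the values configured on the Expense
    // Category record. No [RequiredArgument], so each may be left blank.
    [Input("Category Payment Rollup Field")]
    public InArgument<string> CategoryPaymentRollupFieldName { get; set; }

    [Input("Category Balance Rollup Field")]
    public InArgument<string> CategoryBalanceRollupFieldName { get; set; }

    [Input("All Payments Rollup Field")]
    public InArgument<string> AllPaymentsRollupFieldName { get; set; }

    [Input("All Balances Rollup Field")]
    public InArgument<string> AllBalancesRollupFieldName { get; set; }
}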

I recently encountered an issue when trying to create an Exact Age column for a contact in Microsoft Dynamics CRM. There were several solutions available on the internet, but none of them was a good match for my specific situation. Some ideas I explored included:

  1. Creating a calculated field using the formula DiffInDays(DOB, Now()) / 365 or DiffInYears(DOB, Now()) – I used this at first, but if the calculated field is a decimal type, you end up with a fractional value (e.g., 46.9 years old), which is not desirable. If the calculated field is a whole number type, the value is always rounded. So, if the DOB is 2/1/1972 and the current date is 1/1/2019, the Age shows as 47 when the contact is actually still 46 until 2/1/2019.
  2. Using JavaScript to calculate the Age – The problem with this approach is that if the record is not saved, the data becomes stale. It also does not work in a view (i.e., if you want to see a list of client ages). The JavaScript solution is geared toward the form UI experience only.
  3. Using Workflows with Timeouts – This approach seemed a bit complicated and cumbersome to update values daily across so many records.

Determining Exact Age

Instead, I decided to plug some of the age scenarios into Microsoft Excel and simulate Dynamics CRM’s calculations to see if I could come up with any ideas.

Note: 365.25 is used to account for leap years. I originally used 365, but the data was incorrect. After reading about leap years, I decided to plug 365.25 in, and everything lined up.

Excel Formulas

Setting up the formulas above, I was able to calculate the values below. I found that subtracting the DATEDIF Rounded value from the DATEDIF Actual value produced a negative value whenever the birthday (month/day) fell after the current date (2/16/2019 at the time). This allowed me to introduce a -1 adjustment when the Difference was less than or equal to 0. Using this finding, I set up the solution in CRM.

Excel Calculations
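To make the arithmetic concrete before translating it into calculated fields, here is a minimal C# sketch of the same logic. It mirrors the spreadsheet, not Dynamics formula syntax, and the exact rounding mode is an assumption on my part:

using System;

public static class ExactAge
{
    // Mirrors the calculated-field logic: "Age Rounded" rounds days / 365.25 to a
    // whole number, "Age Difference" = actual - rounded, and the final Age drops
    // by 1 whenever the difference is less than or equal to zero (i.e., the
    // birthday hasn't happened yet relative to the as-of date).
    public static int At(DateTime dob, DateTime asOf)
    {
        double actual = (asOf - dob).TotalDays / 365.25;                      // Age Actual
        int rounded = (int)Math.Round(actual, MidpointRounding.AwayFromZero); // Age Rounded
        double difference = actual - rounded;                                 // Age Difference
        return difference <= 0 ? rounded - 1 : rounded;                       // Age
    }

    // Example from above: At(new DateTime(1972, 2, 1), new DateTime(2019, 1, 1))
    // returns 46, where a plain rounded value would show 47.
}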

The Solution

  1. Create the necessary fields:

     Field          | Data Type      | Field Type | Other                | Formula
     DOB            | Date and Time  | Simple     | Behavior: User Local |
     Age Actual     | Decimal Number | Calculated | Precision: 10        | DiffInDays(new_dob, Now()) / 365.25
     Age Rounded    | Whole Number   | Calculated |                      | DiffInDays(new_dob, Now()) / 365.25
     Age Difference | Decimal Number | Calculated | Precision: 10        | new_ageactual - new_agerounded
     Age            | Whole Number   | Calculated |                      | See below

  2. Create a business rule for DOB, setting it equal to Birthdate when Birthdate contains data. This way, when Birthdate is set, DOB is set automatically. This arrangement is necessary for the other calculated fields.
    Business Rules
  3. Set up the Age calculated field as follows:
    Age Calculated Field

Once these three steps have been completed, your new Age field should be ready to use. I created a view to verify the calculations. I happened to be writing this post very late on the night of 2/16/2019: I wrote the first part before 12:00 a.m., then refreshed the view before taking the screenshot below. I was happy to see the Age Test 3 record flip from 46 to 47 when I refreshed after 12:00 a.m.

Age Solution Results

Determining Exact Age at Some Date in the Future

The requirement that drove my research for this solution was the need to determine the exact age in the future. Our client needed to know the age of a traveler on the date of travel. Depending on the country being visited and the age of the traveler on the date of departure, different forms would need to be sent in order to prevent problems when the traveler arrived at his or her destination. The solution was very similar to the Age example above:

The Solution

  1. Here is an overview of the entity hierarchy:
    Age at Travel Entities
  2. Create the necessary fields:

     Entity       | Field                    | Data Type      | Field Type | Other                | Formula
     Trip         | Start Date               | Date and Time  | Simple     | Behavior: User Local |
     Contact      | DOB                      | Date and Time  | Simple     | Behavior: User Local |
     Trip Contact | Age at Travel Actual     | Decimal Number | Calculated | Precision: 10        | DiffInDays(contact.dob, new_trip.start) / 365.25
     Trip Contact | Age at Travel Rounded    | Whole Number   | Calculated | n/a                  | DiffInDays(contact.dob, new_trip.start) / 365.25
     Trip Contact | Age at Travel Difference | Decimal Number | Calculated | Precision: 10        | new_ageattravelactual - new_ageattravelrounded
     Trip Contact | Age at Travel            | Whole Number   | Calculated | n/a                  | See below

  3. Create a business rule for Contact DOB, setting it equal to Birthdate when Birthdate contains data. This way, when Birthdate is set, DOB is set automatically. This arrangement is necessary for the other calculated fields.
    Business Rules
  4. Set up the Trip Contact’s Age at Travel calculated field as follows:
    Age at Travel Calculated Field

Once these steps have been completed, your new Age at Travel field should be ready to use. I created a view to verify the calculations.

You’ll notice that in the red example, the trip starts on 8/14/2020. The contact was born on 9/29/2003 and is 16 on the date of travel but turns 17 a month or so later. In the green example, the trip is also on 8/14/2020. The contact was born 4/12/2008 and will turn 12 before the date of travel.

Age at Travel Solution Results
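As a quick sanity check, the same arithmetic from the C# sketch earlier in this post reproduces both of these examples:

using System;

class AgeAtTravelCheck
{
    static void Main()
    {
        // Red example: born 9/29/2003, traveling 8/14/2020 -> still 16 on departure.
        Console.WriteLine(ExactAge.At(new DateTime(2003, 9, 29), new DateTime(2020, 8, 14))); // 16

        // Green example: born 4/12/2008, traveling 8/14/2020 -> turns 12 before departure.
        Console.WriteLine(ExactAge.At(new DateTime(2008, 4, 12), new DateTime(2020, 8, 14))); // 12
    }
}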

Conclusion

While there are several approaches to the Age issue in Dynamics CRM, this is a great alternative that requires no code and works in real time. I hope you find it useful!

Driving value, lowering costs, and building your organization’s future with Microsoft’s next great business technology

Lately, I’ve been helping folks understand the Microsoft Power Platform (MPP) by sharing two simple diagrams.

The first one is below and is my stab (others have made theirs) at contextualizing the platform’s various components in relation to one another.

The Common Data Service (CDS) is the real magic, I tell people. No matter which app you are using, the data lives there in that one CDS across your entire environment. (And no, folks outside your organization don’t get to use it.) This means that data available to one of your apps can be re-used and re-purposed by your other apps, no wizardry or custom integration required. I promise, it just works. Think expansively about the power of this in your organization, and you’ll come up with some cockamamie/brilliant ideas about what you can do.

These are the types of data-driving-business-function that geeks like me always dreamed of.

A diagram of Microsoft Power Platform components

Then there’s PowerApps, in purple. Most folks think of this as a low-code/no-code app development tool. It is, but it’s more. Imagine that there are three flavors of PowerApps:

  1. Dynamics 365, which in the end is a set of really big PowerApps developed by Microsoft
  2. COTS apps developed by Microsoft partners (including AIS), available for organizations to license and use
  3. Custom apps you build yourself

Point Microsoft Power BI at all of this, then mash it up with data from outside your CDS via hundreds of out-of-the-box connectors, automate it all together with workflows in Flow… and you’ve got the Power Platform in a nutshell.

When I’m presenting this to a group, I turn to my next slide pretty quickly at this point.

A rearranged look at Microsoft Power Platform

Here I’ve essentially re-arranged the pieces to make my broader point: When we think about the Power Platform, the emphasis needs to be on the Platform bit. When your organization invests in this technology, say via working with an implementation partner such as AIS or purchasing PowerApps P1/P2 licenses, you’re not just getting a product or a one-off app solution.

What you’re getting is a platform on which to build your modern business. You’re not just extending Office 365. Instead, you’re creating a future where your organization’s data and business processes are deeply integrated with, driving, and learning intelligently from one another.

The more you leverage the platform, the higher the ROI and the lower the marginal costs of those licenses become. A central goal of any implementing partner ought to be guiding organizations on the journey of migrating legacy systems onto the platform (i.e., retiring legacy licensing + O&M costs) and empowering workers to make the platform even more valuable.

We don’t invest in one-off apps anymore, e.g., a CRM in one corner of your network where you run your sales, something in another where you manage your delivery, clunky Human Resources Management off over there where you take care of your people, and so on. No, what we care about here is the platform where you integrate all of the above — not through monolithic one-size-fits-all ERP — but rather through elegant app experiences across all your users’ devices that tie back to that magical Common Data Service.

This is what I mean when I tell folks the sky’s the limit, and that thinking about your entire business is what’s called for here. The Power Platform gives us the ability to learn and grow with our customers, constituents, vendors, employees, and other stakeholders like never before.

That’s what has everyone at Microsoft so excited. I am as well.

I want to learn from you. How do you make Power Platform understandable to those who haven’t thought about it too deeply? How does your organization make it valuable as a platform rather than just a product? I love to build beautiful things, so inspire me!


If you didn’t catch the first two parts of this series, you can do that here and here.  In this part, we’ll get a little more technical and use Microsoft Flow to do some pretty cool things. 

Remember when we talked about the size and quality of the images we take with our PowerApp and store as the entity image? When saved as the Entity Image for a CDS/D365 item, the image loses quality and is no longer good for an advertisement photo. This happens automatically and, as far as I can tell, the high-res image is gone once this conversion takes place (someone please correct me if I’m wrong on that!). On the flip side, it doesn’t make a whole lot of sense to put all this tech together only to require my end users to take two pictures of each item, one hi-res and one low-res. And we don’t want to store a high-res image in a relational database for 10,000-plus items because the database could bloat immensely.

Microsoft Flow and SharePoint to the rescue!  

PRO TIP:  Dynamics 365 will crop and resize the image before saving it as the entity image.  All entity images are displayed in a 144 x 144 pixel square.  You can read more about this here.  Make sure to save/retain your original image files.  We’re going to stick ours in a SharePoint Picture Gallery App.

Objective 

Create a Microsoft Flow that handles… 

  • Pull the original image off the Dynamics record and store it in SharePoint.
  • Set the patch image as the Entity Image for the Dynamics record.
  • Create an advertisement list item for the patch.
  • Save the URLs for the ad and image back to the patch record.

Create the Flow 

We’re going to write this Flow so that it’s triggered by a Note record being created. 

 Flow screenshot with Create from blank highlighted

  • On the next page, click “Search hundreds of connectors and triggers” at the bottom of the page. 
  • Select Dynamics 365 on the All tab for connectors and triggers. 
  • Select the “When a record is created” trigger. 

 Dynamics 365 is highlighted

  • Set the properties for Organization Name and Entity Name.  Entity Name should be “Notes”. 
  • Save the Flow and give it a name. 

Verifying a Few Things 

  • Add a new step and select the Condition item. 
  • The Condition should check to see if the Note has an attachment. We do this using the “Is Document” field.  

 Condition Control is highlighted 

  • In the “Yes” side of the conditional we want to check if the Object Type is a Patch (ogs_patch in this case).  

At this point, if the Flow has made it through both conditionals with a “Yes”, we know we are dealing with a new Note record that has an Attachment and belongs to a Patch record.   

Update the Patch Record 

Now we want to update the patch record’s Entity Image field with the attachment.  First we need to get a handle on the Patch record.  We’ll do that by adding an Action to the Yes branch of our new Conditional. 

  • Add a Dynamics 365 Update a Record Action.
  • Set the Organization Name, Entity Name, and Record identifier accordingly.  For our Patch Record identifier, we’ll use the Regarding field in the Dynamic content window. 

 

  • Click on Show advanced options and find the Picture of Patch field. 
  • For the Picture of Patch field we need to get the document body of the attachment and convert it from Base-64 encoding to binary. We do this using the “Expression” area again. Use the “base64ToBinary” function to convert the document body, like so (a rough SDK equivalent of this step is sketched after this list). 

 

  • Save your work!  I can’t tell you how many times I had to retype that function. 
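For anyone who thinks in SDK terms, here’s a rough C# sketch of what this step of the Flow accomplishes. It is not part of the Flow itself, and it assumes the patch’s image field is the standard entityimage attribute created in Part One:

using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

public static class PatchImageUpdater
{
    // Copies a Note (annotation) attachment onto the related patch record's entity
    // image, roughly what the Flow's "Update a record" action does above.
    public static void CopyNoteImageToPatch(IOrganizationService service, Guid noteId)
    {
        // documentbody holds the attachment as a Base-64 string; objectid points at the patch.
        var note = service.Retrieve("annotation", noteId, new ColumnSet("documentbody", "objectid"));
        var documentBody = note.GetAttributeValue<string>("documentbody");
        var patchRef = note.GetAttributeValue<EntityReference>("objectid");

        if (documentBody == null || patchRef == null)
        {
            return; // nothing to copy
        }

        // Equivalent of the Flow expression base64ToBinary(triggerBody()?['documentbody']).
        var patch = new Entity(patchRef.LogicalName) { Id = patchRef.Id };
        patch["entityimage"] = Convert.FromBase64String(documentBody);
        service.Update(patch);
    }
}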

Create Our SharePoint Items & Clean-up 

Now that we’ve updated our entity image with the uploaded patch picture we want to do a couple of things, but not necessarily in sequence.  This is where we’ll use a parallel branch in our Flow.   

Dealing with a Parallel Branch 

  • Under the last Update a Record action, add a Conditional.  After adding this Conditional hover over the line between the Update action and the new conditional.  You should see a plus sign that you can hover over and select “Add a parallel branch.” 



  • Select this and add a Compose action.  You may need to search for the Compose action. 

 

PRO TIP:  With Modern Sites in SharePoint, we now have three solid options for displaying images in SharePoint.  The Modern Document Library allows viewing as tiles and thumbnails within a document library, the Picture Library which has often been the place to store images prior to the Modern Document Library, and then we can simply just display an image, or images, on a page directly.

Saving the Attachment as an Image in SharePoint

  • Let’s deal with the Compose branch first. The Compose’s Input field will use the same function as our Picture of Patch did above: base64ToBinary(triggerBody()?['documentbody']) 
  • After the Compose, we’ll add a Create File Action for SharePoint and use the name from our Patch record as the name for our image in SharePoint.  I’m using a Picture Gallery App in SharePoint and for now, only using the .JPG file type.  The File Content should use the Output from our Compose Action. 

 

Delete the Note

  • Finally, we want to delete that Note from Dynamics (and the Common Data Service) so that the image attachment is no longer taking up space in our Common Data Service.  Add a Dynamics Delete a Record Action after the SharePoint Create file action.  Set the Organization Name, Entity Name, and use the Dynamics content for Note as the Item identifier.

 

Creating Our Advertisement

Let’s jump back to the new Conditional we added after the Update a record Action where we set the entity image. 

  • Set the conditional to check for the Generate Advertisement field being set to true. 
  • If this is true, add a SharePoint Create Item Action and let’s set some values.  What we’re doing here is creating a new SharePoint List Item that will contain some starter HTML for a Patch advertisement. 
  • Save our work! 

 

 

Updating Our Patch Record With Our URLs From SharePoint

  • Under the SharePoint Create Item Action for creating the Ad, AND after the SharePoint Create file action for creating the picture in the Picture Gallery, we’re going to add Dynamics Update record Actions that will be identical with one difference. 
  • The Organization Name, Entity Name, Record Identifier (set to Dynamic Content “Regarding”) should be the same. 
  • On the Ad side, the Update record should set the SharePoint Ad for Patch field to “Link to Item”. 

 

  • On the image side, the Update record should set the SharePoint Image for Patch field to the “Path”. 

 

Seeing It In Action 

Of course, I’ve been saving my work so let’s go ahead and give this a whirl. 

  • At the top right of your Flow you’ll see a Test button.  We’re going to click that and select “I’ll perform the trigger action.” 
  • To make this more interesting, I’m going to run this from SharePoint! I’ll update a patch and kick off my Flow from the embedded PowerApps Canvas App on my SharePoint home page. 

 

  • I select the patch, then I click the edit button (pencil icon at the top right). 
  • Notice the Attach file link and the Generate Advertisement switch.  We’ll use the first for our image and the second for generating our ad item in SharePoint. 

 

  • Finally, I click the checkmark at the top right to save my changes.  This kicks off our Flow in less than a minute, and when we navigate back over to the Flow we can see that it completed successfully. 

 Verifying the flow

  • I’ll hop back over to SharePoint to make sure that my ad was created and my entity image was set.  I’ll also make sure the high-quality image made it to the SharePoint Picture Library and the Note was deleted from the Patch record in Dynamics.  I also want to make sure the URLs for the ad and image in SharePoint were set back to the Patch record. 

verifying in SharePoint Verifying in SharePoint image

One last thing: When we store the image in a SharePoint Picture Gallery App we can retain the dimensions, size, and quality of the original image, unlike when storing the image as a Dynamics 365 entity image.  Check out the properties in the next screen shot and compare that to the properties on the SharePoint page in the same screen shot.   


Comparing image file sizes

Conclusion 

I hope you are enjoying this series and continue to tune in as the solution for our dad’s beloved patch collection grows.  I constantly see updates and upgrades to the Power Platform so I know Microsoft is working hard on making it even better. 

Part One: Identify, Define, Build, Migrate

An assortment of fire department patches
My dad passed away in 2015, leaving behind an extensive collection of fire trucks, patches, and other fire department (FD) memorabilia.  Before he passed, he gave us instructions to sell them and some direction on what to do with the money. After a few years of not really wanting to deal with it, my family decided to make a project out of it.  My mom, sister, wife, two daughters, and I are working our way through thousands of patches, hundreds of fire trucks, and who knows how many pendants and other trinket-like items, all while working full-time jobs (school for the kiddos) and from different locations.

Dad was great about logging his patches into a Microsoft Access database, but not so good about taking pictures of them, and even worse at logging his fire trucks and other items.  The objective and high-level steps for this project were quickly identified.

The Objective

  1. Help my mom liquidate my dad’s enormous fire department memorabilia collection.

The High-Level Steps

  1. Identify the technologies to be used. Easy!
    1. Microsoft Dynamics 365 & Common Data Service – our foundation.
    2. Microsoft PowerApps – mobile app for inventory capture.
    3. Microsoft Flow – move data and attachments around, auto-create ads.
    4. Microsoft SharePoint – store ads, images. Keep large files out of CDS.
  2. Complete a first-cut of the data schema and migrate the patches data from the Microsoft Access database.
  3. Configure a software solution for the family to use so we can all capture data to a single database. Solution must be user friendly!
  4. Configure processes that streamline the creation of advertisements and other data processing.
  5. Start capturing data and creating ads!

The Players

Not everyone in an organization has the same skill level and this will certainly lead to some challenges.  With that in mind, let’s look at the players involved in our project.

  1. Mom – Low technical skill – Capable of using anything “Excel-like” to capture data.
  2. Sister – Low-to-Medium – Arguably more advanced than mom, works on a Mac. Enough said.
  3. Wife – Medium – Works around Excel with ease, understands what I do from a high level.
  4. Kids – Low-to-Medium – two daughters, ages 12 and 10. Both are geniuses on any touch device but have no clue how to work around Excel.
  5. Me – High – developer and technology enthusiast!

I’ve spent the better part of my career as a .Net developer working in SharePoint and Dynamics, among other things, so it was easy for me to decide on a path forward.  Let’s get rolling!

Configure Data Schema and Migrate Microsoft Access Data

Just so no one thinks I’m lying here for the sake of this blog, let’s see what my dad was working with back in the day.  Yes, he was an ND alum.

Screenshot of patch entry form in Microsoft Access
Patch data in Microsoft Access

Side note: You see that column named “Patch Locator” highlighted in that last screen shot?  My dad kept his patches in old-school photo albums that he then stored in boxes.  This ‘locator’ field was his way of finding the patch once a box was full and stored away.  Genius dad!

As you can see, defining the schema for patches was pretty much done.  If we run into anything along the way, we can certainly add it.

  1. In Dynamics I created an un-managed solution named “Fire Department Items Solution” and added two custom entities, “Patch” and “Fire Truck.”
  2. I added all the fields my dad had in his Access database, and then I made sure that the out of box field “EntityImage” was available for displaying an image of the patch.

PRO TIP:  Dynamics 365 only allows you to have one image field on an entity and it is not configured out of the box.  To use this field, create a new field on your entity and use the data type “Image”.  This will automatically set the name of your field to “EntityImage” and the image you set there will be used as your entity image at the top of the entity form.

Screenshot of PowerApps
PowerApps details

  3. Before we save and publish, we need to enable Notes functionality for our entities. To do this select the entity from the left pane in the solution explorer, then make sure the “Notes (includes attachments)” checkbox is selected.

PRO TIP:  When you save an image to the EntityImage field it loses a lot of its quality.  Because we are using this data for inventory, including creating ads, we don’t want to lose the quality of our images.  For this reason, we will use the attachments collection for our entity to capture the actual high-quality image.  We will then use Microsoft Flow to take that image and store it as the EntityImage (which will lose quality) but also store the high-quality version in a SharePoint library.

PowerApps note functionality

  4. Finally, be sure to publish your customizations.

Migrating the Data

Now it’s time to migrate the data.  Since this was such a simple schema, I opted to use the out-of-box data import functionality that Dynamics 365 provides.  With that said, however, there are a few different ways to accomplish this migration. For me it was easy to simply export the Microsoft Access database to Excel, then use that file to import into Dynamics 365.

    1. Export your data into an Excel file from Microsoft Access.
    2. In Excel you’ll want to Save a Copy and save it as a CSV file.
      Save a copy as a CSV file
    3. Open the Patch View in Dynamics and use the out-of-box Import from Excel functionality to load our data.


    4. Choose the CSV file we just created when we saved the copy in Excel.

Choose your CSV file

    5. On this next screen, let’s click the button to Review our Field Mappings.

Review Field Mappings

    6. Here you’ll see some of my fields are mapped and some aren’t. Let’s get those shored up before we proceed.

Resolve mapped items

    7. Now that I’ve resolved all the field mappings, you’ll see we have green check marks across the board and we’re ready to import. Click the Finish Import button and you’re off.

Finish Import button

    8. You can check out the progress of the import by navigating to Settings → Data Management → Imports.

View Import progress

Summary & Next Steps

Let’s look at what we’ve done here.  On the surface it would appear we’ve simply gone into Dynamics 365 and configured a couple of entities.  But as we know, Dynamics 365 v9 was built on the Common Data Service (CDS), and that means our Dynamics data is now available to any other application that can connect to the CDS.  Why is this important for this project, you might ask?  That answer will become clear in the next part of this blog.  For now, here are some screenshots of how things look now that we have our patch data migrated.

A look at the imported data

Keep in mind, come the end of January 2019, everyone will need to switch over to Microsoft’s Unified Interface, and that’s what we’re using here for our patches.  This is an example of a model-driven PowerApp, which we’ll discuss in our next entry to this blog.

If you log in to your PowerApps environment using the same credentials as your Dynamics 365 environment, you should see your entities and the data migrated in this environment too.  Remember, once it’s in Dynamics, it’s available through the CDS.

A view of the migrated data

One thing to note: if you have 10,000-plus records like I do for patches, CDS in the browser may freeze trying to display them all.  I hope Microsoft resolves this at some point so that it handles paging and displaying data as gracefully as the D365 web client does.

Stay tuned for my next entry where we’ll set up our SharePoint Online site, create a simple canvas PowerApp for inventory management on our mobile devices, and then set up a Flow to help move some things around and automate the creation of our online advertisements.

Thanks for reading!

AIS’ work with the NFL Players Association (NFLPA) was showcased as a Microsoft Featured Case Study. This customer success story was our most recent project with NFLPA, as they’ve sought our help to modernize multiple IT systems and applications over the years. We were proud to tackle the latest challenge: Creating a single, shared player management system, using Dynamics 365, for the NFLPA and all its sister organizations.

The Challenge

As the nonprofit union for NFL players, the NFLPA constantly looks for ways to better serve its members—current and former NFL players—during and after their football careers. But multiple player management systems across the associated support organizations resulted in poor customer service and missed opportunities for NFLPA members. Valuable data captured by one department wasn’t accessible to another, causing headaches and delays when licensing opportunities arose and limiting the organization’s ability to be proactive about the challenges members face after retirement.

The Solution: A Single Source

We used Microsoft Dynamics 365 to create a single, shared player management system, called PA.NET, for all the NFLPA organizations. We customized Dynamics 365 extensively to meet the unique needs of the NFLPA and integrated it with the organization’s Office 365 applications.

At the same time, we shifted all legacy IT systems (websites, financial applications, and others) to Microsoft Azure, giving NFLPA an entirely cloud-based business.

The Results: More Opportunities, More Time, Fewer Costs

With one master set of player data and powerful reporting tools that employees use to find answers to their own questions, the NFLPA can uncover marketing and licensing opportunities for more players and identify other ways to help its members.

Because PA.NET automates so many previously manual processes, it frees up hours of drudge work each week for NFLPA employees, which they convert to creative problem solving for members. And its IT staff has freed up 30 percent more time by not having to babysit infrastructure, time it uses to come up with new technology innovations.

By moving its business systems to the cloud, the NFLPA can scale its infrastructure instantly when traffic spikes—such as when football season ends and licensing offers heat up. No more over-provisioning servers to meet worst-case needs. In fact, no more servers, period. With cloud-based systems, the NFLPA no longer has to refresh six-figure server and storage systems every few years.

Read the full Microsoft Featured Case Study here to learn more about our work and more about great work the NFLPA does on behalf of its members.

SCORE LIKE NFLPA. WORK WITH AIS. Transformation is on the horizon for your organization. All it takes is the right partner. With the experience, talent, and best practices to lead you to success, AIS is the right partner for you.