Part One: Identify, Define, Build, Migrate

An assortment of fire department patches

My dad passed away in 2015, leaving behind an extensive collection of fire trucks, patches, and other fire department (FD) memorabilia.  Before he passed, he gave us instructions to sell them and some direction on what to do with the money.  After a few years of not really wanting to deal with it, my family decided to make a project out of it.  My mom, sister, wife, two daughters, and I are working our way through thousands of patches, hundreds of fire trucks, and who knows how many pendants and other trinket-like items, all while working full-time jobs (school for the kiddos) and from different locations.

Dad was great about logging his patches into a Microsoft Access database, but not so good about taking pictures of them, and even worse at logging his fire trucks and other items.  The objective and high-level steps for this project were quickly identified.

The Objective

  1. Help my mom liquidate my dad’s enormous fire department memorabilia collection.

The High-Level Steps

  1. Identify the technologies to be used. Easy!
    1. Microsoft Dynamics 365 & Common Data Service – our foundation.
    2. Microsoft PowerApps – mobile app for inventory capture.
    3. Microsoft Flow – move data and attachments around, auto-create ads.
    4. Microsoft SharePoint – store ads, images. Keep large files out of CDS.
  2. Complete a first-cut of the data schema and migrate the patches data from the Microsoft Access database.
  3. Configure a software solution for the family to use so we can all capture data to a single database. Solution must be user friendly!
  4. Configure processes that streamline the creation of advertisements and other data processing.
  5. Start capturing data and creating ads!

The Players

Not everyone in an organization has the same skill level, and this will certainly lead to some challenges.  With that in mind, let’s look at the players involved in our project.

  1. Mom – Low technical skill – Capable of using anything “Excel-like” to capture data.
  2. Sister – Low-to-Medium – Arguably more advanced than mom, works on a Mac. Enough said.
  3. Wife – Medium – Works around Excel with ease, understands what I do from a high level.
  4. Kids – Low-to-Medium – Two daughters, ages 12 and 10. Both are geniuses on any touch device but have no clue how to work around Excel.
  5. Me – High – Developer and technology enthusiast!

I’ve spent the better part of my career as a .NET developer working in SharePoint and Dynamics, among other things, so it was easy for me to decide on a path forward.  Let’s get rolling!

Configure Data Schema and Migrate Microsoft Access Data

Just so no one thinks I’m lying here for the sake of this blog, let’s see what my dad was working with back in the day.  Yes, he was an ND alum.

Patch data in Microsoft Access

Side note: You see that column named “Patch Locator” highlighted in that last screenshot?  My dad kept his patches in old-school photo albums that he then stored in boxes.  This ‘locator’ field was his way of finding the patch once a box was full and stored away.  Genius dad!

As you can see, defining the schema for patches was pretty much done.  If we run into anything along the way, we can certainly add it.

  1. In Dynamics I created an unmanaged solution named “Fire Department Items Solution” and added two custom entities, “Patch” and “Fire Truck.”
  2. I added all the fields my dad had in his Access database, and then I made sure that the out-of-the-box field “EntityImage” was available for displaying an image of the patch.

PRO TIP:  Dynamics 365 only allows you to have one image field on an entity and it is not configured out of the box.  To use this field, create a new field on your entity and use the data type “Image”.  This will automatically set the name of your field to “EntityImage” and the image you set there will be used as your entity image at the top of the entity form.
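
If you’d rather script that field than click through the designer, here’s a rough sketch using the SDK’s metadata messages. This is just one way to do it, and the “ais_patch” schema name is a placeholder for whatever you named your custom entity:

```csharp
// A minimal sketch of creating the entity image field through the SDK instead
// of the solution designer. Assumes a standard IOrganizationService connection;
// "ais_patch" is a hypothetical schema name for the custom Patch entity.
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Messages;
using Microsoft.Xrm.Sdk.Metadata;

public static class ImageFieldSetup
{
    public static void AddEntityImage(IOrganizationService service)
    {
        var request = new CreateAttributeRequest
        {
            EntityName = "ais_patch", // hypothetical entity schema name
            Attribute = new ImageAttributeMetadata
            {
                // Dynamics forces the schema name to EntityImage;
                // only one image attribute is allowed per entity.
                SchemaName = "EntityImage",
                DisplayName = new Label("Patch Image", 1033),
                Description = new Label("Primary image shown on the form header.", 1033),
                RequiredLevel = new AttributeRequiredLevelManagedProperty(
                    AttributeRequiredLevel.None)
            }
        };
        service.Execute(request);
    }
}
```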

PowerApps details

  3. Before we save and publish, we need to enable Notes functionality for our entities. To do this, select the entity from the left pane in the solution explorer, then make sure the “Notes (includes attachments)” checkbox is selected.
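
(If you prefer code for this step too, here’s a small sketch of flipping the same switch through the metadata API; again, “ais_patch” is a placeholder schema name.)

```csharp
// A sketch of enabling Notes programmatically -- the same thing the
// "Notes (includes attachments)" checkbox does in the solution explorer.
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Messages;

public static class NotesSetup
{
    public static void EnableNotes(IOrganizationService service, string entityLogicalName)
    {
        var retrieve = new RetrieveEntityRequest
        {
            LogicalName = entityLogicalName, // e.g. the hypothetical "ais_patch"
            EntityFilters = Microsoft.Xrm.Sdk.Metadata.EntityFilters.Entity
        };
        var metadata = ((RetrieveEntityResponse)service.Execute(retrieve)).EntityMetadata;

        // HasNotes can be turned on but never back off -- same rule as the UI.
        metadata.HasNotes = true;
        service.Execute(new UpdateEntityRequest { Entity = metadata });
    }
}
```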

PRO TIP:  When you save an image to the EntityImage field it loses a lot of its quality.  Because we are using this data for inventory, including creating ads, we don’t want to lose the quality of our images.  For this reason, we will use the attachments collection for our entity to capture the actual high-quality image.  We will then use Microsoft Flow to take that image and store it as the EntityImage (which will lose quality) but also store the high-quality version in a SharePoint library.
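
To show the moving parts, here’s roughly what that Flow will be doing, sketched in C# against the Organization Service. The entity name and IDs below are placeholders; the real work will live in Flow:

```csharp
// A rough C# equivalent of what our Flow will do with a note attachment:
// copy the full-quality bytes into the (downsized) entityimage, keeping the
// original bytes to push to a SharePoint library. Hypothetical names:
// "ais_patch" entity, annotationId of the note carrying the photo.
using System;
using Microsoft.Xrm.Sdk;

public static class PatchImageSync
{
    public static byte[] PromoteNoteImage(
        IOrganizationService service, Guid patchId, Guid annotationId)
    {
        // The note (annotation) stores the attachment as base64 in documentbody.
        var note = service.Retrieve(
            "annotation", annotationId,
            new Microsoft.Xrm.Sdk.Query.ColumnSet("documentbody"));
        byte[] original = Convert.FromBase64String(
            note.GetAttributeValue<string>("documentbody"));

        // Setting entityimage stores a system-resized copy; quality is lost
        // here, which is why we also keep the original for SharePoint.
        var patch = new Entity("ais_patch", patchId);
        patch["entityimage"] = original;
        service.Update(patch);

        return original; // caller uploads this full-quality copy to SharePoint
    }
}
```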

PowerApps note functionality

  4. Finally, be sure to publish your customizations.

Migrating the Data

Now it’s time to migrate the data.  Since this was such a simple schema, I opted to use the out-of-box data import functionality that Dynamics 365 provides.  With that said, however, there are a few different ways to accomplish this migration. For me it was easy to simply export the Microsoft Access database to Excel, then use that file to import into Dynamics 365.

    1. Export your data into an Excel file from Microsoft Access.
    2. In Excel you’ll want to Save a Copy and save it as a CSV file.
      Save a copy as a CSV file
    3. Open the Patch View in Dynamics and use the out-of-box Import from Excel functionality to load our data.


    4. Choose the CSV file we just created when we saved the copy in Excel.

Choose your CSV file

    5. On this next screen, let’s click the button to Review our Field Mappings.

Review Field Mappings

    6. Here you’ll see some of my fields are mapped and some aren’t. Let’s get those shored up before we proceed.

Resolve mapped items

    7. Now that I’ve resolved all the field mappings, you’ll see we have green check marks across the board and we’re ready to import. Click the Finish Import button and you’re off.

Finish Import button

    8. You can check out the progress of the import by navigating to Settings > Data Management > Imports.

View Import progress
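
As an aside, if the import wizard ever falls short, or you just prefer code, the same load can be scripted against the Organization Service. Here’s a rough sketch using ExecuteMultiple, with placeholder entity and column names:

```csharp
// A hedged sketch of loading the same CSV rows programmatically with
// ExecuteMultiple (batches of up to 1,000 requests per call).
// "ais_patch" and the column names are hypothetical placeholders.
using System.Collections.Generic;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Messages;

public static class PatchImporter
{
    public static void Import(IOrganizationService service,
        IEnumerable<(string Name, string Locator)> rows)
    {
        var batch = new ExecuteMultipleRequest
        {
            Settings = new ExecuteMultipleSettings
            {
                ContinueOnError = true,
                ReturnResponses = false
            },
            Requests = new OrganizationRequestCollection()
        };

        foreach (var row in rows)
        {
            var patch = new Entity("ais_patch");
            patch["ais_name"] = row.Name;            // hypothetical column
            patch["ais_patchlocator"] = row.Locator; // dad's locator field
            batch.Requests.Add(new CreateRequest { Target = patch });

            if (batch.Requests.Count == 1000) // service-side batch limit
            {
                service.Execute(batch);
                batch.Requests.Clear();
            }
        }
        if (batch.Requests.Count > 0) service.Execute(batch);
    }
}
```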

Summary & Next Steps

Let’s look at what we’ve done here.  On the surface, it would appear we’ve simply gone into Dynamics 365 and configured a couple of entities.  But as we know, Dynamics 365 v9 was built on the Common Data Service (CDS), and that means our Dynamics data is now available to any other application that can connect to the CDS.  Why is this important for this project, you might ask?  That answer will become clear in the next part of this blog.  For now, here are some screenshots of how things look now that we have our patch data migrated.

A look at the imported data

Keep in mind, come the end of January 2019, everyone will need to switch over to Microsoft’s Unified Interface, and that’s what we’re using here for our patches.  This is an example of a model-driven PowerApp, which we’ll discuss in our next entry to this blog.

If you log in to your PowerApps environment using the same credentials as your Dynamics 365 environment, you should see your entities and the data migrated in this environment too.  Remember, once it’s in Dynamics, it’s available through the CDS.

A view of the migrated data

One thing to note: if you have 10,000-plus records like I do for patches, CDS in the browser may freeze trying to display them all.  I would hope Microsoft resolves this at some point so that it handles paging and displaying data as gracefully as the D365 web client does.
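
In the meantime, if you need to work with that many records from code, the SDK does support cookie-based paging. Here’s a quick sketch, again with placeholder names, of walking the patches page by page:

```csharp
// A sketch of paging through 10,000+ patches rather than pulling them in one
// shot -- the same cookie-based paging the D365 web client relies on.
// Assumes the hypothetical "ais_patch" entity and columns.
using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

public static class PatchPager
{
    public static void ForEachPage(IOrganizationService service,
        Action<EntityCollection> handlePage)
    {
        var query = new QueryExpression("ais_patch")
        {
            ColumnSet = new ColumnSet("ais_name", "ais_patchlocator"),
            PageInfo = new PagingInfo { PageNumber = 1, Count = 500 }
        };

        EntityCollection page;
        do
        {
            page = service.RetrieveMultiple(query);
            handlePage(page);

            // Carry the paging cookie forward so the server can resume cheaply.
            query.PageInfo.PageNumber++;
            query.PageInfo.PagingCookie = page.PagingCookie;
        } while (page.MoreRecords);
    }
}
```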

Stay tuned for my next entry where we’ll set up our SharePoint Online site, create a simple canvas PowerApp for inventory management on our mobile devices, and then set up a Flow to help move some things around and automate the creation of our online advertisements.

Thanks for reading!

2017 was another great year overall here at AIS, and also marked the fifth anniversary of our blog! We hope you enjoyed reading and found our posts helpful and interesting. We’re all pretty passionate about what we do here, and look forward to sharing more thoughts, insights and solutions in 2018 and beyond!

As we close out the year, here are the top 10 most read and shared blog posts of 2017:

1) Office 365 Groups vs. Microsoft Teams by Jason Storch

2) Lift & Shift: Migrating Legacy Applications to Azure Cloud by Nasir Mirza

3) Dockerization of Azure PaaS (Beyond Azure Container) by Vishwas Lele

4) Managed Images in Azure (Create & Deploy) by Justin Baca

5) Building Stateless Microservice Using Microsoft Service Fabric Series by Kasi Srinivasan

6) Azure PaaS Options: When to Use What? by Vishwas Lele

7) A three-way tie (!) for Parts One, Two & Three of Automated Deployments with Azure Resource Manager Templates, Azure Automation, & Octopus Deploy by Harun Davood

8) It’s Time to Review the Failure Modes of Your #cloud App(s) by Vishwas Lele

9) Pattern Matching vs. Deep Learning by Vishwas Lele

10) A Fix for the SharePoint Search Query/Result Mismatch by Clint Richardson

Happy New Year to all our readers and bloggers! Be sure to follow AIS on Twitter, Facebook or LinkedIn so you’ll never miss a post.

My decision to join AIS six years ago was a revelation. After almost seven years spent working as an embedded IT analyst for various government customers, I joined AIS to support a customer who was implementing SharePoint.  I soaked up everything I could about this (at the time) brave new world of SharePoint. I loved it.

SharePoint 2003 had been available for use in my previous office where I had initially set up out-of-the-box team sites for working groups to support a large department-wide initiative. I found it empowering to quickly set up sites, lists and libraries without any fuss (or custom coding) to get people working together. Working with my new team, I gained insight into what we could do with this tool in terms of workflow, integration and branding. It got even better when we migrated to SharePoint 2007.  We made great strides in consolidating our websites and communicating to those who were interested exactly what the tools could do in terms of collaboration and knowledge management.

This ability for a power user to quickly create a variety of new capabilities exposed a deeper customer need – easier communications with IT.  While we had all this great expertise and firepower to create and maintain IT tools and services, our core customer base did not have an easy way to quickly and reliably communicate their needs in a manner that matched their high operational tempo. It was a problem. We needed a way for our customers to quickly and easily communicate with us in order to really hear what they needed to meet their mission goals and work more effectively.

The mission was critical, and the task complex: ushering a print publication like Rolling Stone, a Bondi publication, into the digital age by providing them with a turnkey solution to present their print magazine archives online, for viewing on high-resolution connected devices of all shapes and sizes. Unlike other digital versions of magazines on the market, the new platform would allow the publisher to monetize its own unique brand through the years.

The Challenge

AIS needed to address multiple technical areas to provide the most viable solution for Rolling Stone and other archives.

  • Speed
  • Scale
  • Security
  • Continuous Delivery
  • High-Definition Presentation

Rolling Stone’s existing “print to digital” solution supported only 100 issues, and AIS was challenged to multiply that output tenfold while reducing management costs and giving Rolling Stone the freedom to design their interface in a way that matched and enhanced their other digital presences.

Solution

AIS implemented a multi-tier solution in the creation of the Bondi Archive Platform. The platform we built consists of a flexible, scalable website architecture using HTML5, JavaScript, ASP.NET MVC and SQL Azure. The system allows viewers to see exact replicas of the original print issues, something not offered by any other platform. Users can navigate using a mouse, keyboard or touch and can zoom in or conduct complex searches. Bookmarking inside the archive is available as is print, based on the publisher’s preference.

Results

The new archive platform has allowed Rolling Stone to join the online revolution and bring their print content online in its original format and context, thereby retaining copyrights. By serving the content directly from their own website or from the cloud, the company can avoid content restrictions and fees imposed by third-party aggregation platforms and app stores. The project and platform have been an unequivocal success. Rolling Stone chose to enhance its print subscriptions by offering the full digital archive at no extra charge. By tightly integrating their online content with the digital archive, they have achieved a deeper level of interaction with their readership.

I vividly remember the iconic scene from the 1995 box office hit Apollo 13 where a team of NASA engineers gathered around a table with a collection of mishmash spaceship junk. From this collection, the team had to create a square air filter to fit in a round receptacle so that the astronauts would not asphyxiate on CO2 in space. It’s an intense, life-or-death scenario of literally making a square peg fit in a round hole, where “failure is not an option.”

Working as a business analyst for our federal government clients means that budget, time, and resource constraints almost always play a major role in any development effort. This challenge requires our team to use a bit of ingenuity and a mixed bag of tools to create a solution for our customers.

Workflow in SharePoint 2013 has undergone quite the architectural change from its SharePoint 2010 ancestor.  I documented many of the major changes in a previous blog post, “What Changed in SharePoint 2013 Workflow? Pretty Much Everything.”  While SharePoint 2013 is backwards-compatible with SharePoint 2010 workflows, you may decide that the benefits of the new design are needed.  The purpose of this post is to illustrate the new considerations you’ll need to keep in mind when targeting SharePoint 2013 workflows. The SharePoint 2010 project we’ll use for this example is the one from my very first AIS blog post, “Developing Multi-Tiered Solutions for SharePoint.”

Web API methods mockup screenshot
Figure 1 - Sample WebAPI methods for Section Document Merge and Post-Merge Actions.

In our example project there are actually two workflows, SectionDocumentApprovalState (SDAS) and MasterDocumentApproval (MDA). The MDA checks whether the various SDAS-related sections have been merged and finalized, then notifies specific users for approval of the final document. An instance of SDAS is created for each section of the Master Document and monitors the editing and approval of that specific section. We’ll focus on just the SDAS workflow. In the previous post, I referred to the workflows as being part of the Presentation Layer and the custom code they called as part of the Business Layer.  Both of these layers will change in a SharePoint 2013 workflow solution.
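
To give a feel for the shape of the endpoints in Figure 1, here’s a hypothetical sketch of the Web API controller. The routes and method bodies are illustrative stand-ins, not the project’s actual code:

```csharp
// Hypothetical controller shape; in the 2013 model the workflow invokes
// these methods over HTTP instead of running compiled code inside
// SharePoint as it did in 2010. Assumes a conventional
// "api/{controller}/{action}" route registration.
using System.Web.Http;

public class SectionDocumentController : ApiController
{
    // POST api/sectiondocument/merge?masterId=1&sectionId=2 (assumed route)
    [HttpPost]
    public IHttpActionResult Merge(int masterId, int sectionId)
    {
        // Business layer: fold the approved section back into the master document.
        return Ok();
    }

    // POST api/sectiondocument/postmerge?masterId=1 (assumed route)
    [HttpPost]
    public IHttpActionResult PostMerge(int masterId)
    {
        // Business layer: notify approvers once every section is merged.
        return Ok();
    }
}
```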


Our work with Rolling Stone and Bondi Digital Publishing is yet another example of how AIS can develop technology that creates new revenue streams for publishers. We built a digital distribution platform to usher print publications like Rolling Stone into the digital age – by providing them with a turnkey solution to deploy print magazine archives online for viewing on desktops, laptops and mobile devices. For Rolling Stone, the initial launch included more than 1,000 issues from 1967 to the present.

Click here to read more about the distribution platform and how we customized it for the Rolling Stone archives. 

In this blog I’ll discuss some post-release reporting issues that we faced on one of our projects and the solutions we implemented. On the technology side, we had SQL Server 2008 R2 and an MVC 4.0 application (both hosted in Amazon Web Services) in our production environment.

The Problem

The post-production release reporting system was not performing to user expectations. For most of the high-volume reports (50K to 200K rows in the report output), we were getting request timeout errors. The client SLA for response time was two minutes; hence any report, big or small, had to return data within two minutes. All the reports were designed using SQL Server Reporting Services 2008 R2. In all, there were close to 40 reports with such timeout issues.

I recently completed a large document management system on SharePoint 2010 that used FAST Search and claims-based authentication. The client wanted to secure and limit access to customer-specific documents based on data coming from their CRM system.

We decided to implement a custom claim provider that would query the CRM system at login for customer claims based on the user ID. On upload (based on the customer that was assigned to the document), we used the content organizer to route the document to the correct site, library and folder based on the organization and security rules that we had. Each library had a claim for the customer assigned to it so only users with that claim could view the documents in the library. We would use search for the UI so that the users had a single place to find and view the documents. Sounds simple, right?

It should’ve been.

Unfortunately, the implementation was anything but simple. From the beginning, we hit the core limits of SharePoint 2010, FAST and Claims. Now that we’ve made it to the end, I want to talk about the limits we ran into and steps you can take in your design to avoid them.
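
Before digging into those limits, here’s a trimmed sketch of the claim augmentation idea itself. The CRM lookup is a hypothetical stand-in, and a real provider must also implement the rest of the SPClaimProvider overrides; this only shows the core logic that runs at login:

```csharp
// A trimmed sketch of the claim augmentation logic. In the real provider
// this runs inside SPClaimProvider.FillClaimsForEntity, which SharePoint
// calls at login to augment the user's token.
using System.Collections.Generic;
using Microsoft.SharePoint.Administration.Claims;

// Hypothetical stand-in for the service that queries CRM by user ID.
public static class CrmCustomerLookup
{
    public static IEnumerable<string> GetCustomers(string userId)
    {
        // In the real system this called the CRM web services; stubbed here.
        return new string[0];
    }
}

public static class CustomerClaimAugmentation
{
    // Hypothetical claim type; each library was ACLed with the matching
    // customer claim so only users holding it could see its documents.
    private const string CustomerClaimType =
        "http://schemas.example.com/claims/customer";

    public static void AddCustomerClaims(
        string userId, List<SPClaim> claims, string providerName)
    {
        foreach (string customerId in CrmCustomerLookup.GetCustomers(userId))
        {
            claims.Add(new SPClaim(
                CustomerClaimType,
                customerId,
                "http://www.w3.org/2001/XMLSchema#string",
                SPOriginalIssuers.Format(
                    SPOriginalIssuerType.ClaimProvider, providerName)));
        }
    }
}
```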

Swiss army knife

Here at AIS, we’ve found Windows Azure Blob Storage to be an inexpensive, fast hosting solution for non-text or server-side loaded resources. But what if we want to use client-side JavaScript to load HTML fragments or JSON data directly from blobs? Under normal circumstances this is prevented by JavaScript’s Same Origin Policy; that is, you can’t load HTML fragments or JSON from another domain, subdomain, port or protocol.

One commonly used solution to this restriction is JSONP, but this is not available with Azure Blob Storage. Another modern option is Cross-Origin Resource Sharing (CORS), but it is also unavailable on Azure Blob Storage and not supported in some legacy browsers.

We could consider a server-side solution, such as employing an Azure Web Role to read text-based content from blob storage and serve it up from the original server. But this approach can be both wasteful and performance-inhibiting.
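
For illustration, a minimal sketch of that (imperfect) proxy approach might look like the following, using the storage client library of the day; the container name and connection string setting are hypothetical:

```csharp
// A minimal sketch of the server-side fallback described above: an MVC
// action that reads a text blob and re-serves it from our own origin, so
// client-side JavaScript can fetch it without tripping the Same Origin
// Policy. Container and connection string names are hypothetical.
using System.Configuration;
using System.Web.Mvc;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

public class BlobProxyController : Controller
{
    public ActionResult Fragment(string name)
    {
        var account = CloudStorageAccount.Parse(
            ConfigurationManager.AppSettings["StorageConnectionString"]);
        var container = account.CreateCloudBlobClient()
            .GetContainerReference("fragments"); // hypothetical container

        // Pull the HTML fragment from blob storage and serve it same-origin.
        string html = container.GetBlobReference(name).DownloadText();
        return Content(html, "text/html");
    }
}
```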
