Sound Familiar?

It’s not a sentiment you would expect from most IT decision-makers. However, it’s something we hear from an increasing number of organizations.

The benefits of a well-thought-out cloud transformation roadmap are not lost on them.

  • They know that, in an ideal world, they ought to start with an in-depth assessment of their application portfolio, in line with the best practice – “migrate your capabilities, not apps or VMs”.
  • They also realize the need to develop a robust cloud governance model upfront.
  • And ultimately, they understand the need to undertake an iterative migration process that takes into account “organizational change management” best practices.

At the same time, these decision-makers face real challenges with their existing IT infrastructure that simply cannot wait months or years for a successful cloud transformation to take shape. They can’t get out of their on-premises data centers soon enough. This notion isn’t limited to organizations with fast-approaching Data Center (DC) lease renewal deadlines or end-of-support products, either.

So, how do we balance these two competing objectives?

  • Immediate need to move out of the DC
  • Carefully crafted long-term cloud transformation

A Two-Step Approach to Your Cloud Transformation Journey

From our experience with a broad range of current situations, goals, and challenges, we recommend a two-step cloud transformation approach that addresses both your immediate challenges and the organization’s long-term vision for cloud transformation.

  1. Tactical “Lift-n-Shift” to the Cloud – As the name suggests, move the current DC footprint as is (VMs, databases, storage, network, etc.) to Azure
  2. Strategic Cloud Transformation – Once operational in the cloud, incrementally and opportunistically move parts of your application portfolio to higher-order Azure PaaS/cloud-native services

Tactical “Lift-n-Shift” to the Cloud

Lift n Shift Approach to Cloud Transformation

On the surface, step #1 above may appear wasteful. After all, we are duplicating your current footprint in Azure. But keep in mind that step #1 is designed for completion in days or weeks, not months or years. As a result, the duplication is minimized. At the same time, step #1 immediately puts you in a position to leverage Azure capabilities, giving you tangible benefits with minimal to no changes to your existing footprint.

Here are a few examples of benefits:

  • Improve the security posture – Once you are in Azure, you tap into security capabilities such as intrusion detection and protection against denial-of-service attacks simply by being in Azure. Notice that I deliberately did not cite Security Information and Event Management (SIEM) tools like Azure Sentinel, since, technically, you can take advantage of Azure Sentinel for on-premises workloads as well.
  • Replace aging hardware – Your hardware may be getting old but isn’t old enough for a Capex-powered refresh. Moving your VMs to Azure decouples you from the underlying hardware. “But won’t that be expensive, since you are now paying by usage per minute?” you ask. Not necessarily, and certainly not in the long run. Consider options like Reserved Instance (RI) pricing, which can offer up to an 80% discount based on a one- or three-year commitment.

Furthermore, you can combine RI with Azure Hybrid Benefit (AHUB), which provides discounts for licenses you already own. Finally, don’t forget to take into account the savings from decreased needs for power, networking, real estate, and the resources required to manage all the on-premises assets. Even if you can’t get out of the DC lease completely, you may be able to negotiate a modular reduction of your DC footprint. Gartner research suggests that, over time, the cloud can become cost-effective.
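
To make the RI + AHUB math concrete, here is a rough PowerShell sketch. All rates below are hypothetical; actual pricing varies by region, VM size, OS, and agreement terms.

# Rough illustration only: every rate here is a made-up placeholder.
$paygPerHour   = 0.20     # assumed pay-as-you-go rate (USD/hour)
$riDiscount    = 0.60     # assumed 3-year Reserved Instance discount
$ahubDiscount  = 0.40     # assumed Azure Hybrid Benefit licensing savings
$hoursPerMonth = 730

$paygMonthly      = $paygPerHour * $hoursPerMonth
$effectiveMonthly = $paygPerHour * (1 - $riDiscount) * (1 - $ahubDiscount) * $hoursPerMonth

"Pay-as-you-go: {0:N2} USD/month; with RI + AHUB: {1:N2} USD/month" -f $paygMonthly, $effectiveMonthly

With these illustrative numbers, the combined effect is roughly a 76% reduction, which is how stacked RI and AHUB discounts can approach the “up to 80%” figure.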

AMP Move out of Data Center

Source – https://blogs.gartner.com/marco-meinardi/2018/11/30/public-cloud-cheaper-than-running-your-data-center/

  • Disaster Recovery (DR) – Few organizations have a DR setup that is conducive to ongoing DR tests, yet having an effective DR plan is one of the most critical responsibilities of IT. Since geo-replication is built into Azure Storage, your disks can be replicated to a paired Azure region hundreds of miles away. Given this, DR is almost out-of-the-box.
  • Extended lease of life on out-of-support software – If you are running out-of-support products such as Windows Server 2008 or SQL Server 2008, moving to Azure extends the security updates for up to three years from the “end of support” date.
  • Getting out of the business of “baby-sitting” database servers – Azure SQL Managed Instance offers you the ability to take your existing on-premises SQL Server databases and move them to Azure with minimal downtime. Once your database is running as an Azure SQL Managed Instance, you don’t have to worry about patching and backups, significantly reducing the cost of ownership.
  • Take baby steps towards automation and self-service – Self-service is one of the key focus areas for most IT organizations. Since every aspect of Azure is API-driven, organizations can take baby steps towards automated provisioning (a minimal example follows this list).
  • Get closer to a data lake – I am sure you have heard the quote “AI is the new electricity”. We also know that Artificial Intelligence (AI) needs lots and lots of data to train Machine Learning (ML) algorithms. By moving to Azure, it is that much easier to capture the “data exhaust” coming out of your applications in a service like Azure Data Lake. In turn, Azure Data Lake can help turn this data into intelligence.
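
As a taste of that API-driven automation, here is a minimal sketch that provisions a VM with the Az PowerShell module. The resource group, VM name, and image are hypothetical placeholders, and an authenticated session is assumed.

# Minimal sketch: assumes the Az PowerShell module and an authenticated session.
# Resource group, VM name, location, and image are hypothetical placeholders.
Connect-AzAccount
$cred = Get-Credential    # local administrator account for the new VM

New-AzVM -ResourceGroupName "rg-migration" `
         -Name "vm-app01" `
         -Location "eastus" `
         -Image "Win2019Datacenter" `
         -Credential $cred

The same call could just as easily live in a script library or pipeline, which is the first baby step towards self-service provisioning.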

Strategic Cloud Transformation

Strategic Cloud Transformation

Once you have completed step #1 by moving your on-premises assets to the cloud, you are now in a position to undertake continuous modernization efforts aligned to your business priorities.

Common approaches include:

  • Revise – Capture applications and application tiers “as-is” in containers and run them on a managed orchestrator like Azure Kubernetes Service (a brief provisioning sketch follows this list). This approach requires minimal changes to the existing codebase. For more details on this approach, including a demo, read Migrate and Modernize with Kubernetes on Azure Government.
  • Refactor – Modernize by re-architecting to target Platform as a Service (PaaS) and “serverless” technologies. This approach requires more significant recoding to target PaaS services but allows you to take advantage of cloud provider managed services. For more information, check out our “Full PaaS” Approach to Modernizing Legacy Apps.
  • Rebuild – Complete rewrite of the applications using cloud-native technologies like Kubernetes, Envoy, and Istio. Read our blog, What Are Cloud-Native Technologies & How Are They Different from Traditional PaaS Offerings, for more information.
  • Replace – Substitute an existing application, in its entirety, with Software as a Service (SaaS) or an equivalent application developed using a no-code/low-code platform.
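
As a hedged taste of the “Revise” path, here is a minimal Az PowerShell sketch that stands up a managed AKS cluster to host containerized application tiers. The resource group and cluster names are hypothetical placeholders.

# Minimal sketch: assumes the Az.Aks PowerShell module and an authenticated session.
# Resource group and cluster names are hypothetical placeholders; defaults
# (node size, networking, SSH keys) are left to the cmdlet.
New-AzAksCluster -ResourceGroupName "rg-modernize" `
                 -Name "aks-apps01" `
                 -NodeCount 3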

CHECK OUT OUR WHITEPAPER & LEARN ABOUT CLOUD-BASED APP MODERNIZATION APPROACHES

The following table summarizes the various approaches for modernization in terms of factors such as code changes, operational costs, and DevOps maturity.

Compare App Modernization Approaches

Azure Migration Program (AMP)

Microsoft squarely aligns with this two-step approach. At the recent Microsoft partner conference, #MSInspire, Julia White announced the Azure Migration Program (AMP).

AMP brings together the following:

Wrapping Up

A two-step migration offers a programmatic approach to unlock the potential of the cloud quickly. You’ll experience immediate gains from a tactical move to the cloud and long-term benefits from the strategic cloud transformation that follows. Microsoft programs like AMP, combined with more than 200 Azure services, make this approach viable. If you’re interested in learning more about how you can get started with AMP, and which migration approach makes the most sense for your business goals, reach out to AIS today.

GET YOUR ORGANIZATION ON THE RIGHT TRACK TO TRANSFORMATION. CONTACT AIS TODAY TO DISCUSS YOUR OPTIONS.

How to use galleries to create dynamic entries in a data source in PowerApps

In this article, we will see how we can use galleries in PowerApps to create multiple rows for adding records to a data source. We will create dynamic entries in a gallery that looks like a form and adds/deletes a line/row with the press of a button.

Scenario: XYZ Inc. is a sales company that deals in sales of hardware components from manufacturers to retailers. User A is an on-field sales agent of XYZ Inc. and uses a static application to enter order details from a customer. This application is connected to a SharePoint list and creates a new item on the list whenever User A enters the details and hits the submit button. The application provides the ability to enter only one order detail at a time, so User A ends up putting more effort and time into entering those details.

We designed a customized PowerApp for XYZ Inc. where User A authenticates and lands on the Order Details page. User A can view all their previous entries, search for an order by entering the name of the customer, vendor, invoice number, etc. Functionality to add details is provided within the app. User A clicks the add new orders button and a form gallery is displayed. User A can add multiple records by creating new lines with the press of a button in the form gallery. A local collection with all the entries on the form is created in PowerApps. Once User A hits the “Finish & Save” button, an item for each entry is created on the SharePoint List and the Order Details gallery is updated with these newly added records.

Let’s look at the component-wise description of the controls in the app. The schema for data on the SharePoint List is:

S.No Column Name Column Data Type
1 Title (Order Number) Single Line of Text (255 Chars)
2 Customer Single Line of Text (255 Chars)
3 Shipping Address Single Line of Text (255 Chars)
4 Billing Address Single Line of Text (255 Chars)

On the App -> OnStart property, the expression used is:

ClearCollect(DynamicGallery, {Value: 1});
Clear(OrderCollection);
Collect(OrderCollection, Filter(OrderDets, StartsWith(Title, "Order1")));
Collect(OrderCollection, Filter(OrderDets, StartsWith(Title, "Order2")));
Collect(OrderCollection, Filter(OrderDets, StartsWith(Title, "Order3")));
Collect(OrderCollection, Filter(OrderDets, StartsWith(Title, "Order4")));
Collect(OrderCollection, Filter(OrderDets, StartsWith(Title, "Order5")));
Collect(OrderCollection, Filter(OrderDets, StartsWith(Title, "Order6")));
Collect(OrderCollection, Filter(OrderDets, StartsWith(Title, "Order7")));
Collect(OrderCollection, Filter(OrderDets, StartsWith(Title, "Order8")));
Collect(OrderCollection, Filter(OrderDets, StartsWith(Title, "Order9")))

Explanation: Here, I am creating a collection “DynamicGallery”; its records drive the number of rows in the gallery control used for creating new orders. I am creating another collection, “OrderCollection”, which contains all the order details from the SharePoint list named “OrderDets”.

Note: The “StartsWith” function is not delegable when a variable is passed as the second argument, which is why I am using multiple “Collect” statements with literal prefixes to iterate over all possible values.

Galleries to create dynamic entries in a Data Source in PowerApps 1

  1. This icon is the Home Page icon; clicking it navigates the user to the home screen
  2. This icon is the Order Details Screen icon; clicking it navigates the user to the Order Details screen
  3. This icon is the Edit an Item icon; clicking it allows the user to edit a particular item
  4. This icon is the Refresh icon; clicking it refreshes the data source. The expression used here is:

Refresh(OrderDets);
Clear(OrderCollection);
Collect(OrderCollection, Filter(OrderDets, StartsWith(Title, "Order1")));
Collect(OrderCollection, Filter(OrderDets, StartsWith(Title, "Order2")));
Collect(OrderCollection, Filter(OrderDets, StartsWith(Title, "Order3")));
Collect(OrderCollection, Filter(OrderDets, StartsWith(Title, "Order4")));
Collect(OrderCollection, Filter(OrderDets, StartsWith(Title, "Order5")));
Collect(OrderCollection, Filter(OrderDets, StartsWith(Title, "Order6")));
Collect(OrderCollection, Filter(OrderDets, StartsWith(Title, "Order7")));
Collect(OrderCollection, Filter(OrderDets, StartsWith(Title, "Order8")));
Collect(OrderCollection, Filter(OrderDets, StartsWith(Title, "Order9")))

Explanation: This refreshes the data source (“OrderDets” SharePoint List). It also clears the existing data from the “OrderCollection” collection and refills it with the new data.

  5. This is a gallery control that populates all the items of the SharePoint list
    • The expression used in the “Text” property of the “Order Details” label is:

"Order Details Total Orders Count:"&CountRows(OrderCollection)

Explanation: This expression concatenates the simple text (wrapped in “”) with the integer returned as a result of the “CountRows” function applied on the “OrderCollection” collection.

    • The expression used in the “Items” property of the Gallery is:

Sort(
    Filter(OrderCollection,
        If(!IsBlank(TextInput3.Text),
            StartsWith(Title, TextInput3.Text) || StartsWith(Customer, TextInput3.Text),
            true)),
    Value(Last(Split(Title, "r")).Result),
    Descending)

Explanation: This filters “OrderCollection” by the search text (matching the start of either Title or Customer) and sorts the result in descending order by the numeric part of the order number (the digits after the last “r” in “Order”).

  6. This is a stack of text labels used to show information when an item is selected in the “Order Details” gallery. Expressions used on the labels:

Customer: Gallery5.Selected.Customer
Shipping Address: Gallery5.Selected.'Shipping Address'
Billing Address: Gallery5.Selected.'Billing Address'

Explanation: Each line is an individual expression that fetches the corresponding attribute of the item selected in the “Order Details” gallery (Gallery5).

  7. This is a button control that enables the gallery control where the user creates dynamic lines and enters the order details. The expression used on this button is:

ClearCollect(DynamicGallery, {Value: 1});
Set(NewOrder, true);
Set(ResetGallery, false);
Set(ResetGallery, true)

Explanation: Here I am recreating the “DynamicGallery” collection to hold just one value, corresponding to one row of the newly visible dynamic gallery control. I am also setting two variables, “NewOrder” and “ResetGallery”, that control the visibility/reset of this dynamic gallery control, the “Total Number of New Orders” label, and the “Finish & Save” button. Toggling “ResetGallery” to false and back to true forces the gallery to reset.

Use Galleries to create dynamic entries in a Data Source in PowerApps 2

  8. This is the dynamic gallery control that I customized for user inputs. This gallery control has four text input controls to get the values for each of the attributes of the SharePoint list. The user can create multiple lines (one at a time) to add multiple records in one go. Data from each line is directly patched to the data source to create a new item. The user can remove a line by clicking the “X” icon. Configuration of the elements of the gallery control:

Gallery Properties:
Items: DynamicGallery
Visible: NewOrder

Explanation: “DynamicGallery” is the collection that holds one record for each order row to be added. “NewOrder” is a variable used to set the visibility of the controls.

  9. This icon removes the current row from the dynamic gallery. The expression used on this is:

Icon Properties:
OnSelect: Remove(DynamicGallery, ThisItem)
Visible: If(ThisItem.Value <> Last(Sort(DynamicGallery, Value, Ascending)).Value, true, false)

Explanation: We remove the current item from the gallery via this icon’s “OnSelect” property. The icon’s visibility is set so that it shows up only if the current row is not the last item of the dynamic gallery.

  10. This icon adds a new row/line to the dynamic gallery. The expression used on this is:

Icon Properties:
OnSelect: Collect(DynamicGallery, {Value: ThisItem.Value + 1})
Visible: If(ThisItem.Value = Last(Sort(DynamicGallery, Value, Ascending)).Value, true, false)

Explanation: We add a row/line by adding an item to the dynamic gallery collection via this icon’s “OnSelect” property. The icon’s visibility is set so that it shows up only on the last item of the dynamic gallery.

  11. This button performs the patch action on the “OrderDets” SharePoint list, patching all the entries made by User A in the dynamic gallery. The expression used in this control is:

ForAll(
    Gallery3_1.AllItems,
    Concurrent(
        Patch(OrderDets, Defaults(OrderDets),
            {Title: TextInput2_6.Text, Customer: TextInput2_7.Text,
             'Shipping Address': TextInput2_4.Text, 'Billing Address': TextInput2_5.Text}),
        Patch(OrderCollection, Defaults(OrderCollection),
            {Title: TextInput2_6.Text, Customer: TextInput2_7.Text,
             'Shipping Address': TextInput2_4.Text, 'Billing Address': TextInput2_5.Text})));
ClearCollect(DynamicGallery,{Value:1});
Set(NewOrder,false);
Refresh(OrderDets)

Explanation: In this control, the Concurrent function executes two Patch commands, one against the data source (the “OrderDets” SharePoint list) and the other against the local collection (“OrderCollection”), based on User A’s inputs in each line/row of the dynamic gallery. The “DynamicGallery” collection is then reset to hold a single value, the variable “NewOrder” is set to “false” to toggle the visibility of the dynamic gallery, and we finally refresh the data source.

Note: We are doing a concurrent patch instead of refreshing and recollecting the data in the collection “OrderCollection” from the data source to optimize the operations in the app.

  12. This is the text label control that displays the total number of lines/rows User A has currently created. The expression used here is:

"Total Number of New Orders: "& CountRows(Gallery3_1.AllItems)

Explanation: Here the text “Total Number of New Orders: ” is concatenated with the number of rows in the dynamic gallery.

Use Galleries to create dynamic entries in a Data Source in PowerApps 3

  13. This is the text input control of the dynamic gallery. Here I validate the text input, checking against the “OrderCollection” collection whether the entered order number already exists. If it does, the user gets an error notification. The expression used in the “OnChange” property of this control is:

If(TextInput2_6.Text in OrderCollection.Title,Notify("Order Number already exists!",NotificationType.Error))

Explanation: Here the If condition checks whether the text of the text input control exists in the “Title” column of the “OrderCollection” collection and pops an error message if it does.

  14. This is the error notification generated when the user enters an existing order number in the text input.

Use Galleries to create dynamic entries in a Data Source in PowerApps 4

  15. This is the App settings page of the app, where we increase the soft limit on the data row limit for non-delegable queries from 500 to 2000.

In this article, I have shown a basic implementation of the dynamic galleries concept and the multiple items/records patch function for a SharePoint data source. This can be replicated, with minor changes to the expressions, for other data sources such as CDS, Excel, SQL, etc.

I hope you found this interesting and it helped you. Thank you for reading!

AIS is the 2019 MSUS Partner Award Winner – Business Applications – Dynamics 365 for Sales. This is our vision for the Power Platform era.

I am incredibly excited to share that AIS has been announced as the 2019 MSUS Partner Award Winner – Business Applications – Dynamics 365 for Sales at #MSInspire!

Some background on how we won:

Story for MSUS Win Dynamics 365 Sales

When the National Football League Players Association (NFLPA) needed to score a big win for its members, they brought in the AIS team to build a single, shared player management system, called PA.NET. AIS extensively customized Dynamics 365 for Sales to meet the unique needs of NFLPA, integrated it with Office 365… and then took it all to the cloud with Microsoft Azure.

Using Dynamics 365 for Sales, PA.NET provides one master set of player data and powerful reporting tools. Now employees across the organization can turn to the same system to answer questions, uncover marketing and licensing opportunities, and identify other ways to help members. When a specific licensing request comes in, they can find the right person, or people, in minutes.

So where do we go from here? From Dynamics to Power Platform.

Our Business Applications & Automation Practice is investing heavily in Dynamics and the Power Platform. We recognize that an organization’s adoption of the Power Platform should be thought of as a journey, not a one-off “app of the moment” solution. By focusing on enterprise management and leveraging the Common Data Service (CDS) as much as possible, we help clients like NFLPA scale their adoption as they migrate workloads and make use of PowerApps, Power BI, Flow, and Dynamics 365.

Power Platform Technologies

Earlier this year, we worked with friends in the business applications community around the world to launch our Power Platform Adoption Framework. Mature organizations realize that rigor, discipline, and best practices are needed to adopt the platform at scale.

The Power Platform Adoption Framework is the start-to-finish approach for adopting the platform at scale.

It helps enterprise organizations:

  • Get to value quickly
  • Educate, train, and grow their community of developers and power users
  • Create durable partnerships between business, IT, and the user community
  • Continuously improve ROI on the platform by identifying and migrating new workloads
  • Blend agile, rapid app development with rigorous, disciplined enterprise management

I hope that the framework will continue to become a worldwide standard for enterprise-grade adoption of the Power Platform. I’ve been lucky to collaborate with Power Platform experts and users around the world to create the Power Platform Adoption Framework. I’m proud to say that AIS is fully behind the framework, sharing it with the community, and committed to its future development as best practices for scaled adoption evolve. We’re sharing it so that everyone can use it because we believe that a vibrant and thriving community around this technology is good for everyone who uses it.

Please join me in congratulating the AIS team, and please join us on this journey to scale the Power Platform to meet the challenges of the years to come.

A Single Place to Manage, Create, and Consume

Azure Monitor and OMS

The integration of the Operations Management Suite (OMS) into Azure Monitor is complete for both Azure Commercial and Azure Government. This change gives Azure Monitor/OMS users a single place to manage, create, and consume Azure monitoring solutions. No functionality has been removed, and documentation has been consolidated under the Azure Monitor documentation. The consolidation does bring some terminology changes that affect the way one talks about Azure Monitor components, but overall it simplifies the way you manage the monitoring of your Azure services.

Updated Terminology

Microsoft has updated some of the terminology for Azure Monitor components to reflect the transition from OMS. I have highlighted some examples:

  • The log data for Azure Monitor is still stored in a Log Analytics Workspace, but the term Log Analytics in the Microsoft documentation is now Azure Monitor Logs.
  • The term Log Analytics now applies to the page in the Azure portal used to write and run queries and analyze log data.
  • What were once known as OMS Management solutions have been renamed Monitoring solutions (items like Security & Compliance and Automation & Control).

Azure Monitor — Your One-Stop “Monitoring & Alerting” Shop

Azure Monitor is now pretty much the one-stop shop for your monitoring and alerting needs (the exception being Azure Security Center, which is still the place to go for most of your security and compliance needs).

Azure Monitor is broken out into four main categories in the Azure Portal:

  1. The main components of Azure Monitor
  2. Insights
  3. Settings
  4. Support + Troubleshooting

The main components include the Activity log, Alerts, Metrics, Logs, Service Health, and Workbooks.

Under Insights, there are Application, Virtual Machines, Containers, Network, and “…More”.

The Settings category includes Diagnostics settings and Autoscale.

And finally, under Support + Troubleshooting, there is Usage & estimated costs, Advisor recommendations, and New support request.

Check out the below table that provides an overview of the Azure Monitor Components and Descriptions:

Azure Monitor Component Description
Overview Overview of Azure Monitor
Activity Log Log data about the operations performed in Azure
Alerts Notifications based on conditions that are found in monitoring data both metrics and logs
Metrics (Metrics Explorer) Plotting charts, visually correlating trends, and investigating spikes and dips in metrics’ values.
Logs (Azure Monitor Logs) Useful for performing complex analysis across data from a variety of sources
Service Health Provides a personalized view of the health of the Azure services and regions you’re using
Workbooks Combine text, Analytics queries, Azure Metrics, and parameters into rich interactive reports.
Applications Application Performance Management service for web developers
Virtual Machines Analyzes the performance and health of your Windows and Linux VMs and monitors their processes and dependencies on other resources and external processes.
Containers Monitor the performance of container workloads deployed to either Azure Container Instances or managed Kubernetes clusters hosted on Azure Kubernetes Service (AKS).
Network Tools to monitor, diagnose, view metrics, and enable or disable logs for resources in an Azure virtual network.
More Replacement for the OMS Portal Dashboard.
Diagnostic Settings Configure the diagnostic settings for Azure resources (formerly known as Diagnostic Logs)
Autoscale Consolidated view of Azure resources that have Autoscale enabled
Usage and estimated costs Consumption and cost estimates of Azure Monitor
Advisor Recommendations Link to Azure Advisor
New support requests Create a support request
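
Everything in the table is also reachable programmatically. As a hedged illustration, here is a minimal PowerShell sketch that runs a Kusto query against Azure Monitor Logs; it assumes the Az.OperationalInsights module, and the resource group and workspace names are hypothetical placeholders.

# Minimal sketch: assumes the Az.OperationalInsights module and an authenticated session.
# Resource group and workspace names are hypothetical placeholders.
$ws  = Get-AzOperationalInsightsWorkspace -ResourceGroupName "rg-monitor" -Name "law-demo"
$kql = "Heartbeat | summarize LastSeen = max(TimeGenerated) by Computer | top 10 by LastSeen desc"

(Invoke-AzOperationalInsightsQuery -Workspace $ws -Query $kql).Results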

Passing just about anything from PowerApps to Flow with the newly released JSON function

In this article, I will show you how we can send data from a canvas app using the freshly released JSON function. I will pass data from a data table (of a SharePoint list), the microphone control (audio recordings), and the camera control (photos) to an MS Flow. Condition logic is set up in the Flow to check the file type and create the files accordingly in a dedicated SharePoint library.

This article focuses on a canvas app and a flow. We will look at the component-wise structuring of both the app and the flow to achieve the objective.

Canvas App

Let’s look at the control-wise screens and functions used in the Canvas App.

  1. Data from a SharePoint list is displayed in a gallery control in the app. A user can export this data to a PDF file, save it to a SharePoint document library, and download it in the browser window.

Gallery Control

Here, we have a Gallery (‘Gallery2’) control that is populated with the data from a SharePoint List. The data is filtered to show only the first 10 records. The expression used on the ‘Items’ property of the Gallery control is:

FirstN(ShowColumns(OrderDets,"Title","Customer","ShippingAddress","BillingAddress"),10)

Explanation: Get the first 10 items from the ‘OrderDets’ SharePoint list and get the columns as specified.

The ‘Create PDF’ button creates a local collection and then triggers an MS Flow and passes the collection as an argument along with the desired file name using the JSON function. Finally, once the PDF is created and the Flow is executed successfully, the PDF file is opened in a new tab of the browser. The expression used on this button is:

ClearCollect(PDFCollection, {Name: Concatenate("Test123", Text(Today()), ".pdf"), Url: JSON(ShowColumns(Gallery2.AllItems, "Title", "Customer", "ShippingAddress", "BillingAddress"))});
Launch(CreateFilesSharePoint.Run(JSON(PDFCollection, JSONFormat.IncludeBinaryData)).responsereturned)

Explanation: The ‘ClearCollect’ function creates a collection named ‘PDFCollection’ and this stores the data in the gallery control and the name of the PDF file. The name of the PDF file is a concatenated string with the naming convention of ‘Test123-today’s date.pdf’. The ‘URL’ key inside the ‘PDFCollection’ stores string type value for the table formatted Gallery items, using the JSON function. This value is later parsed as JSON while sending as an argument to the Flow. The ‘Launch’ function opens a new browser window to launch the newly created PDF file’s URL received as a response from the ‘CreateFilesSharePoint’ flow.

  2. The Microphone control on the app is used to record audio. Multiple recordings can be created and played/viewed in the gallery control.

Microphone Gallery Control

Here, we have a Microphone control ‘Microphone1’ to record the audio inputs and store that into a local collection ‘AudioCollection’. The Expression used on the ‘OnStop’ property of the Microphone control is:

Collect(AudioCollection, {Name: Concatenate("Audio", Text(Today()), Text(CountRows(AudioCollection)), ".mp3"), Url: Microphone1.Audio})

Explanation: The ‘Collect’ function updates a collection ‘AudioCollection’ to store the audio recordings with the unique file name. The filename is a concatenated string of ‘Audio-Today’s date-index of the audio file.mp3’.

The ‘Submit’ button triggers the Flow and creates all the audio recordings as separate files on the SharePoint document library. The Expression used on this button is:

CreateFilesSharePoint.Run(JSON(AudioCollection,JSONFormat.IncludeBinaryData))

Explanation: Here the JSON function converts the audio file URL to binary data and sends the ‘AudioCollection’ data to the ‘CreateFilesSharePoint’ flow.

The ‘Clear’ button clears data from the ‘AudioCollection’.

  3. The camera control is used to take photos in the canvas app. Multiple pictures can be captured and viewed in the gallery control.

Camera Gallery Control

Here, we have a camera control ‘Camera1’ to capture a picture and store it into a local collection ‘ImageCollection’. The Expression used on the ‘OnSelect’ property of the Camera control is:

Collect(ImageCollection,{Name:Concatenate("Image",Text(Today()),"-",Text(CountRows(ImageCollection)),".jpg"),Url:Camera1.Photo})

Explanation: Collect function updates the ‘ImageCollection’ collection with the unique file name and the URL of the photo taken from the camera control. The name of the file is a concatenated string of ‘Image-Today’s Date-Index of the photo in the gallery control.jpg’.

The ‘Submit’ button triggers the Flow and creates all the images as separate files on the SharePoint document library. The Expression used on this button is:

CreateFilesSharePoint.Run(JSON(ImageCollection,JSONFormat.IncludeBinaryData))

Explanation: Here, the JSON function converts the image file URL to binary data and sends the ‘ImageCollection’ data to the ‘CreateFilesSharePoint’ flow.

The ‘Clear’ button clears data from the ‘ImageCollection’.

MS Flow

Coming to the ‘CreateFilesSharePoint’ flow: This flow is triggered by the button controls on the different screens in the Canvas App.

Action 1: Initialise a variable -> accommodates the input coming from the canvas app.

Action 2: Initialise a variable (2) -> holds the string used to send a response back to the canvas app.

Action 3: Parse JSON: Get the dynamic data by parsing the data received from the canvas app against a schema in which an array contains objects with the attributes ‘Name’ (file name) and ‘Url’ (file content).
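
A minimal sketch of what that Parse JSON schema could look like, assuming the two attributes described above (the property names must match the keys sent from the canvas app):

{
    "type": "array",
    "items": {
        "type": "object",
        "properties": {
            "Name": { "type": "string" },
            "Url":  { "type": "string" }
        },
        "required": [ "Name", "Url" ]
    }
}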

Flow 1

Action 4: Apply to Each control: Iterate over each file item from the body output of the Parse JSON function.

Action 5: Condition control within the Apply to each control: Split the file name and check whether the extension is a PDF file (a hedged sketch of the expression follows).
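
As a rough sketch, assuming the action names above (the “Apply to each” name is a placeholder), the condition could test the extension with a Flow expression along these lines:

equals(last(split(items('Apply_to_each')?['Name'], '.')), 'pdf')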

If No:

Action 6: Create File 2 in SharePoint: Create a file for the image/audio type in the defined library.

If Yes:

Action 7: Parse JSON 2: The data content passed from PowerApps as the ‘Url’ key is now parsed into individual elements to create an HTML table and, finally, a PDF file out of it.

Action 8: Create HTML Table: Creates an HTML table with the column names as headers and gets the data from the Parse JSON 2 action.

HTML Table from Parse JSON

Action 9: Create File in OneDrive: Create a temporary HTML file from the HTML table generated in the previous step and store it in the ‘Hello’ folder on OneDrive.

Action 10: Convert File in OneDrive: To convert the previously created HTML file to a PDF document.

Action 11: Create File 2 in SharePoint: To create the PDF file from the converted file from the previous action. The file is stored in the specified document library on SharePoint.

Action 12: Delete File from OneDrive: To delete the temporary HTML file that was created in Action 9.

Action 13: Get file Properties SharePoint: To get the URL of the PDF file created in SharePoint.

Action 14: Set Variable: Set the URL to the file as a string value.

Create and Transform Files

Action 15: Respond to PowerApps: Send the URL of the file created on SharePoint to PowerApps. (Outside of the apply to each control)

Respond to PowerApps

In this blog, we have seen how we can use the JSON function to pass data from PowerApps to Flow. We were able to successfully send binary data (image files, audio recordings) and a gallery data table. We can also send collections, data directly from data sources with appropriate filters, etc. Note that the JSON function does not support sending attachments or nested arrays/objects.

I hope you found this interesting and this helped you. Thank you for reading!

Most of the time, deploying database scripts is tricky, time-consuming, and error-prone — specifically when a script fails due to mismatched schema, missing prerequisite data, dependencies, or any other factor.

Thankfully, different tools can automate and simplify the process…one of which is SQL Change Automation from Red Gate.

What is SQL Change Automation?

Put simply, SQL Change Automation (SCA) allows you to develop and deploy changes to a SQL Server database. It automates validation and testing, which can be performed on build and release management systems such as Azure DevOps, TeamCity, Octopus Deploy, Bamboo, and Jenkins.
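
For a flavor of what that automation looks like outside the GUI, here is a minimal, hedged sketch using SCA’s PowerShell cmdlets; the project path, package name, and server details are hypothetical placeholders.

# Minimal sketch: assumes the SQL Change Automation PowerShell module is installed.
# Project path, package name, and server/database names are hypothetical placeholders.
$project   = "C:\Projects\MyScaProject"
$validated = Invoke-DatabaseBuild $project                      # validate the project

$buildArtifact = New-DatabaseBuildArtifact $validated -PackageId "MyDatabase" -PackageVersion "1.0.0"

$target  = New-DatabaseConnection -ServerInstance "MyServer" -Database "MyDatabase"
$release = New-DatabaseReleaseArtifact -Source $buildArtifact -Target $target

Use-DatabaseReleaseArtifact $release -DeployTo $target          # deploy to the target

The Azure DevOps build and release tasks described below wrap essentially this same build-then-release flow.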

Installation & Required Tools

  • Download SQL Toolbelt
  • Run the .exe and select only “SQL Change Automation 3.0” and “SQL Change Automation PowerShell 3.1”
  • Visual Studio 2015/2017
  • Azure DevOps

Automated Deployment with SCA & ADO CI/CD

Create an SCA Project

  1. First, create a new SQL Change Automation project by clicking the Create Project button from the SQL Change Automation menu under Tools.
  2. Select the Development (source) and Deployment Target Databases. SCA will detect the differences and create a baseline script.
    Note: This baseline script is created from the selected Target database
  3. The next step is to identify the source database changes that need to be scripted and deployed to the target database. For this, click on the Refresh button in the SQL Change Automation tab. It will list the database objects which are different from the target database.
  4. After selecting the required objects, click the Import and Generate Scripts button. It will automatically generate all the required scripts.
  5. Now go ahead and build the solution!

Setup Git Repository in DevOps

  1. Create a new repository and upload the SCA project.
  2. Create a new Feature branch and commit the changes to this repo. Raise a pull request and assign the reviewer.
  3. Once the pull request is approved and marked complete, the changes will automatically merge to the master branch.

Setup CI/CD Pipeline in DevOps

  1. Create a new Build pipeline and select the repo.
  2. Add a new build task Redgate SQL Change Automation: Build and configure it.
    Note: This extension must be first installed into your Azure DevOps organization before using it as a task in the build flow
  3. Save the pipeline and queue a new build.
  4. Next, set up a Release pipeline: create a new Release and select the Build artifact as the input.
  5. Add a new Release task Redgate SQL Change Automation: Release and specify the configuration details like operation type, build package path, target SQL instance, database name, and credentials.
    Note: This extension must be first installed into your Azure DevOps organization before using it as a task in the release flow
  6. Save the Release pipeline and trigger a new Release.
  7. Once the Release is successful, connect to the target database and verify that the new database objects are deployed (a quick check is sketched below).
    Note: The target database server can be in Azure or on-premises.
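
A minimal, hedged way to verify from PowerShell, using the [__MigrationLog] table described under Key Terms below; it assumes the SqlServer module, and the server and database names are hypothetical placeholders.

# Minimal sketch: assumes the SqlServer PowerShell module.
# Server and database names are hypothetical placeholders.
Invoke-Sqlcmd -ServerInstance "MyServer" -Database "MyDatabase" `
              -Query "SELECT TOP 10 * FROM dbo.[__MigrationLog];"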

Here is a short video on how to configure SCA and integrate it with an Azure DevOps CI/CD pipeline.

Automated Rollback Using SCA & ADO CI/CD

Rolling back a database deployment is a complicated task. Application code, by contrast, is rather easy to roll back — just deploy the previous version of the code package and you are done. But databases are not as flexible. Imagine there’s an error in a script and all usernames get deleted. There isn’t a good way to roll that back! Sure, a backup could be restored. But when was that backup taken? Have any new users been added to the system since that backup? What data will be lost if the backup is restored?

The process needs to be thought through right before the deployments to ensure an effective rollback process. The steps below walk through a simple example of how a rollback can be applied in an automated manner using SCA with CI/CD.

  1. First, create a new folder in your SCA solution and name it Rollback. Add your rollback scripts to this folder.
  2. While creating migration scripts (i.e., “up” scripts), also create corresponding “down” scripts. To create rollback scripts, right-click the database object and select the View Revert Script option.
  3. Save the script in a new file and save it under the Rollback folder.
    Note: This rollback script will not be executed as part of the deployment.
  4. If there are any issues post-deployment, copy this rollback script to the Migration folder. Insert the Metadata and save the script.
  5. Commit the script to GIT and complete the pull request.
  6. Queue a new Build and let the Release complete.
  7. Once successful, verify the changes.

Here is a short video on how to perform rollback with SCA generated scripts.

Key Terms:

  • Baseline: The schema of the Deployment Target will be read to create a baseline schema.
  • Shadow Database: SCA keeps the shadow database consistent with all the migration scripts currently in the project as needed, and uses it to verify scripts to detect problems in your code.
  • (Table) [__MigrationLog] keeps track of the migrations and Programmable Objects/additional scripts that have been executed against your database. (Additional executions of Programmable Objects/additional scripts will result in new rows being inserted.)
  • (View) [__MigrationLogCurrent] lists the latest version of each migration/Programmable Object/additional script to have been executed against the database.