Tuesday, April 16, 2019

Measuring Latency in Complex PowerApps

By Steve Endow

I have created a PowerApp that has several dependencies, and I am concerned about how latency will affect the delay a user experiences when using the PowerApp.

In order to understand the latency, I have designed my PowerApp-based solution to measure and display elapsed times for key operations that are performed when a user clicks a button.

Here's a video demonstrating the design and the underlying code:

When I say "Complex" PowerApp, I am referring to an application or solution that has multiple dependencies or performs complex operations that may result in a noticeable delay for the user.

When building a sample PowerApp recently that utilized Flow and Azure Functions, I was concerned about how long a process would take to complete, and whether the multiple steps in the process might introduce problematic latency.

In this example, I've created a PowerApp that calls a Flow, which in turn calls an Azure Function, which finally queries an Azure SQL Database.

How much latency is added when connecting several services?
It's great that the Power Platform and Azure let me easily and quickly build this serverless solution, but how much latency is added when we string together these four services?

To find out, I created a Passphrase Generator in PowerApps.

This application will generate a complex passphrase based on a few options, such as the number of words, the maximum word length, and whether I want to include numbers, symbols, or spaces.

Notice the numbers in the bottom right corner of the application:

The application displays the total time it took PowerApps to respond to the click of the "Generate" button, how long the underlying Flow took to run, and how long the Azure Function took to run.

Based on this example, it looks like the Azure Function was reasonably responsive at 300ms, Flow took a bit longer and added 600ms to the process, and PowerApps itself added an additional 900ms to the overall process.

I need to do a bit more research and testing to determine if my measurement techniques are sound and if they are producing accurate figures.  I'm fairly confident in the Azure Functions metric, as that uses a .NET Stopwatch component.  But there may be some opportunity for refinement with how I am capturing elapsed times in Flow and PowerApps.

So here's how I did it.


This is the function that is used on the Generate button in the PowerApp.

If (Symbols.Value, Set(symbolsValue, 1), Set(symbolsValue, 0));  
If (Spaces.Value, Set(spacesValue, 1), Set(spacesValue, 0));  
If (Misspell.Value, Set(misspellValue, 1), Set(misspellValue, 0));  
Set(startTime, Now());  
Set(PassphraseValue, PowerAppsGeneratePassphrase.Run(WordCount.Text, WordLength.Text, If (numbersValue = 1, MaxNumber.Text, "0"), symbolsValue, spacesValue, misspellValue));  
Set(endTime, Now());  
Set(Duration, endTime - startTime);  

Three lines are used to measure the elapsed time.

Set(startTime, Now());
<call Flow>
Set(endTime, Now());
Set(Duration, endTime - startTime);

With those variables populated, the application then displays the time in milliseconds with this formula:

"PowerApps:   " & Round(Duration * 24 * 60 * 60 * 1000, 0) & " ms"

This appears to capture the total time from button click to completion of the click event.
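The multipliers in the formula come from the fact that subtracting two PowerApps date/time values yields a duration measured in fractions of a day, so converting to milliseconds requires multiplying by 24 * 60 * 60 * 1000. As a sanity check, here is the same arithmetic in C#, using hypothetical timestamps purely to verify the conversion factor:

```csharp
using System;

// Two sample timestamps 1.5 seconds apart (hypothetical values,
// standing in for the startTime/endTime variables in the PowerApp).
var start = new DateTime(2019, 4, 16, 10, 0, 0);
var end = start.AddMilliseconds(1500);

// A .NET DateTime difference expressed in days, mirroring how
// PowerApps stores the Duration variable.
double durationInDays = (end - start).TotalDays;

// Same conversion as the PowerApps formula: days -> milliseconds.
double ms = Math.Round(durationInDays * 24 * 60 * 60 * 1000, 0);

Console.WriteLine(ms); // 1500
```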

This is the measurement I am least confident about.  I don't know exactly how PowerApps executes formulas, or whether it uses any asynchronous operations.  Based on my tests so far, the numbers produced by this method seem reasonable and consistent.  Even if they are not absolutely accurate, they appear to be representative of the time required to complete the process, and when compared to the elapsed times returned by the Azure Function and Flow, they offer a useful reference point for how long the PowerApp operation is taking.


To measure the run time of the Flow, I used three variables.

As soon as the Flow starts, I store utcNow() in a variable called flowStartTime.

At the end of the Flow, I initialize two more variables.

flowEndTime also stores utcNow().

flowElapsedTime uses a formula to calculate the difference between the start and end times.

div(sub(ticks(variables('flowEndTime')), ticks(variables('flowStartTime'))), 10000)

This converts the two times into ticks (100-nanosecond intervals), subtracts the start time from the end time, then divides by 10,000 to convert the ticks into milliseconds (since 10,000 ticks equal 1 millisecond).
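Flow's ticks() function uses the same 100-nanosecond tick unit as .NET, so the expression above can be sketched in C# to confirm the math (the timestamps here are hypothetical, standing in for flowStartTime and flowEndTime):

```csharp
using System;

// .NET confirms the conversion factor the Flow expression relies on:
// 10,000 ticks per millisecond.
Console.WriteLine(TimeSpan.TicksPerMillisecond); // 10000

// Hypothetical start/end times 642 ms apart, mirroring the
// flowStartTime and flowEndTime variables in the Flow.
var flowStartTime = DateTime.UtcNow;
var flowEndTime = flowStartTime.AddTicks(642 * TimeSpan.TicksPerMillisecond);

// Same calculation as the Flow expression:
// div(sub(ticks(end), ticks(start)), 10000)
long elapsedMs = (flowEndTime.Ticks - flowStartTime.Ticks) / 10000;
Console.WriteLine(elapsedMs); // 642
```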

The Flow then returns that value back to PowerApps as a variable called flowElapsedTime.

Azure Function

The last measurement comes from my Azure Function that actually builds the passphrase.  I always like to include elapsed time in my APIs, as it allows me to monitor the API performance and helps identify the source of performance problems.

Since the Azure Function is written in C#, it is able to utilize the .NET Stopwatch object to measure its start and end times.  The Azure Function returns its elapsed time to the Flow as part of the response body.

This is very easy to do in .NET.  Near the top of the Azure Function code, I start a Stopwatch.

And at the end, I stop the Stopwatch, then store the Elapsed time in milliseconds, and save it to my response object.
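The Stopwatch pattern described above looks roughly like this (a simplified sketch; the real function builds the passphrase and returns a full response object, and the response property name here is hypothetical):

```csharp
using System;
using System.Diagnostics;

// Near the top of the function: start the Stopwatch.
var stopwatch = Stopwatch.StartNew();

// ... passphrase generation work happens here ...

// At the end: stop the Stopwatch and capture the elapsed milliseconds.
stopwatch.Stop();
long elapsedMs = stopwatch.ElapsedMilliseconds;

// The elapsed time is then saved to the response object that is
// returned to the Flow, e.g.: response.ElapsedMS = elapsedMs;
Console.WriteLine($"Elapsed: {elapsedMs} ms");
```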

With these elapsed time metrics from Azure Functions, Flow, and PowerApps, I'm now able to get a feel for how long each step takes in the process of generating a passphrase in my application.

Competing with On-Premises Windows Applications

While a 2-second delay may be tolerable for a simple Passphrase Generator app in PowerApps, I consider it a rather long delay.  This test has made me wonder about the viability of using a PowerApp for any application that is even remotely sensitive to delay or latency.

For comparison, this is an add-on window for Microsoft Dynamics GP that I developed in .NET and WinForms.  Notice the elapsed times displayed in the bottom right corner.

This window can retrieve 200 records from a local SQL Server in 43 milliseconds, and display them in a grid in a total of 334 milliseconds.

Think about that:  200 records delivered to the user in 334 milliseconds.

Yes, it is a completely different architecture than a PowerApp solution, but I think it's something to keep in mind.  While I might love to create a neat PowerApp for a customer, I need to look at the components that will be required for the solution and consider the end user experience to determine if PowerApps will be fast enough.  And now I can measure the elapsed times and latency in PowerApps to make that assessment.

Steve Endow is a Microsoft MVP in Los Angeles.  He works with Dynamics GP, Dynamics 365 Business Central, SQL Server, .NET, Microsoft Flow, and PowerApps.

You can also find him on Twitter and YouTube.

