Measuring What Matters: How to Capture the End-User Experience

Finding the right metric to report on end-user performance can be a challenge, given the menu of metrics collected by the many synthetic monitoring products on the market. In the past, two metrics dominated synthetic benchmarking: the time to download an entire page excluding the browser's pauses to execute JavaScript or other components, and the time to download an entire page including the time the browser stopped to load a plug-in or execute JavaScript. These were the accepted standards that legacy synthetic providers like Keynote and Dynatrace advocated. The problem with these metrics is that they convey little about what the user actually experienced on the page. Furthermore, arbitrary, user-configured page-completion rules, such as counting Internet Explorer ready states or waiting for two or three seconds of network inactivity, would mark a page as complete even though other components were still loading. When the wrong page-completion method is selected, measuring a single page can give you a false sense of speed.

Modern browsers brought the Navigation Timing specification, which allows synthetic and real-user measurement tools to track more than full page completion by extracting timing information through a JavaScript API. This new information gives insight into each stage of a web page load by tracking Document Object Model (DOM) events, including loadEventEnd, the point at which a page theoretically becomes interactive for the user.
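As a concrete illustration, here is a minimal sketch of reading these milestones in the browser using the standard Navigation Timing (Level 1) API; this is generic example code, not any vendor's agent:

```javascript
// Minimal sketch: read Navigation Timing milestones after the page loads.
window.addEventListener('load', function () {
  // Defer one tick so loadEventEnd has been recorded.
  setTimeout(function () {
    var t = performance.timing;
    var metrics = {
      // Network phases (durations in ms)
      dns: t.domainLookupEnd - t.domainLookupStart,
      tcpConnect: t.connectEnd - t.connectStart,
      timeToFirstByte: t.responseStart - t.requestStart,
      // DOM milestones, measured from the start of navigation
      domInteractive: t.domInteractive - t.navigationStart,
      domContentLoaded: t.domContentLoadedEventEnd - t.navigationStart,
      loadEventEnd: t.loadEventEnd - t.navigationStart
    };
    console.log('Navigation Timing metrics (ms):', metrics);
  }, 0);
});
```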

The problem with these metrics is that they still aren't indicative of the user's actual experience. Many modern sites make asynchronous calls after loadEventEnd that are critical to the rendering and functionality of the page; conversely, loadEventEnd also has to account for content that is still downloading and executing below the fold, which the user may never scroll to. The question remains: when did the user perceive the page to be complete?

WebPageTest has introduced a new metric designed to capture how quickly a page appears complete to the end-user, giving us a performance number for the time to render a page above the fold. Users' perception of speed is shaped by how quickly they see the content they are looking for. Based on an algorithm that analyzes how the screen renders while measurements are taken, an AppDynamics Browser Synthetic monitor can report the visually complete time for each page visited throughout a journey; the sketch below illustrates the general idea.
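To make the approach concrete, here is a simplified, hypothetical sketch (not WebPageTest's or AppDynamics' actual implementation): each above-the-fold screenshot in a filmstrip is reduced to a color histogram, and the visually complete time is taken as the first frame whose histogram matches the final frame. The frame format and the visuallyCompleteTime helper are assumptions for illustration.

```javascript
// Hypothetical sketch: estimate "visually complete" from a filmstrip of
// above-the-fold screenshots, each reduced to a color histogram.
function histogramDistance(a, b) {
  // Sum of absolute differences between two equal-length histograms.
  var d = 0;
  for (var i = 0; i < a.length; i++) {
    d += Math.abs(a[i] - b[i]);
  }
  return d;
}

// frames: [{ time: msSinceNavigation, histogram: number[] }, ...]
function visuallyCompleteTime(frames) {
  var first = frames[0].histogram;
  var last = frames[frames.length - 1].histogram;
  var total = histogramDistance(first, last) || 1; // avoid divide-by-zero
  for (var i = 0; i < frames.length; i++) {
    // Visual progress: how far this frame has moved toward the final state.
    var progress = 1 - histogramDistance(frames[i].histogram, last) / total;
    if (progress >= 1) {
      return frames[i].time; // first frame that matches the final render
    }
  }
  return frames[frames.length - 1].time;
}
```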

Let's look at an example comparing two retailers:

[Figure: side-by-side comparison of two retailers' above-the-fold rendering over time]

In this example, Retailer Two does a much better job of optimizing and presenting content to the end-user, gaining a performance edge over the competition. However, if we relied on traditional synthetic metrics like loadEventEnd and Fully Loaded, we would erroneously conclude that Retailer Two was delivering worse performance to its end-users and lagging behind the competition.

Browser synthetic tests afford us a deeper understanding of performance metrics because we can directly observe the behavior of each page load as it happens, including when the page appears fully loaded to the user. When using synthetic measurements as competitive benchmarks, it's important to focus on user-impacting metrics rather than legacy numbers that can mislead about the end-user's experience. The conversation needs to move from “fully loaded” and “time to loadEventEnd” to “What did the user actually experience above the fold?” When comparing one site against another, visually complete is the metric that truly reflects the end-user experience.

Diving Into AppDynamics Mobile APM & End User Monitoring

In the AppDynamics Spring 2014 release, we greatly improved our End User Monitoring solution and added Mobile Application Performance Monitoring for iOS and Android applications. Through our end-user experience dashboard, you can understand the experience of your users in real time, whether they reach you through browsers or mobile devices.

We greatly improved visibility in the geographic drill-down, breaking down requests by web vs. mobile and iOS vs. Android.

We added more granular client-side metrics with a new waterfall timing visualization for browser snapshots.
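The raw data behind such a waterfall is available from the browser's standard Resource Timing API. As a minimal sketch (generic example code, not the AppDynamics agent's implementation), each resource entry exposes when it started, how long it took, and what initiated it:

```javascript
// Minimal sketch: dump the timing data a waterfall chart is built from.
window.addEventListener('load', function () {
  performance.getEntriesByType('resource').forEach(function (entry) {
    console.log(
      entry.name,                              // resource URL
      'start: ' + Math.round(entry.startTime) + 'ms,',
      'duration: ' + Math.round(entry.duration) + 'ms,',
      'initiator: ' + entry.initiatorType      // script, img, css, xhr, ...
    );
  });
});
```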

We also added server-side correlation for browser snapshots, providing end-to-end visibility from the client side to the server side.

We enhanced our browser snapshots with a better drill-down for discovering JavaScript errors.
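Under the hood, client-side error capture generally hinges on the browser's global error event. The sketch below shows the general technique; the /beacon/js-errors endpoint is hypothetical, and the AppDynamics agent performs this capture automatically:

```javascript
// Minimal sketch: capture uncaught JavaScript errors for reporting.
window.addEventListener('error', function (event) {
  var report = {
    message: event.message,
    source: event.filename,
    line: event.lineno,
    column: event.colno,
    stack: event.error && event.error.stack,
    page: location.href,
    time: Date.now()
  };
  // A real agent beacons this to its collector; the endpoint here is
  // hypothetical.
  navigator.sendBeacon('/beacon/js-errors', JSON.stringify(report));
});
```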

We also expose all client-side metrics in the metric browser so you can easily track and correlate them over time.

AppDynamics added support for iOS and Android applications with complete end-user monitoring, crash reporting, network request snapshots, and device and user analytics.

By instrumenting network requests from your mobile apps, you gain end-to-end visibility from the device all the way to the server side with Network Request Metrics. With AppDynamics Mobile Snapshots and server-side correlation, you can tie network errors to the exceptions raised on the server side.
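The general idea behind this kind of correlation is to tag each outgoing request with an identifier that the server-side agent can log next to any exception. The sketch below uses a hypothetical X-Correlation-Id header and instrumentedFetch wrapper; it is not AppDynamics' actual wire protocol, which the agents handle automatically:

```javascript
// Minimal sketch: tag outgoing requests so client and server records can
// be joined later. The header name and wrapper are hypothetical.
function instrumentedFetch(url, options) {
  options = options || {};
  var correlationId =
    Date.now().toString(36) + '-' + Math.random().toString(36).slice(2);
  options.headers = Object.assign({}, options.headers, {
    'X-Correlation-Id': correlationId
  });
  var started = performance.now();
  return fetch(url, options).then(function (response) {
    console.log('request ' + correlationId + ' finished in ' +
                Math.round(performance.now() - started) + 'ms, status ' +
                response.status);
    return response;
  });
}
```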

With AppDynamics Mobile APM Crash Reporting, you can proactively detect and respond to application crashes, hangs, and failed network requests, and break down crash metrics by device, geography, app version, OS version, and more. iOS crash reports are fully symbolicated for complete visibility.

With User & Device Analytics, you can understand your users with breakdowns by device, connection, carrier, OS version, and app version.

Through the AppDynamics metric browser, you can track custom metrics and get complete visibility into the performance of your mobile apps.

Take five minutes to get complete visibility into the performance of your production applications with AppDynamics today.

Why did your Checkout Fail? AppDynamics knows

The reason for this blog is a real-life incident that one of our e-commerce customers shared with us this week. It's based on a use case pretty much anyone can relate to: the moment your checkout transaction spectacularly fails. You sit there, looking at a big fat error message, and think, “WTF, did my transaction complete or did the company steal my money?” A minute later you're walking a support team through exactly what happened: “I just clicked Checkout and got an error…honestly…I waited and never got a response.”

What's different in this story is that the support team had access to AppDynamics while they were talking to the customer on the phone…and the customer got to find out the real reason their checkout failed. How often does that happen? Never, until now. Here is the story as documented by the customer.