Key Challenges of Latency Testing in Applications

With the software market growing more and more competitive, functionality alone is no longer enough to gain a competitive edge. In fact, the software market is expected to grow at an annual rate of 5.42% between 2023 and 2028. To get a leg up on competitors, software companies and developers must therefore focus on the non-functional aspects of their applications, such as latency. Latency refers to the time it takes for an application to perform or respond to a certain action, such as sending a message or starting video playback, after the user has given the command to do so.

Latency is an important part of ensuring a great user experience, as users have high expectations, especially when it comes to audio and video quality in video conferencing solutions. Users expect the person on the call to respond immediately or the video to load as soon as they select it. With over 58% of organizations reporting that they use video conferencing solutions daily, and many of these solutions offering a wide range of features and capabilities, it is important to cater to users' needs or risk losing them. To ensure your software meets users' expectations for speed and responsiveness while also meeting the requirements described in its documentation, latency testing is essential. In fact, based on our recent survey published in the State of Software Quality Assurance 2023 report, video latency is the most common issue found in audio and video applications. However, latency testing is not always an easy task, and software testers often come face to face with various challenges.

This blog post will explore 7 key challenges of latency testing, with a focus on mobile application testing: latency in web applications can often be measured by reading page elements, but for mobile applications that approach is not reliable enough. For mobile applications, the main focus is the latency between selecting a video and the start of video playback.

Challenge #1: Automated and manual testing

Latency testing typically involves two approaches: test automation and manual testing. When automating latency testing, we need to validate the data to establish the credibility of our results, and that is where the tedious task of manually checking the results from the recording comes into play. For validation, we use a media player such as QuickTime or Elmedia Player to check the time it takes for the application to complete the desired action and compare it to the result of the automated processing script to see whether it falls within the specified deviation. For example, if the allowed deviation is 0.1 seconds, the difference between the manual check and the automated script result cannot exceed that value in either direction.
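
To make the tolerance check concrete, here is a minimal sketch in Python. The 0.1-second tolerance and the sample values are illustrative, and the helper function is hypothetical rather than part of any specific test framework.

```python
# Minimal sketch of validating automated latency results against manual spot checks.
# The 0.1 s tolerance and the sample values are illustrative; within_tolerance() is a
# hypothetical helper, not part of any specific test framework.

MAX_DEVIATION_S = 0.1  # allowed absolute difference between manual and automated result


def within_tolerance(manual_s: float, automated_s: float,
                     max_deviation_s: float = MAX_DEVIATION_S) -> bool:
    """Return True if the automated measurement agrees with the manual check."""
    return abs(manual_s - automated_s) <= max_deviation_s


# Example: latency measured manually in a media player vs. by the processing script.
checks = [
    ("cold_startup_run_01", 1.42, 1.38),
    ("cold_startup_run_02", 1.51, 1.63),  # deviates by 0.12 s and gets flagged for review
]

for name, manual_s, automated_s in checks:
    status = "OK" if within_tolerance(manual_s, automated_s) else "REVIEW"
    print(f"{name}: manual={manual_s:.2f}s automated={automated_s:.2f}s -> {status}")
```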

Once confidence in the validity of our test results has been established, validation can take a secondary role and can even be automated to catch failed cases that have slipped through the automated checks.

The main benefit of using automated tests for latency testing is the speed at which the tests are executed. However, you need to review and tweak your test scripts to ensure they remain credible. In manual testing, verification is part of the test process itself.

Challenge #2: Determining the correct time to select a video

One of the challenges of latency testing in mobile applications for video playback is determining when to select the video in order to get the most accurate result. This is a challenge because different applications use different animations, which makes it hard for processing scripts to determine the exact moment of selection and can lead to inaccurate results. Specifically, starting to record the test process at the moment the video is selected can lose milliseconds from the recording.

With the right approach to latency testing and measurements, this challenge can be overcome. We usually use one of two ways to decide when to select a video. First, the recording can be started as the video is selected, and the video start is then determined by the movement of the video. The problem with this approach, however, is that the start of the recording can be delayed, making the latency times unreliable. The second way is to observe when the device sends the command for the recording to start and for the video to be selected. Using this approach, we can determine how much needs to be trimmed from the test recording so that it starts at the exact time the video is selected. The problem we found with this approach is that the time between the device sending a command and the command being executed is too long for our latency testing purposes; our aim is to achieve results with millisecond precision, which is not possible this way.

To overcome this challenge and determine the correct time to select a video, we can use a script that looks for a piece of text visible on the screen only while no video is selected and starts the countdown for video startup latency the moment that text disappears. The script starts the recording, locates the reference text at the beginning of the recording, and then detects when the text disappears because the selected video has taken over the whole screen.
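
As a rough illustration, the sketch below scans a screen recording frame by frame and reports the first frame on which a reference text is no longer detected. The OCR approach (pytesseract), the reference string, and the file name are assumptions for the example rather than a description of our exact tooling.

```python
# Rough sketch: find the moment a reference text disappears from a screen recording,
# which we treat as the moment the video was selected. Requires OpenCV and pytesseract;
# the file name and the reference string are placeholders for the app under test.
import cv2
import pytesseract

RECORDING = "test_recording.mp4"   # screen recording of the test run (placeholder)
REFERENCE_TEXT = "Following"       # text visible only while no video is selected (placeholder)

cap = cv2.VideoCapture(RECORDING)
fps = cap.get(cv2.CAP_PROP_FPS)

frame_index = 0
selection_time_s = None
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    if REFERENCE_TEXT not in pytesseract.image_to_string(gray):
        # First frame without the reference text: the video now fills the screen.
        selection_time_s = frame_index / fps
        break
    frame_index += 1
cap.release()

if selection_time_s is not None:
    print(f"Reference text disappeared ~{selection_time_s:.3f} s into the recording")
else:
    print("Reference text was visible for the whole recording")
```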

It is also valuable to know that in latency testing of mobile applications for video playback, there is more to measure than just the moment the video starts to play. There can be stalls or dips in quality, and applications can perform much worse under different network conditions.

Challenge #3: Comparing different applications and platforms

Ensuring an apples-to-apples comparison between different applications can be a challenge when performing latency testing. This is because different applications have different user journeys, which makes it difficult to compare them accurately.

The same goes for comparing different platforms, such as Android and iOS, as they can have different application versions and, therefore, yield different results.

When presenting test analysis results, it is important to communicate the differences between operating systems and applications.

Challenge #4: Dealing with network limitations

The best way to test latency is on a controlled and restricted network, as application behavior can change drastically when the network is not ideal. We call these restrictions network limitations, because we limit the network speed to test different aspects of the application.

For example, with an internet speed of 1 megabit per second, it can take several seconds for the video to start, or the video may fail to load altogether. Video stalling can also become obvious, with the video stuttering as it loads. Application developers counter this issue by decreasing the video quality when they detect low network speeds.
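
For illustration, one common way to impose such a limitation on a Linux-based test host is the tc traffic control tool. The interface name, rate, and buffer values below are assumptions, and this is only a generic sketch, not a description of our own limiter.

```python
# Illustrative only: throttling a Linux network interface with tc to simulate a 1 Mbit/s
# link for a test run. The interface name, rate, and buffer values are placeholders, and
# the commands need root privileges. This is a generic sketch, not the Netembox limiter.
import subprocess

INTERFACE = "eth0"  # adjust to the interface your test devices are routed through


def limit_bandwidth(rate: str = "1mbit") -> None:
    """Apply a token-bucket filter so the devices under test run on a constrained link."""
    subprocess.run(
        ["tc", "qdisc", "add", "dev", INTERFACE, "root",
         "tbf", "rate", rate, "burst", "32kbit", "latency", "400ms"],
        check=True,
    )


def clear_limit() -> None:
    """Remove the limitation after the test run."""
    subprocess.run(["tc", "qdisc", "del", "dev", INTERFACE, "root"], check=True)
```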

Low network speeds affect not only the test recording but also the automated test process. For instance, elements can take too long to load or fail to load at all. In our experience with latency testing, we have also seen applications crash under stricter network limitations.

At TestDevLab, we recently developed the Netembox limiter, an access point that allows a limited network to be used in our test environments. This access point is used to test all sorts of devices and their performance under variable network conditions—which is a great solution to overcome limitations commonly experienced by QA engineers.

Challenge #5: Testing different test scenarios

To fully understand the difference between applications, it is important to consider different test scenarios and network limitations. Even applications that perform extremely well when the network is up to speed may struggle when network conditions are less than ideal, for instance, in crowded environments or when the application is used in the countryside.

You might be interested in: How We Test Applications in Motion: Introducing Our Mobile Laboratory

We have derived four main test scenarios that we use when testing video playback applications: cold startup, deep link, swipe, and upload.

Cold startup

This test scenario focuses on video playback when the app is either freshly installed or all data has been cleared. The scenario is fairly simple: open the app, then find the testing profile, and finally open the video required for testing.
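
On Android, the "all data has been cleared" starting point can be reproduced with standard adb commands, as in the sketch below. The package and activity names are placeholders for the application under test.

```python
# Illustrative cold-startup preparation on Android: clear the app's data so it behaves
# like a fresh install, then launch it via adb. Package and activity names are placeholders.
import subprocess

PACKAGE = "com.example.videoapp"        # hypothetical package of the app under test
ACTIVITY = f"{PACKAGE}/.MainActivity"   # hypothetical launch activity

subprocess.run(["adb", "shell", "pm", "clear", PACKAGE], check=True)         # wipe app data
subprocess.run(["adb", "shell", "am", "start", "-n", ACTIVITY], check=True)  # cold start
```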

Performing this test shows us the time applications need to open and start playing the video. We can also see how video quality differs under different network limitations, and specifically what applications do to speed up the startup process.

The main challenge that we see in this test scenario is the application preloading videos, either starting to load the first video when visiting the profile page or in some cases even downloading the last video watched when the application is launched. But avoiding these behaviors can impact the performance of the tested app.

Also, the performance and startup latency can vary greatly depending on the operating system of the device, in our case Android or iOS.

Deep link

Deep link is a way for us to test video applications while preventing them from using performance-enhancing tricks like preloading. It is a standard test used to understand the baseline performance of video startup, ensuring that no cold or warm video startup is worse.

A deep link means that the link is opened in the application, not on the web. This scenario is even simpler: the only difference from the cold startup scenario is that, after launching the application, a link to the video is opened in the app and the video is played. For Android this is fairly straightforward and is done by passing an adb command that opens the deep link. To conduct a similar test on iOS devices, a custom app can be used to pass the link to the application.
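
A minimal sketch of the Android side, wrapping the adb deep link launch in Python. The package name and video URL are hypothetical placeholders.

```python
# Minimal sketch: opening a video deep link on an Android device via adb and noting when
# the launch command returned. The package name and URL are hypothetical placeholders.
import subprocess
import time

PACKAGE = "com.example.videoapp"              # hypothetical package of the app under test
DEEP_LINK = "https://example.com/video/123"   # hypothetical deep link to the test video


def open_deep_link() -> float:
    """Open the deep link inside the app and return the timestamp after the launch call."""
    subprocess.run(
        ["adb", "shell", "am", "start", "-W",
         "-a", "android.intent.action.VIEW",
         "-d", DEEP_LINK, PACKAGE],
        check=True,
    )
    return time.monotonic()


if __name__ == "__main__":
    print(f"Deep link opened at {open_deep_link():.3f}")
```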

Swipe scenario

In swipe scenarios we test how well multiple videos start one after another by swiping to the next video after a set amount of time. Latency is recorded from the start of the swipe to the moment the next video starts to play. The number of videos in a test can differ depending on its needs, but we typically play 10 videos in a row to measure latency.
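
A rough sketch of how such a run could be driven on Android, issuing swipes over adb and logging when each swipe was sent so the recording can later be aligned with playback start. The coordinates, watch time, and video count are illustrative.

```python
# Rough sketch of a swipe run on Android: let each video play for a fixed time, swipe to
# the next one over adb, and log when each swipe was issued so the screen recording can
# later be aligned with playback start. Coordinates, timings, and video count are illustrative.
import subprocess
import time

VIDEOS_TO_PLAY = 10
WATCH_TIME_S = 5  # how long each video is left playing before swiping
SWIPE_CMD = ["adb", "shell", "input", "swipe", "540", "1600", "540", "400", "150"]

swipe_timestamps = []
for _ in range(VIDEOS_TO_PLAY - 1):
    time.sleep(WATCH_TIME_S)
    swipe_timestamps.append(time.monotonic())  # moment the swipe command is sent
    subprocess.run(SWIPE_CMD, check=True)

print(swipe_timestamps)
```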

Upload

For a good user experience, it is important to measure the time it takes for a video to be uploaded onto the application’s servers. The main challenge for this scenario is creating a clear way to determine when the upload has finished, as different applications can have very different ways of showing that the upload has been completed.

Challenge #6: Video and audio scrubbing

Scrubbing refers to the action of jumping forward during video or audio playback, and testing the latency of this feature can be challenging. To test it, we need to identify the moment the jump to a different time slot in the video is initiated and, as the second point, the moment the video resumes playing. With a precise, automated test process, we can calculate the duration of the jump by treating it as a pause in the overall recording of the video playback.
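
As an illustration of treating the jump as a pause, the sketch below estimates the pause interval in a playback recording using simple frame differencing. The motion threshold and file name are assumptions, and the sketch assumes a single scrub per recording.

```python
# Rough sketch: estimate scrubbing latency as the length of the "pause" in a playback
# recording, found with simple frame differencing. The threshold and file name are
# illustrative, and a single scrub per recording is assumed.
import cv2
import numpy as np

RECORDING = "scrub_recording.mp4"
MOTION_THRESHOLD = 2.0  # mean absolute pixel difference below which a frame counts as static

cap = cv2.VideoCapture(RECORDING)
fps = cap.get(cv2.CAP_PROP_FPS)

static_frames = []  # indices of frames with (almost) no motion
prev = None
index = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    if prev is not None and np.mean(cv2.absdiff(gray, prev)) < MOTION_THRESHOLD:
        static_frames.append(index)
    prev = gray
    index += 1
cap.release()

if static_frames:
    pause_s = (static_frames[-1] - static_frames[0] + 1) / fps
    print(f"Detected pause of ~{pause_s:.3f} s between scrub start and playback resuming")
```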

Scrubbing is also useful for validating test results, as it helps identify the exact time when the video frames start to move.

Challenge #7: Latency in messaging applications

Measuring latency in messaging applications is another common challenge we come across. To determine latency in messaging applications, we need to establish the time that elapses between two specific events. To calculate the time it takes for a message to fully load in the chat, we first measure the time between launching the app and the app being fully loaded, and then the time from tapping on the chat to the message appearing in it. This process aims to minimize the human effect on timing, such as clicks and other actions, which can vary between test engineers. Recording is done manually: testers go through all the events and record them on an external device, and the resulting recordings, called raw test videos, contain the full test flow, including the actions that depend on the tester.

In the analysis phase, each video is reviewed manually in a media player that allows frame-by-frame navigation and shows timestamps with a precision of 1 millisecond. The QA engineer then writes down the timestamps of the action start and end times. The precision of the measurement is 1 to 2 frames (around 13 to 26 milliseconds on average), and the elapsed time is then calculated automatically.
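
The elapsed-time calculation itself is trivial to automate once the timestamps are noted. A minimal sketch follows, assuming the timestamps are written down as minutes:seconds.milliseconds strings; the event names and values are illustrative.

```python
# Minimal sketch: deriving elapsed times from manually noted start/end timestamps.
# Timestamps are assumed to be written down as "minutes:seconds.milliseconds" strings
# read off the media player; the event names and values are illustrative.

def to_ms(timestamp: str) -> int:
    minutes, rest = timestamp.split(":")
    seconds, millis = rest.split(".")
    return (int(minutes) * 60 + int(seconds)) * 1000 + int(millis)


events = [
    ("app_launch_to_fully_loaded", "0:02.133", "0:03.487"),
    ("chat_open_to_message_visible", "0:05.920", "0:06.410"),
]

for name, start, end in events:
    print(f"{name}: {to_ms(end) - to_ms(start)} ms")
```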

One of the main challenges of testing latency in messaging applications is variability. Even for a single application there is no consistent timing, especially when the network is limited, so every scenario has to be repeated several times to establish the overall trend.

When performing competitor analysis, a similar challenge is present. Not every application works the same way, and even the same application can behave differently on different operating systems, which adds complexity to data analysis. It is very important to communicate these differences and nuances when comparing the data. Load times also depend on the device, network restrictions, app navigation, and other factors.

Key takeaways

In this article, we looked at the importance of latency testing and its challenges. We can conclude that latency testing is an effective way to assess the competitiveness and user experience of software, as it shows how quickly the software responds to user actions and events.

The main challenges QA engineers encounter when performing latency testing are related to network limitations and varying network conditions, different test scenarios, and ensuring an apples-to-apples comparison between different applications, as their user journeys can differ.

To overcome challenges related to network and user scenarios, QA engineers should test the mobile application under different network conditions and cover various testing scenarios.

Differences between comparable applications can be overcome with more thorough data analysis and by making sure the test endpoints for different competitors are as close in operation as possible.

Do you need help performing latency testing for your video conferencing, video-on-demand or messaging application? We can help. Contact us with your project details to learn how we can help you get one step ahead of your competitors.
