Automated testing is a great way to increase your test coverage, run repetitive tests faster, and improve overall software development efficiency. In fact, according to the State of Quality Report 2022, over half of the software testing teams surveyed said that they had implemented automation in testing, while 63% of respondents reported that they cut costs and saved time by implementing test automation.
However, despite the many benefits of automated audio and video testing, continuous updates and improvements to the app and/or the automation approach can cause tests to fail, delaying the gathering of test results. To avoid this, test automation maintenance is essential. This process is an integral part of test automation and ensures your tests stay up to date with changes in the device software, application, tools, or frameworks.
In this blog post we will look at test automation maintenance, specifically in audio and video testing, and go over the key challenges you might encounter when maintaining setups.
What is test maintenance?
As mentioned above, test maintenance plays a critical role in test automation. It is the process of keeping tests healthy and running after an update to the application or a change in the automation approach, typically by rewriting or updating the existing tests.
Test maintenance includes, but is not limited to, fixing newly occurring bugs in tests, improving the functionality/performance of existing solutions, and changing the solution to better suit the needs of the client.
Why is it needed in automated audio and video testing?
The main reason test maintenance matters in automated audio and video testing, and in automated testing in general, is change, both reactive and proactive. Test maintenance ensures that test results are gathered consistently and remain comparable even after changes or improvements have been made to the application under test or to part of the automation approach itself.
How to maintain automated audio and video test setups?
To properly maintain automated setups in audio and video testing, there are four main steps you need to follow:
- Create thorough documentation describing how each test setup should be configured (both software and hardware).
- Check gathered test results with a validation system—a manual, semi, or fully automated one—to flag outlier results for further investigation.
- Summarize the outlier results using either new or existing reports/graphs, grouping them by issue type.
- Adjust automation scripts and evaluation/recording tools based on any changes in the apps under test, requirements, or other needs.
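The validation step above can be as simple as an automated outlier check over gathered metrics. Here is a minimal sketch using interquartile-range (IQR) fences to flag suspicious quality scores for further investigation; the function names, the `score` field, and the sample data are all illustrative:

```python
def iqr_bounds(values, k=1.5):
    """Return (low, high) fences based on the interquartile range."""
    data = sorted(values)
    n = len(data)
    # Simple quartile picks; good enough for a sanity check,
    # not exact statistical interpolation.
    q1 = data[n // 4]
    q3 = data[(3 * n) // 4]
    iqr = q3 - q1
    return q1 - k * iqr, q3 + k * iqr

def flag_outliers(results):
    """Return the results whose score falls outside the IQR fences."""
    scores = [r["score"] for r in results]
    low, high = iqr_bounds(scores)
    return [r for r in results if not (low <= r["score"] <= high)]

# Example: quality scores from a batch of automated call tests
results = [{"run": i, "score": s} for i, s in enumerate(
    [4.1, 4.0, 4.2, 3.9, 4.1, 1.2, 4.0, 4.3])]
for r in flag_outliers(results):
    print(f"run {r['run']} flagged with score {r['score']}")
```

A flagged run is not automatically a failure; it is a prompt to check the setup, the app version, and the raw recordings before trusting or discarding the result.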
When validating and investigating gathered data, you may come to realize that an otherwise small update may require a bigger change in the overall process, and this can sometimes be unpredictable. Therefore, it’s hard to provide a bulletproof guide for test automation maintenance in audio and video testing. Some of the key challenges with maintenance are described in the next section.
What are some of the challenges in test maintenance?
- Changes in test application UI and behavior. Automation scripts must be updated when the app (or browser) under test changes a UI option or the selector/locator for an element. Even minor updates can cause indirect changes in app behavior, so test automation scripts should be reviewed and updated accordingly.
- Changes in test device software. Updates to the OS of the device can affect app and device behavior, which can also affect the testing approaches, as well as the tools used. This will vary from platform to platform.
- Changes in testing tools. Updates to automation tools like Appium/Selenium and their OS-specific drivers can prompt a change in test automation scripts. You may be required to update the test automation framework in order to use the latest testing tool features.
- Tools are no longer compatible. Tools might drop support for some functionality used in the automation code, so change is required in order to avoid using deprecated solutions. Some tool changes may even affect other tools, for example, in video testing, the newest web browser version may break compatibility with streaming software.
- Keeping results comparable. Customers may require you to run tests on multiple apps, comparing the main app with competitor apps. If a competitor then releases a new version with a fundamental change, the results for that app may no longer be valid. You and your team may need to adjust how tests are run for the main app and all other apps under test to keep results comparable.
- Requests from clients. Similarly, after seeing some deliverables, a client might ask you to do things differently, either for comparison or out of preference for a specific approach. For example, they may request changes to how tests are done, evaluated, or validated, or to which tools are used. In such cases, you will need to adjust your tests accordingly.
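One common way to soften the selector/locator problem from the first bullet is to keep all locators in a single registry, so a UI change means a one-line fix instead of edits across every test script. A framework-agnostic sketch; the element names and selectors below are invented for illustration:

```python
# Central locator registry: when the app's UI changes,
# only this table is edited, not every test script.
LOCATORS = {
    "join_call_button": ("id", "btn-join-call"),
    "mute_toggle": ("xpath", "//button[@aria-label='Mute']"),
    "volume_slider": ("css", ".call-controls input[type=range]"),
}

def locator(name):
    """Look up a locator by logical name; fail loudly on a stale name."""
    try:
        return LOCATORS[name]
    except KeyError:
        raise KeyError(f"Unknown element '{name}'; update the locator registry")

# Test scripts ask for the logical element, not the raw selector:
strategy, value = locator("join_call_button")
print(strategy, value)  # id btn-join-call
```

With Appium or Selenium, the returned strategy/value pair would be passed to the driver's find-element call; the point is that maintenance after a UI update is localized to one file.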
Challenges specific to audio and video testing
There are several test maintenance challenges you should be aware of that relate specifically to audio and video testing:
Physical device maintenance. Broken or worn-out audio/video capture devices, cables, or even the test devices themselves can degrade measurement quality, so you will need to detect and debug such issues. Even seemingly minor problems, such as a slightly misaligned video sender device or a slightly raised receiver volume, can have a noticeable impact on test results. You will also need to make sure there are no loose cables that could interfere with a signal or connection and therefore skew measurements.
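A basic pre-run capture sanity check can catch some of these hardware issues before a full test batch. The sketch below computes the RMS level of a captured audio buffer and flags captures that are suspiciously quiet (loose cable, muted source) or hot (raised volume, possible clipping); the thresholds are illustrative and would need tuning for a real setup:

```python
import math

def rms(samples):
    """Root-mean-square level of a buffer of float samples in [-1.0, 1.0]."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def check_capture(samples, quiet_threshold=0.01, hot_threshold=0.9):
    """Classify a capture as 'quiet', 'hot', or 'ok' before real tests run."""
    level = rms(samples)
    if level < quiet_threshold:
        return "quiet"   # possible loose cable or muted source
    if level > hot_threshold:
        return "hot"     # possible raised volume or clipping
    return "ok"

# A near-silent buffer should be flagged before the test batch starts
silence = [0.001] * 4800
print(check_capture(silence))  # quiet
```

Running a short reference tone through the capture chain and applying a check like this at the start of each batch turns silent hardware failures into explicit, early errors.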
Different network conditions. Audio and video quality testing often requires QA teams to test the application under different network conditions or limitations to see how it behaves under limited resources from the network side. Sometimes the behavior is different from usual—for example, new pop-ups may appear in the application, users may join the call with a big delay, or they might entirely disconnect from the call at inconsistent times during the test. In this case, extra investigative tests will need to be made and automation approaches will need to be adjusted. You should discuss how to handle these types of situations with your team, especially when gathering result metrics and figuring out how to present them.
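On Linux test hosts, network conditions are often imposed with traffic control (`tc` with the `netem` qdisc). The sketch below only builds the command line for a named profile rather than executing it; the profile names, their values, and the default interface are assumptions for illustration:

```python
# Hypothetical network profiles for audio/video test runs
PROFILES = {
    "good_wifi": {"delay_ms": 20, "loss_pct": 0.0},
    "congested": {"delay_ms": 150, "loss_pct": 2.0},
    "mobile_edge": {"delay_ms": 400, "loss_pct": 5.0},
}

def netem_command(profile_name, interface="eth0"):
    """Build a `tc netem` command for the given profile (not executed here)."""
    p = PROFILES[profile_name]
    cmd = ["tc", "qdisc", "add", "dev", interface, "root", "netem",
           "delay", f"{p['delay_ms']}ms"]
    if p["loss_pct"] > 0:
        cmd += ["loss", f"{p['loss_pct']}%"]
    return cmd

print(" ".join(netem_command("congested")))
# tc qdisc add dev eth0 root netem delay 150ms loss 2.0%
```

Keeping the profiles in a table like this also helps with maintenance: when a client asks for a new network scenario, you add a profile rather than editing each test script.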
Investigating outlier data. As with network metrics, performance metrics such as RAM, CPU, and GPU usage are often gathered in audio and video testing. Here, test maintenance takes the form of investigating outlier data after updates: running extra tests and diving deeper into the data to find the causes.
Test automation maintenance remains a fundamental part of software testing, and the effort required to keep automated tests up to date should not be underestimated, especially in audio and video quality testing. Many different metrics are gathered and evaluated, and multiple components (both physical and software) must integrate and work together correctly to produce accurate, useful data. With solid maintenance, validation, and documentation systems and protocols in place, however, it becomes much easier to maintain an existing audio/video automation setup, gather results, and detect issues, performance or network limits, and breaking points for specific apps.
Do you want to test the audio and video quality of your application and make sure your automated tests are effective? Work with experienced software quality engineers who will support you in your testing efforts and ensure you get actionable insights. Contact us with your project details and let’s schedule a call.