Starting a new audio and video project is always exciting. However, to ensure that the project is successful and fulfills the client's objectives, it is important to define the testing scope. The right testing scope provides excellent insight into the quality of the product without resorting to excessive testing with large teams over long periods of time. In this blog article we will discuss how the testing scope is defined, what information is needed, and how it helps estimate the test effort, and we will share some useful tips.
Requirement analysis and the role of stakeholders in defining the testing scope
A requirement analysis is key to understanding user expectations and how a product should work. This analysis serves as the foundation for creating test plans and defining the testing scope. The requirements usually come from stakeholders, which include development engineers, software testers, project managers, and other members of the client’s organization. It is important to note that all these stakeholders have influence on the project scope—some more, others less.
Most of the time, the requirements are in the form of discussions about the product and what the client wants to achieve with the testing activities. As a result, the main goal at this stage is to understand how to translate these discussions into specific project objectives, technical requirements, and testing activities.
Sometimes, instead of lengthy discussions, clients know exactly what they want to see at the end of the project. In such cases, test leads and managers use their experience to understand what type of audio and video testing will deliver expected results. Either way, good communication between stakeholders and QA teams is crucial when it comes to gathering additional information about the technical aspects of a product and defining clear, testable, and feasible requirements for the project.
Let’s look at a sample scenario. Say a client would like to test a simple audio and video calling application whose main selling point is the ability to work under extreme network conditions, and which supports calls with up to 4 participants. From this we can deduce that the application has two main audio and video functionalities: audio calls and video calls. It also becomes evident that the main testing procedure will be network testing, with a focus on the application’s ability to work under very poor or even extreme network conditions. Once the objective is understood, requirements can be defined. To keep the requirement description clear, the extreme network scenarios can be specified in more detail, which helps to better understand the test scenarios and the potential testing scope.
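One way to make such a requirement concrete is to enumerate the test scenarios as a grid of network conditions crossed with call types. The sketch below is illustrative only; the actual bandwidth and packet-loss values would come from the client's definition of "extreme":

```python
from itertools import product

# Hypothetical network-condition values for illustration; real limits
# would be agreed upon with the client.
bandwidths_kbps = [64, 128, 256]   # very low bandwidth caps
packet_loss_pct = [5, 20, 50]      # increasingly extreme packet loss
call_types = ["audio", "video"]    # the app's two main functionalities

# Cross the dimensions to enumerate candidate test scenarios.
scenarios = [
    {"call": c, "bandwidth_kbps": b, "loss_pct": l}
    for c, b, l in product(call_types, bandwidths_kbps, packet_loss_pct)
]

print(len(scenarios))  # 2 call types x 3 bandwidths x 3 loss levels = 18
```

Even this small grid already shows how quickly a single requirement fans out into many scenarios, which is exactly why the scope needs deliberate definition.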
Some of the most common requirements for audio and video testing include, but are not limited to:
- The behavior of the application under different network conditions
- Audio quality assessment on different devices and device types
- Video quality assessment on different devices and device types
- Quality assessment of different content types
- Automated audio and video testing
- Subjective quality evaluation
Based on these requirements, engineers can now discuss the possible test cases that would meet them and help reach conclusions about the product under test. However, understanding the requirements is only the beginning. There is a whole process that derives actual test plans and test cases from the often vague requirement descriptions.
Defining the testing scope and effort estimation
For requirements to become the actual testing scope, the technique most frequently used in audio and video projects is a work breakdown structure (WBS), which breaks one larger requirement into multiple smaller tasks or, in our case, test cases. The process can be repeated until the tests are small enough to track progress efficiently.
If we look at the example scenario we discussed earlier, we have one key requirement: observe the product’s quality under extreme network conditions. This can be broken down into observing audio call quality and video call quality under extreme network conditions, as these are the main functionalities of the application. Each of these can then be broken down into specific limitation test cases where the team would observe quality using metrics the client has agreed upon, such as video quality metrics, audio quality metrics, and delay. A simple example of a work breakdown structure can be seen below.
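A WBS like this can also be captured as a simple data structure, which makes it easy to count and track the leaf-level test cases. The breakdown below is a hypothetical sketch for the example scenario, not an exhaustive test plan:

```python
# A hypothetical WBS for the example scenario: the requirement at the
# top, the two main functionalities below it, and concrete test cases
# as the leaves.
wbs = {
    "Quality under extreme network conditions": {
        "Audio call quality": [
            "Audio quality metrics at 50% packet loss",
            "Audio delay at 64 kbps bandwidth limit",
        ],
        "Video call quality": [
            "Video quality metrics at 50% packet loss",
            "Video delay at 64 kbps bandwidth limit",
        ],
    }
}

def leaf_test_cases(node):
    """Walk the WBS tree and collect the leaf-level test cases."""
    if isinstance(node, list):
        return list(node)
    cases = []
    for child in node.values():
        cases.extend(leaf_test_cases(child))
    return cases

print(len(leaf_test_cases(wbs)))  # 4 trackable test cases
```

The leaves are the units whose progress gets tracked, so the breakdown stops once they are small enough to estimate and schedule individually.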
After this process is complete, the team has a clearer picture of what is expected, and by continuing it, at least part of the testing scope can be defined. However, to draw a full picture that meets the objectives of the project within a specific timeframe and budget, the engineers have to strike the right balance. The best testing scope is one that provides all the vital information needed for the end goal and can be completed within the given constraints. This is where the testing scope becomes interlinked with effort estimation, which takes into account the scope itself as well as all the project's other constraints.
Effort estimation can be done in various ways, from plugging numbers into a formula to drawing on years of experience to arrive at an approximation. Either way, to estimate the effort for audio and video projects as precisely as possible, there are key factors that need to be taken into account:
- The project scope itself. Evaluating the complexity of scenarios and their differences from the usual test cases can give insight into the additional effort that might be required. The more complex or innovative the tests are, the more effort and time might be needed to successfully fulfill the requirements.
- Type of testing. Different types of quality assessments in audio and video projects can have different timeframes that can directly impact the effort estimates. Using more objective approaches and automation for simpler test cases would take much less time than using an exploratory approach with manual testing techniques.
- Effort outside of test execution. The usual case for audio and video projects is that actual test filming and execution is only part of the whole testing process. To extract information and metrics from raw files, post-processing procedures are used which also take time and effort. The complexity of metrics also impacts the additional effort needed for test execution.
- Technical resources. Sometimes the range of technical resources is very broad, which naturally takes more time to test, and availability largely determines what can and cannot be done. Limited availability can add time to testing because new devices may need to be acquired, or the devices on hand may be older models that run slower than expected.
- Human resources. Team size may be limited, which impacts the time needed to finish all test-related tasks. Generally, team size is closely linked to time: a project with less time to complete its tasks will need more people to finish them on schedule. Things like holidays and time off can also get in the way of accurate estimation.
- Risk management. Each project has its own risks that need to be considered to ensure that they will not have a negative impact on the project. When defining the risks in the project, the impact is evaluated and the overall time buffer that would be needed is calculated. This buffer is also a part of the full project timeline.
Once all this information is gathered and understood, calculating the effort estimate becomes easier and more transparent. The best way to ground the estimate is to combine information from the work breakdown structure with estimates from archived projects, weighted by how accurate those turned out to be. Another effective input is the audio and video testing team’s past experience with similar tasks and tests. Their insight can give a realistic view of the effort and time needed to successfully complete a project.
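Putting the factors above together, a bottom-up estimate can be as simple as summing per-test-case hours from the WBS (including post-processing effort, which as noted often rivals execution time) and applying a risk buffer. All numbers below are hypothetical placeholders, not real project figures:

```python
# Hypothetical per-test-case estimates in hours, informed by the WBS
# and archived projects. Execution and post-processing are estimated
# separately because both take meaningful time in audio/video testing.
estimates = {
    "Audio calls under packet loss": {"execution": 6, "post_processing": 4},
    "Video calls under packet loss": {"execution": 8, "post_processing": 6},
    "Delay measurements":            {"execution": 4, "post_processing": 3},
}

RISK_BUFFER = 0.15  # assumed 15% buffer from the project's risk assessment

# Bottom-up sum of all phases across all test cases.
base_effort = sum(
    hours
    for phases in estimates.values()
    for hours in phases.values()
)
total_effort = base_effort * (1 + RISK_BUFFER)

print(base_effort)   # 31 hours before the buffer
print(total_effort)  # buffered total
```

The buffer percentage itself comes out of the risk management step: higher-impact or more likely risks translate into a larger multiplier.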
Important planning activities
Effort estimation is usually an approximate value, so there is a possibility that the overall calculation might not have been very precise. Fortunately, there are various activities that can ensure the success of the project. In our audio and video projects we always plan ahead and use techniques to add more security and confidence in the project’s success, such as dependency management, test prioritization, and scheduling.
Understanding dependencies can ensure that testing activities do not conflict with one another and that some of the possible blockers have been removed. The most common dependencies in audio and video projects are:
- Causal dependencies. These cannot be avoided because they are inherent to the nature of the work. In audio and video related tests, a causal dependency exists between, for example, filming tests and processing them: a test cannot be post-processed if it has not been filmed in the first place.
- Resource dependencies. Limited device availability and limited accounts for testing activities can create dependency in the testing process. With limited technical resources, the filming needs to be planned accordingly to ensure that there are no empty pauses in the process.
- Preferential dependencies. These are created to keep the quality of the process high. In audio and video projects, one of the main preferential dependencies is between test reporting and validation: results must be validated before they are reported further.
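Once the dependencies are written down, a valid execution order falls out of a topological sort. The sketch below models the causal and preferential dependencies from the list above for a single test case, using Python's standard-library `graphlib`:

```python
from graphlib import TopologicalSorter

# Hypothetical task graph for one test case: each key depends on the
# tasks in its set. Filming precedes post-processing (causal), and
# validation precedes reporting (preferential).
dependencies = {
    "post_processing": {"filming"},
    "validation": {"post_processing"},
    "reporting": {"validation"},
}

# static_order() yields tasks so that every task appears after
# everything it depends on.
order = list(TopologicalSorter(dependencies).static_order())
print(order)  # filming first, reporting last
```

With real projects the graph has many test cases sharing limited devices, so the same structure also exposes resource dependencies: tasks that are independent in the graph can run in parallel only if hardware is free.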
Test prioritization is one of the most common techniques used to ensure a project’s success. It makes sure that the most important tests are completed first, which means that if a risk event occurs and the full scope is not completed within the given time frame, the most important results can still be reported on time. Test prioritization usually takes into account the main objectives of the project, the overall requirements from the client’s side, and the team’s experience with similar tests. Experience provides extra insight into more complex or problematic cases.
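In its simplest form, prioritization is an ordering of the backlog by an agreed score. The scores below are invented for illustration; in practice they would reflect client objectives and past experience with similar tests:

```python
# Hypothetical test cases with priority scores; a higher score means
# the test matters more to the project's main objectives.
test_cases = [
    {"name": "Video quality at 50% packet loss", "priority": 3},
    {"name": "Audio delay at 64 kbps",           "priority": 1},
    {"name": "Audio quality at 50% packet loss", "priority": 2},
]

# Execute (and report) the highest-priority tests first, so the most
# important results survive any schedule cuts.
ordered = sorted(test_cases, key=lambda tc: tc["priority"], reverse=True)
print([tc["name"] for tc in ordered])
```

If a risk event forces the team to drop the tail of this list, the results that remain are still the ones the client cares about most.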
Lastly, creating and following schedules is also a key ingredient of a successful project. If the testing scope is defined with a work breakdown structure, monitoring progress and scheduling the tests can be done in a more detailed manner. Activity logs, Kanban boards, and Gantt charts are all effective ways to manage the execution of the testing scope in a timely manner.
With these techniques, anomalies can be detected much faster, ensuring that fewer resources are spent and actions are planned accordingly.
To define an efficient audio and video testing scope, all stakeholders have to be in agreement on what the main objectives and goals for the project are. Here at TestDevLab we have extensive experience with different types of requirements—some more common and others a bit more unique—and we have learned to adapt to new technologies and ensure that our testing activities provide all the important information about the product.
Nevertheless, testing scope alone cannot show the full picture of the project. Effort estimation along with good planning practices are key to the successful execution of the testing scope. By understanding project dependencies, risks, detailed requirements, and resources, the project will run more smoothly and deliver the best possible results.
Do you have an audio and video application that could use a bit of testing? We can help. Get in touch and let’s discuss your project.