ISTQB certification for performance tests

1 February 2024 | 6 min

Continuous development and certifications are an important part of our work in customer projects. I have completed the ISTQB certification for performance testing, and in this article I will outline the basics and show how performance testing activities are integrated into the ISTQB test process. The focus is on identifying and creating usage profiles and on developing load profiles from them. The responsiveness of the system under different loads is then evaluated by applying the appropriate performance test types. Finally, the analysis of the test results and the reporting are discussed, so that well-founded recommendations for optimizing the system can be derived.

Performance testing basics

The performance of a system is a key factor in giving users a positive experience. In the product quality model of the ISO 25010 standard, performance (performance efficiency) is a non-functional quality characteristic with three sub-characteristics:

  • Time behavior: The degree to which a component or system can perform its required functions within the required response times, processing times and throughput rates.
  • Resource utilization: The degree to which resources can be used according to the quantities and types defined in the requirements when a component or system performs its functions.
  • Capacity: The degree to which the maximum limits of a component or system parameter fulfill requirements.

How these sub-characteristics are prioritized depends on the identified risks and the needs of the various stakeholders. When analyzing the test results, further risk areas may be identified that then also need to be taken into account.
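
To make the sub-characteristics more tangible, here is a minimal Python sketch that measures time behavior (the response time of a single request) and samples resource utilization on the machine running the script; in a real performance test, the resources of the system under test would be monitored instead. The endpoint URL is a placeholder, and the use of the third-party packages requests and psutil is my own choice for illustration, not something prescribed by the ISTQB material.

```python
# Minimal sketch: measure time behavior and sample resource utilization for one request.
# Assumes the third-party packages `requests` and `psutil` are installed;
# the URL below is a placeholder, not a real endpoint.
import time

import psutil
import requests

URL = "https://example.com/api/health"  # placeholder endpoint

start = time.perf_counter()
response = requests.get(URL, timeout=10)
elapsed_ms = (time.perf_counter() - start) * 1000   # time behavior: response time in ms

cpu_percent = psutil.cpu_percent(interval=0.5)       # resource utilization: CPU
mem_percent = psutil.virtual_memory().percent        # resource utilization: memory

print(f"status={response.status_code} response_time={elapsed_ms:.1f} ms")
print(f"cpu={cpu_percent:.1f}% memory={mem_percent:.1f}%")
```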

Performance test activities

Performance test activities follow the phases of the ISTQB test process and are organized and carried out differently depending on the development model used. Regardless of the model, performance testing combines two fundamental approaches: static and dynamic testing. Static testing examines work products such as requirements, architecture and design documents without executing the software, whereas dynamic testing requires the software to be executed.

Static and dynamic performance test

Static test activities are often more important for performance testing than for functional testing, because many critical performance defects have their roots in the architecture and design of the system. If performance is a decisive factor for a system, this should also be a guiding principle of its architecture.

Dynamic test activities should start as early as possible in the software development cycle. Potential bottlenecks can already be identified at the component level, where resource utilization can also be evaluated. During system integration testing, performance tests are run against the complete system in an environment that is representative of production. Finally, acceptance testing validates that the system's performance meets the originally specified user requirements and acceptance criteria.

Identify and create usage profiles

Usage profiles define different patterns of interaction with an application. Each profile describes a repeatable, step-by-step path through the application for a specific way of using the system. Aggregating these usage profiles results in a load profile (generally referred to as a scenario).


First, the available data is identified: the different types of user personas and their roles are determined, and the generic tasks performed by these users or roles are assigned to them. Finally, the estimated number of users per role and task per unit of time is recorded. This information is particularly helpful later when creating the load profiles.
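
Such a usage profile can be expressed directly in a load testing tool. The article does not prescribe any tool; as an illustration, the following sketch uses Locust, a Python-based load testing tool, and assumes a hypothetical "shopper" role for a web shop. The endpoints and the task weights (which would be derived from the estimated usage numbers described above) are invented for the example.

```python
# Sketch of a usage profile for a hypothetical "shopper" role as a Locust user class.
# Task weights model the assumed relative frequency of each generic task.
from locust import HttpUser, between, task


class ShopperProfile(HttpUser):
    # Think time between two consecutive actions of a virtual user (1-5 seconds).
    wait_time = between(1, 5)

    @task(10)
    def browse_products(self):
        self.client.get("/products")  # hypothetical endpoint

    @task(5)
    def search(self):
        self.client.get("/search", params={"q": "keyboard"})  # hypothetical endpoint

    @task(1)
    def checkout(self):
        self.client.post("/checkout", json={"cart_id": "demo"})  # hypothetical endpoint
```

Such a file could be run, for example, with `locust -f shopper_profile.py --host https://staging.example.com`, where both the file name and the host are placeholders.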

The data required for this can be collected in various ways, for example through interviews or workshops with stakeholders such as product owners, sales managers and end users. Functional specifications and requirements can be searched for intended usage patterns, user types and their usage profiles. Another source of information is the evaluation of usage data and metrics from similar applications, whose monitoring data, logs and usage statistics can be analyzed.

Load profile

A load profile specifies the workload that a component or system under test could experience in production. The load profile consists of a certain number of instances (virtual users) that perform the actions of predefined usage profiles over a certain period of time. The following information is required to create a realistic and repeatable load profile:

  • Performance test objective (e.g. evaluation of system behavior under stress loads)
  • Usage profiles that accurately represent individual usage patterns
  • Quantity and time distribution with which the usage profiles are to be executed (see the sketch after this list)
    • Ramp-up: Continuously increasing load (e.g. adding one virtual user per minute)
    • Ramp-down: Continuously decreasing load
    • Stages: Instantaneous load changes (e.g. adding 100 virtual users every five minutes)
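
Staying with the Locust example from above, such a load profile can be sketched with a custom load shape consisting of a ramp-up, a steady phase and a ramp-down. The durations, user counts and spawn rates below are arbitrary example values, not recommendations.

```python
# Sketch of a load profile using Locust's LoadTestShape: ramp up to 100 virtual
# users, hold the load, then ramp back down. All numbers are example values.
from locust import LoadTestShape


class RampUpHoldRampDown(LoadTestShape):
    # (stage end time in seconds, target user count, spawn/stop rate per second)
    stages = [
        (120, 100, 1),   # ramp-up: start one additional virtual user per second until 100 run
        (1020, 100, 1),  # steady state: hold 100 virtual users for 15 minutes
        (1120, 0, 1),    # ramp-down: stop the virtual users again, roughly one per second
    ]

    def tick(self):
        run_time = self.get_run_time()
        for end_time, users, spawn_rate in self.stages:
            if run_time < end_time:
                return users, spawn_rate
        return None  # end the test after the last stage
```

Placed in the same locustfile as the user class above, this shape controls how many virtual users execute the usage profiles at any point in time.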

Performance test types

The load profile created in this way is then developed into a performance test. Performance testing is the generic term for any type of test that focuses on the responsiveness of a system or component under different loads. Several types of performance tests can be distinguished, and each of them can be applied in a specific project depending on the test objectives:

  • Load testing: Evaluates the behavior of the system under realistic, anticipated loads of concurrent users or transactions.
  • Stress testing: Evaluates the behavior of the system at or beyond the anticipated peak load, or with reduced availability of resources.
  • Scalability testing: Evaluates the system's ability to handle future growth in load, for example through additional users or larger data volumes.
  • Spike testing: Evaluates the system's ability to cope with sudden short bursts of load and to return to a steady state afterwards.
  • Endurance (soak) testing: Evaluates the stability of the system over a longer period of time, for example to detect memory leaks or other gradual resource exhaustion.
  • Concurrency testing: Evaluates the effects when specific actions are performed by many users at the same time.
  • Capacity testing: Determines how many users or transactions the system can support while still meeting its performance objectives.

Analysis of the results and reporting

Functional tests generally benefit from well-defined test oracles: the expected results are usually clearly specified and the test results can be interpreted unambiguously. For performance tests, in contrast, the performance requirements form the basis for analyzing the results. The metrics defined in advance determine what has to be measured during the test run. The measurement data collected from the tests is then compared with the objectives of the performance tests. Typically, the following data is analyzed:

  • Response times (e.g. minimum, maximum, average and percentile values)
  • Throughput (transactions or requests per unit of time)
  • Error rates and failed transactions
  • Resource utilization (CPU, memory, disk I/O and network bandwidth)
  • The number of concurrent virtual users over the course of the test

Graphical representations make it easier to review the data and recognize trends. To obtain meaningful metrics, the individual values should be put in relation to each other. For example, the response time metric becomes more precise when it is related to the time of day, the number of concurrent users or the amount of data being processed.

The results of the tests are first compared with the objectives of the performance tests. Once the behavior has been understood, conclusions can be drawn, resulting in a meaningful report including recommendations. For example, recommendations can be made for technical changes, such as reconfiguration of hardware, software or network infrastructure. Areas for further analysis and additional monitoring can also be identified.
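
As a minimal illustration of such a comparison, the following Python sketch aggregates response time samples into a percentile and an error rate and checks them against assumed objectives. The sample values and thresholds are invented for the example; in practice, they come from the test run and the performance requirements.

```python
# Sketch: compare measured data from a test run against assumed performance objectives.
# All sample values and thresholds are invented for illustration.
import math
from statistics import mean


def percentile(samples, pct):
    """Nearest-rank percentile of a list of samples."""
    ordered = sorted(samples)
    index = max(0, math.ceil(pct / 100 * len(ordered)) - 1)
    return ordered[index]


# Hypothetical measurements: response times in milliseconds and HTTP status codes.
response_times_ms = [180, 210, 195, 250, 2300, 205, 190, 220, 480, 200]
status_codes = [200, 200, 200, 200, 500, 200, 200, 200, 200, 200]

# Assumed objectives taken from the performance requirements.
TARGET_P95_MS = 500
TARGET_ERROR_RATE = 0.01

p95 = percentile(response_times_ms, 95)
error_rate = sum(code >= 500 for code in status_codes) / len(status_codes)

print(f"mean={mean(response_times_ms):.0f} ms, p95={p95:.0f} ms, error rate={error_rate:.1%}")
print("p95 objective:", "met" if p95 <= TARGET_P95_MS else "missed")
print("error rate objective:", "met" if error_rate <= TARGET_ERROR_RATE else "missed")
```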

An effective performance testing process is crucial to ensure that IT systems meet user requirements and expectations. If you are wondering how you can best integrate performance testing into your development process, don’t hesitate to get in touch with us.

About us

We are a powerhouse of IT specialists and support customers with digitalization. Our experts optimize modern workplace, DevOps, security, big data management and cloud solutions as well as end user support. We focus on long-term collaboration and promote the personal development of our employees. Together, we are building a future-proof powerhouse and supporting customers on their path to successful digitalization.

Contact

Do you have a request? Please contact us!

As your companion and powerhouse in the IT sector, we offer flexible and high-performance solutions.