August 7, 2023

One of the biggest problems for the performance tester, or any non-functional tester, is that, unlike functional testing, there is rarely a clear set of non-functional criteria available to test against. As with any test, you need some metric or result that defines what a passed test looks like and what a failed test looks like, so where do you go for that type of information?

Both Service Level Agreements (also known as SLAs) and Non-Functional Requirements (also known as NFRs) can be used to provide criteria for assessing performance testing, but they are not the same thing, so it is worth understanding the difference and how each can be applied to validate performance testing.

SERVICE LEVEL AGREEMENTS

On one side, we have Service Level Agreements. An SLA is a formal agreement between a service provider, such as a software vendor, and a customer, which outlines the level of service the provider agrees to deliver.

SLAs typically focus on defining the acceptable performance criteria (expressed in metrics such as response time and/or system availability) for a service or application, and as such give the customer a clear indication of how the system should perform. An SLA constitutes a measurable requirement that can be tested and marked as passed or failed accordingly.

SLAs usually take two main forms.

a). System Availability - A typical SLA here could be 'the application is required to be available 23.5 hours per day, every day except Sundays.'

b). Response time - A typical SLA here could be 'all user transactions must respond to the user within 2 seconds.'

Performance testers generally focus on the response time type of SLA, as system availability cannot easily be verified within a performance test.
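
To show how a response-time SLA becomes a concrete pass/fail check, here is a minimal sketch that evaluates a set of measured response times against the 2-second figure from the example above. The timings, the percentile choice, and the nearest-rank calculation are illustrative assumptions rather than part of any real SLA.

```python
# A minimal sketch of treating a response-time SLA as a pass/fail check.
# The 2-second threshold mirrors the example SLA above; the sample timings
# and the nearest-rank percentile calculation are illustrative assumptions.

SLA_RESPONSE_TIME_SECONDS = 2.0  # hypothetical contractual threshold


def sla_verdict(response_times, percentile=95):
    """Return the measured percentile value and whether it meets the SLA."""
    sorted_times = sorted(response_times)
    # Nearest-rank method: index of the sample at the requested percentile
    rank = max(0, int(round(percentile / 100 * len(sorted_times))) - 1)
    measured = sorted_times[rank]
    return measured, measured <= SLA_RESPONSE_TIME_SECONDS


# Illustrative timings (in seconds) captured during a test run
timings = [0.8, 1.1, 1.4, 1.9, 2.3, 1.2, 1.0, 1.7]
measured, passed = sla_verdict(timings)
print(f"95th percentile: {measured:.2f}s -> {'PASS' if passed else 'FAIL'}")
```

In practice, a load testing tool would report these percentiles for you; the point is simply that the SLA gives you a concrete number to assert against.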

SLAs are defined from a business perspective and are usually contractual. They are meant to ensure that the service or application meets the customer's expectations in terms of performance, and failure to meet them can result in penalties, compensation, or other remedial measures. SLAs are also often set based on the end-user experience and the expectations of the customers or users of the service, and as such may differ from customer to customer.

NON-FUNCTIONAL REQUIREMENTS

On the other side, we have Non-Functional Requirements. NFRs are the characteristics or attributes that define how a system is required to behave, rather than relating to any specific functionality in the system. As such, non-functional requirements are usually not as easily derived as functional requirements and need careful consideration.

NFRs provide technical and design guidelines for achieving desired performance characteristics, covering aspects such as scalability, reliability, CPU and memory utilisation, response time, concurrency, and more. They also guide the technical implementation of performance-related aspects. For example, an NFR might specify that the system should handle a certain number of concurrent users without exceeding a given response time threshold or an allowable level of CPU and memory utilisation.
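
To illustrate how such an NFR can be expressed as an executable check, the sketch below fires a number of concurrent requests and compares the slowest response against a threshold. The URL, user count, and limits are hypothetical assumptions, not values taken from any real requirement.

```python
# A minimal sketch of turning an NFR ("N concurrent users, response time under
# X seconds") into an executable check. The URL, user count and threshold are
# illustrative assumptions only.
import concurrent.futures
import time
import urllib.request

TARGET_URL = "https://example.com/"   # hypothetical system under test
CONCURRENT_USERS = 25                 # NFR: concurrent users to simulate
MAX_RESPONSE_SECONDS = 2.0            # NFR: per-request response time threshold


def timed_request(_):
    """Issue one request and return its duration in seconds."""
    start = time.perf_counter()
    with urllib.request.urlopen(TARGET_URL, timeout=10) as response:
        response.read()
    return time.perf_counter() - start


if __name__ == "__main__":
    with concurrent.futures.ThreadPoolExecutor(max_workers=CONCURRENT_USERS) as pool:
        durations = list(pool.map(timed_request, range(CONCURRENT_USERS)))
    slowest = max(durations)
    print(f"Slowest of {CONCURRENT_USERS} concurrent requests: {slowest:.2f}s")
    print("NFR PASS" if slowest <= MAX_RESPONSE_SECONDS else "NFR FAIL")
```

A dedicated load testing tool would of course handle ramp-up, pacing, and reporting; the sketch just shows how an NFR turns into something that can pass or fail.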

SUMMARY

In summary, compared to SLAs, which are more customer-focused, NFRs are often more technically focused and guide the development and testing teams in designing and creating an application that meets the desired performance characteristics. They help developers and testers understand the performance expectations of the system and design it accordingly, without the contractual implications of SLAs.

The truth is that both SLAs and NFRs play crucial roles in performance testing, helping to ensure that the tested system meets the desired performance standards and user expectations.
