July 1, 2020

Here are our top ten tips for avoiding common pitfalls when automating performance scripts:


Always validate that the automation is at the point in the application where it should be. If it is not, attempt to recover the automation back to its starting point.
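A minimal sketch of this validate-then-recover guard in Python; the helper names and the page check are illustrative stand-ins for whatever checks your tool provides (e.g. a text or title verification):

```python
def on_start_page(state):
    """Stand-in for a real check, e.g. verifying the page title or URL."""
    return state.get("page") == "home"

def recover_to_start(state):
    """Stand-in for real recovery steps, e.g. re-navigating to the start."""
    state["page"] = "home"

def ensure_at_start(state):
    """Validate the automation is where it should be; recover if not."""
    if not on_start_page(state):
        recover_to_start(state)
    return on_start_page(state)
```

Calling `ensure_at_start` at the top of each iteration stops one failed step from derailing every iteration that follows.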


Logon and logoff processing should occur only once, unless there is a specific requirement otherwise. As the script iterates, each iteration should finish at the point where it started.
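The intended structure can be sketched as follows, assuming hypothetical action names; the point is that logon and logoff sit outside the iteration loop, and each iteration ends where the next one begins:

```python
def run_scenario(iterations, actions):
    """Log on once, run each business-flow iteration from the same
    starting point, then log off once."""
    log = []
    log.append("logon")            # once, before any iterations
    for _ in range(iterations):
        for action in actions:     # each iteration starts and ends
            log.append(action)     # at the same point in the flow
    log.append("logoff")           # once, after all iterations
    return log
```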


Scripts should be well commented, detailing the purpose of the script as well as the individual steps.


Response times for key actions should be collected.
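One simple way to collect these timings, sketched in Python with `time.perf_counter`; load-testing tools usually provide equivalent transaction markers:

```python
import time

def timed(name, action, results):
    """Run a key action and record its response time under a name."""
    start = time.perf_counter()
    value = action()                          # the key action being measured
    results[name] = time.perf_counter() - start
    return value

results = {}
timed("customer_search", lambda: sum(range(1000)), results)
```

Naming each timed action consistently makes the results easy to aggregate and compare across test runs.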


Wherever possible, input data should vary, e.g. the postcodes entered should differ between iterations. Where a range of data is used, the values can be stored in external files that the automation reads. Navigation through a function should also vary, e.g. locating a customer by name as well as by postcode.
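Both ideas can be sketched as below, assuming a simple CSV data file and made-up route names; the data is kept outside the script and both the value and the navigation path vary per iteration:

```python
import csv
import io
import random

def pick_postcode(csv_text):
    """Pick a random postcode from an external data file (CSV here)."""
    rows = list(csv.reader(io.StringIO(csv_text)))
    return random.choice(rows)[0]

def pick_search_route():
    """Vary the navigation: locate a customer by name or by postcode."""
    return random.choice(["by_name", "by_postcode"])
```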


Repetitive actions should be placed inside loops.


Large blocks of code that sit inside nested if statements, or are repeated throughout the automation, can be moved into a subroutine that is called where needed. This aids the readability and maintenance of the automation.
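A small illustration of the refactor, with hypothetical step names: the repeated search-and-open steps live in one subroutine that each part of the flow calls, rather than being duplicated in every branch:

```python
def open_customer_record(customer_id, log):
    """Repeated navigation steps moved into a single subroutine."""
    log.append(f"search:{customer_id}")
    log.append(f"open:{customer_id}")

def business_flow(log):
    # Instead of repeating the search/open steps in each branch,
    # every branch calls the same subroutine.
    open_customer_record("C001", log)
    open_customer_record("C002", log)
```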


Some values entered need to vary so that different parts of a database table are accessed. Data like this can be placed in parameter files so that a different value is used for each iteration of the business flow.
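A minimal sketch of a parameter feed, assuming the values have already been read from a parameter file: each iteration of the business flow draws the next value, wrapping around when the file is exhausted:

```python
import itertools

def parameter_feed(values):
    """Cycle through parameter-file values so each iteration of the
    business flow uses a different value."""
    return itertools.cycle(values)

feed = parameter_feed(["1001", "1002", "1003"])
```

Cycling sequentially (rather than choosing at random) guarantees every value is exercised and makes runs easier to reproduce.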


User think time should be placed at appropriate points in the automation to simulate the way a real user keys a function.
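A simple think-time helper, sketched in Python; the bounds are illustrative, and real tools provide equivalents. Randomising the delay within a range avoids every virtual user pausing in lockstep:

```python
import random
import time

def think_time(low=2.0, high=5.0):
    """Pause for a randomised think time to mimic a real user pausing
    between actions. Returns the delay used, for logging."""
    delay = random.uniform(low, high)
    time.sleep(delay)
    return delay
```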


Naming and other coding standards should be in place to aid the readability of the automation, and to help the scripts work together when several people are involved in writing them.
