Agile is the name of the game when it comes to project delivery these days. No surprise, then, that the ‘Definition of Done’ (DoD) has become crucial for a high-performing scrum team. The reason is simple: the DoD lists the key activities required to deliver a successful feature, sprint, or release. However, the DoD should not stop at delivering a potentially shippable product; it should also complement user acceptance criteria on both the functional and non-functional fronts. In an agile project, the DoD should include activities for introducing and completing performance testing within the sprint, rather than following a big-bang approach that can delay a product's time-to-market.
Even when the DoD includes performance testing, the testing is often carried out only during the last few days of the sprint cycle, and several challenges follow from that.
So how does one introduce performance testing at the very outset of a sprint, soon after coding starts? The answer lies in adopting a comprehensive approach to the process. Once functional testing of the newly developed features is complete, testing for performance and/or response times should commence. The performance workload models should be derived well ahead of this point and built into unit performance testing to get the maximum throughput. Component performance testing, with downstream applications stubbed out, helps here: it reveals how long a given piece of functionality takes to execute in isolation. Then, as the newly developed functionality is integrated with the overall system, performance testing should confirm that the new features introduce no performance bottlenecks or surprises in overall system behavior. Most importantly, conducting performance tests during the regression phase helps identify issues in configuration and in the sizing of hardware and infrastructure.
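The component-level idea above can be sketched in a short unit performance test: stub out the downstream call so that only the component's own cost is measured, then check the elapsed time against a budget. The function names, stub payload, and budget below are illustrative assumptions, not part of the original article.

```python
import time
import statistics
from unittest.mock import patch

# Hypothetical component under test: it calls a downstream service,
# then does some local processing. All names here are illustrative.
def fetch_downstream(order_id):
    raise NotImplementedError("real network call; stubbed out in tests")

def process_order(order_id):
    data = fetch_downstream(order_id)
    return {"order_id": order_id, "total": sum(data["line_totals"])}

def measure_component_latency(runs=50, budget_ms=5.0):
    """Time process_order with the downstream dependency stubbed,
    so only the component's own execution cost is measured."""
    stub_response = {"line_totals": [9.99, 4.50, 12.00]}
    timings_ms = []
    with patch(f"{__name__}.fetch_downstream", return_value=stub_response):
        for i in range(runs):
            start = time.perf_counter()
            process_order(order_id=i)
            timings_ms.append((time.perf_counter() - start) * 1000.0)
    # 95th percentile: the last of 19 cut points dividing the data into 20 slices
    p95 = statistics.quantiles(timings_ms, n=20)[-1]
    return p95, p95 <= budget_ms
```

Running such a check inside the sprint, every time the component changes, is what lets the team spot a regression at the code/feature level rather than during a late, big-bang test.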
From the agile perspective, it is imperative that the performance tester start by reviewing the high-level user stories created for functional testing, deriving workload models from clickstream usage, and formulating the requisite performance test strategy during the planning phase, since this is when teams prioritize the features to be delivered in the current sprint. Performance testers should work closely with product owners to define performance acceptance criteria for selected user stories in terms of response times, assessing performance at the code, feature, and integrated-system levels; this work needs to happen in parallel within the sprint. Performance testers should also assist teams with recommendations on performance best practices to follow during design and code tuning/optimization. Test organizations, as well as performance testers, should give considered thought to service-level objectives, test data preparation, and reusable/shared testing assets.
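One way to make such performance acceptance criteria concrete is to express them as data per user story and check measured response times against them each sprint. This is a minimal sketch under assumed story IDs, metric names, and thresholds; none of these specifics come from the article.

```python
from dataclasses import dataclass

# Per-story performance acceptance criteria, expressed as data so they
# can be evaluated automatically. Stories and thresholds are hypothetical.
@dataclass
class AcceptanceCriterion:
    story: str
    metric: str           # e.g. "p95_response_ms"
    threshold_ms: float

def evaluate(criteria, measurements):
    """Return the stories whose measured value is missing or over threshold."""
    failures = []
    for c in criteria:
        measured = measurements.get((c.story, c.metric))
        if measured is None or measured > c.threshold_ms:
            failures.append(c.story)
    return failures

criteria = [
    AcceptanceCriterion("US-101 search", "p95_response_ms", 800.0),
    AcceptanceCriterion("US-102 checkout", "p95_response_ms", 1200.0),
]
measurements = {
    ("US-101 search", "p95_response_ms"): 640.0,
    ("US-102 checkout", "p95_response_ms"): 1450.0,
}
print(evaluate(criteria, measurements))  # → ['US-102 checkout']
```

Keeping the criteria next to the user stories gives the product owner and the performance tester a shared, testable definition of "fast enough" for the sprint.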
Adopting an agile approach to performance testing is not difficult when proven delivery best practices are followed.
Suffice it to say that performance testing in an agile project environment allows flexible management of testing: the team can revisit the project vision and reprioritize tasks based on the value they add to the performance test effort at any given point in time.
Key considerations for agile performance testing include: