Agile is now the dominant model for project delivery, which makes the ‘Definition of Done’ (DoD) critical for a high-performing Scrum team. The reason is simple: the DoD lists the key activities required to complete a feature, a sprint, or a release. However, the DoD should not stop at delivering a potentially shippable product; it should also align with user acceptance criteria on both the functional and non-functional fronts. In an agile project, the DoD should therefore include introducing and completing performance testing within the sprint, rather than deferring it to a big-bang phase that can delay time-to-market.
Even when the DoD includes performance testing, the testing is often squeezed into the last few days of the sprint cycle. The challenges that follow include:
- Performance testing is concentrated at the module level only; system-level testing, which integrates all the modules, is missed
- The readiness of the test suite is questionable while development is still ongoing
- The team’s velocity is throttled, since multiple tasks must be accommodated within the sprint
- Development activity must effectively stop before the sprint ends – which starts to resemble traditional waterfall
So how does one introduce performance testing at the very outset of a sprint, soon after coding starts? The answer lies in adopting a comprehensive approach to the process. Once functional testing of the newly developed features is complete, testing for performance and response times should commence. The performance workload models should be derived well ahead of this and fed into unit-level performance testing to get the maximum benefit. Component performance testing, with downstream applications stubbed out, helps here: it isolates how much time a given piece of functionality takes on its own. Then, as the newly developed functionality is integrated with the overall system, performance testing should be repeated to ensure that the new features introduce no bottlenecks or surprises in overall system behavior. Most importantly, conducting performance tests during the regression phase helps identify issues in configuration and in the sizing of hardware and infrastructure.
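The component-level idea above can be sketched in Python. In this hypothetical example (all names are illustrative, not from any specific system), the downstream price lookup is injected so a stub can replace the real network call, letting the test measure only the component’s own processing time:

```python
import time

# Hypothetical checkout component; the downstream price lookup is
# injected so component-level tests can stub out the network call.
def checkout(sku, fetch_price):
    price = fetch_price(sku)
    return round(price * 1.08, 2)  # add a flat 8% tax

# Stub standing in for the real downstream pricing service:
# a canned response with no network latency.
def stub_fetch_price(sku):
    return 10.0

def measure_checkout(sku, runs=1000):
    """Time the component repeatedly and return (result, avg seconds)."""
    start = time.perf_counter()
    for _ in range(runs):
        total = checkout(sku, stub_fetch_price)
    avg_s = (time.perf_counter() - start) / runs
    return total, avg_s

total, avg_s = measure_checkout("SKU-1")
print(f"total={total}, avg component time={avg_s * 1e6:.1f} us")
```

Because the downstream call is stubbed, any latency measured here belongs to the component itself, which is exactly what unit-level performance testing aims to isolate.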
From the agile perspective, it is imperative that the performance tester starts by reviewing the high-level user stories created for functional testing, deriving workload models and clickstream usage from them, and simultaneously formulating the requisite performance test strategy during the planning phase. This matters because planning is when teams prioritize the list of features to be delivered in the current sprint. Performance testers should work closely with product owners to define the performance acceptance criteria for selected user stories in terms of response times, assessing performance at the code, feature, and integrated-system levels. This needs to happen in parallel within the sprint. Performance testers should also assist teams with recommendations on performance best practices to follow during design and code tuning. Test organizations, as well as performance testers, should give considered thought to service-level objectives, test data preparation, and reusable/shared testing assets.
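One way to make such acceptance criteria actionable is to record them in machine-readable form so test runs can be checked against them automatically. A minimal sketch, where the stories and thresholds are invented for illustration:

```python
# Hypothetical performance acceptance criteria agreed with the
# product owner for selected user stories.
ACCEPTANCE_CRITERIA = {
    "search-results": {"p95_ms": 800, "error_rate": 0.01},
    "add-to-cart": {"p95_ms": 300, "error_rate": 0.005},
}

def meets_criteria(story, p95_ms, error_rate):
    """Return True if measured results satisfy the story's criteria."""
    c = ACCEPTANCE_CRITERIA[story]
    return p95_ms <= c["p95_ms"] and error_rate <= c["error_rate"]

print(meets_criteria("add-to-cart", 250, 0.0))  # within budget
print(meets_criteria("add-to-cart", 400, 0.0))  # p95 budget exceeded
```

Keeping the criteria alongside the user stories makes the DoD checkable: a story is not done until its measured numbers pass this gate.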
Adopting the agile approach to performance testing is not too difficult when the following delivery best practices are followed:
- Interact closely with end users and stakeholders to define acceptance criteria for each performance story
- Collect all performance-related requirements and address them during system architecture discussions and planning
- Form the performance testing team early in the project (during the planning and infrastructure stages) to plan the right capacity
- Plan for performance testers to work on test cases/scripts and test data preparation, while developers code for user stories
- Involve performance testers to create stubs for external web services, downstream applications and other in-progress ecosystem apps
- Establish a system to provide continuous feedback to developers, DBAs, architects, system analysts
- Schedule performance tests during non-office hours to optimize time utilization within a sprint, and include them as a step in continuous integration
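Several of the practices above (unattended runs, CI integration, continuous feedback) can be combined in a small scripted step that drives concurrent load and checks a 95th-percentile budget. This is a sketch under assumed names; in a real pipeline `operation()` would call a deployed test-environment endpoint rather than a local placeholder:

```python
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

# Stand-in for the operation under test; in a real CI step this
# would hit a deployed test-environment endpoint.
def operation():
    t0 = time.perf_counter()
    sum(i * i for i in range(1000))  # placeholder workload
    return time.perf_counter() - t0

def run_load(users=8, iterations=50):
    """Drive concurrent calls and collect per-call latencies."""
    with ThreadPoolExecutor(max_workers=users) as pool:
        return list(pool.map(lambda _: operation(), range(users * iterations)))

def check_slo(latencies_s, p95_budget_s=0.01):
    """Return (met, p95): whether the 95th percentile fits the budget."""
    p95 = statistics.quantiles(latencies_s, n=20)[18]  # 95th percentile
    return p95 <= p95_budget_s, p95

latencies = run_load()
ok, p95 = check_slo(latencies)
print(f"p95={p95 * 1000:.2f} ms, SLO met: {ok}")
```

Exiting non-zero when the SLO is missed turns this into a gating CI step; scheduled overnight, it gives developers, DBAs, and architects feedback by the next morning.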
Suffice it to say that performance testing in an agile project environment allows testing to be managed flexibly. It allows revisiting the project vision and reprioritizing tasks based on the value they add to the performance test at any given point in time.
Key considerations for agile performance testing include:
- Focusing performance testing on specific areas rather than taking a big-bang approach
- Preparing test data, which may need to change every sprint
- Trending – While agile focuses on continuous development, performance trends should also be plotted across sprints to demonstrate continuous improvement
- Reusable assets – Performance test suites should be designed to minimize change between sprints, even though the payload may change drastically
- Component-level performance testing – Sprints deliver new features as well as tweaks to internal components, so the performance test suite should be robust enough to validate performance at the tier level
- Continuous integration – Unattended performance tests should run by integrating with the continuous build system, allowing them to execute automatically after logical check-ins and drops