QA Vector Research: Delivering a Quality Process Architecture
Business leaders and their customers demand better, faster and more cost-effective software releases. However, CTOs and quality managers lack effective means of showing executive management the value the QA process delivers. They lack agreed measures and benchmarks to demonstrate RoI.
Our ongoing research among financial institutions' QA and testing leads finds that the key factors for investment in this area are improved quality of software (cited by 68% of respondents in a recently published summary), speed of delivery (60%), an improved product for customers (59%) and cost savings (47%). That’s hardly surprising – but how do you show you have delivered? And that is especially challenging when, in reality, resources are tied up redressing infrequent, buggy releases rather than planning for active improvement. In our research on test data management, 32% of respondents reported that significant bugs were going into production, 20% that regulatory or project commitments for their business were being affected, and 16% that they had suffered severe business production incidents and outages.
A transition to agile modes of development is widely held to be key to solving these problems. However, among financial institutions, adoption is patchy and often fails to embrace the required cultural transformation: the human factor. Waterfall mindsets, controls and processes remain a core part of the SDLC.
This paper, the first in a series of four, examines factors contributing to the development of this situation and the main problems it causes; it then points towards solutions, framed by the QA Financial “Quality Process Architecture.”
For banks constrained by the heavy burden of maintaining compliance, structured, well-documented waterfall disciplines are hard to relinquish in favour of an agile approach. With complex legacy processes, software and infrastructure, and an organisational culture bound closely to waterfall practices, well-founded caution as well as practical obstacles inhibit or even block transition. Further, with years of critical, evolving and layered investment bound up in legacy systems, reorganising data into new formats compatible with agile is an intensive, costly and risky challenge. The problem may well be exacerbated by the starting point: non-production data is often an existing, underfunded point of pain.
In addition to leaving organisations behind more agile competition, exclusive dependency on waterfall methodology perpetuates existing problems and drives new ones as demands for extended and integrated capabilities increase. The siloed, structured nature of such development, with its walls between processes and its business-line- and activity-specific staff allocations, closely mirrors the siloed nature of most financial institutions. The maintenance of siloed operations and information leads to costly delays when critical errors, specification drift or resource deficits block progress.
With teams communicating little and often competing rather than collaborating, and with stub integrations used in place of end-to-end, production-like development and test environments, QA leaves critical, unintended gaps in the proving process. This is underpinned by non-compliant, insufficient or inaccurate non-production data, as we found in our recent report on test data management.
With increasing digital competition, the move to a more agile mode of operation is a business imperative that promises both software and business improvements. FinTechs and e-commerce firms, not bound by the same level of regulation, unlikely to hold as much critical personal consumer data, and free of myriad legacy systems and working practices, have the advantage. And while the standing of banks depends crucially on perceived security, affecting their business in the public, private and regulatory spheres, breaches in the applications of FinTechs and e-commerce firms do not carry the same consequences.
When software development moves to an agile world, QA is typically no longer confined to a single phase and a closed room; instead, it permeates the entire SDLC. Developers, QA specialists and business managers work together to build quality into the whole process. In smaller, disaggregated teams, communication barriers give way to continuous communication, empowering individuals to test for and resolve problems quickly as they arise. Roles must be transformed: developers are trained by a quality lead to test their own code and to use automated tools. With testing integrated throughout, it is more accurate, resources and knowledge are better shared, and significant roadblocks are removed, allowing competitive releases of software products.
Yet agile is by no means a panacea for software development. Without quality designed into the process, with appropriate gateways for key organisational, compliance and non-functional testing, it may simply lead teams to put bad software into production faster.
For banks seeking the benefits of agility while keeping the important benefits and controls of waterfall, a structured transition plan is required, one that reaches to the heart of the business organisation and involves IT, QA and the business. Proper quality design must be recognised as a key factor in the success of agile adoption, and as a significant blocker where it remains unrecognised, underfunded and neglected. This means integrating much traditionally specialist QA work into development, resulting in fewer designated testers and more software engineers trained to test their own code and exploit automation tools, empowered by a QA lead. Only limited functions will remain the locked domain of QA. Quality must therefore be considered from planning onwards and play a role throughout the SDLC.
In a recent conversation with Aston Martin Red Bull Racing we discussed their productive method of combining agile QA with the controls of waterfall. For Formula 1, software development needs to be relatively rapid, but new software must only go into production with an exceptionally low expectation of failure in the critical windows, especially the two-hour race window. Development proceeds in an agile manner, but at a certain point the software is "thrown over the wall" into an entirely ring-fenced QA lab and will not be put into production until the head of QA certifies it. Red Bull thus leverages agile while retaining a critical gating waterfall process before implementation. One of the key measures the team uses to benchmark the effectiveness of this approach is the number of bugs found in software that reaches the QA lab, alongside the number that make it into production.
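A benchmark of this kind is often expressed as a defect "escape rate": the share of defects that slip past the QA gate into production. The sketch below is a minimal illustration of that calculation; the function name, stage labels and example figures are our own assumptions, not measures published by the team.

```python
# Illustrative sketch of a defect escape-rate benchmark for a gated
# agile-waterfall process. Names and figures are hypothetical.

def escape_rate(found_in_qa_lab: int, found_in_production: int) -> float:
    """Share of all recorded defects that slipped past the QA gate."""
    total = found_in_qa_lab + found_in_production
    return 0.0 if total == 0 else found_in_production / total

# Hypothetical release cycle: 24 bugs caught in the QA lab, 2 in production.
rate = escape_rate(found_in_qa_lab=24, found_in_production=2)
print(f"Defect escape rate: {rate:.1%}")  # prints "Defect escape rate: 7.7%"
```

Tracked per release, a falling escape rate would indicate that the gating QA stage is catching an increasing proportion of defects before they reach the critical production window.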
The answer is not waterfall or agile, but hybrid agile-waterfall. How do we structure an appropriate, compliant and productive hybrid agile-waterfall culture?
In the transition from pure waterfall to a more agile methodology of project management, we see three dominant challenges: planning, culture change and infrastructure setup. We need a new process paradigm.
In the coming papers we will set out the infrastructure and planning required for a Quality Process Architecture that permeates the SDLC; propose a model for a hybrid agile-waterfall schema, describing the management and cultural approach needed; and finally, we will discuss benchmarking success, suggesting appropriate measures you should be looking at.
It is these benchmarks that will provide the readily communicated measures of success, allowing quality managers to demonstrate value to the businesses, and the budget holders, that they serve.