QA Vector Research: The Right Foundations for a Quality Process Architecture

14 February 2019
The second article in our Quality Process Architecture series outlines the necessary organisation, processes and infrastructure businesses should have in place to embed quality throughout the SDLC as it evolves from waterfall to agile.

In our previous paper (Delivering a Quality Process Architecture), we explained some of the challenges that arise from the transition from a waterfall world to one that is more agile, and the difficulties arising from a lack of agreed benchmarks for the contribution and effectiveness of the QA process.

This paper focuses on what needs to be in place for the Quality Process Architecture to succeed: the degree of integration of processes, the infrastructure, and the human and team values.

Integrate QA throughout

The quality of preparation, communication and information throughout the lifecycle determines the quality of outcomes. Consistency in these elements will determine the viability and meaningfulness of any measurement of the QA process.

Developing and maintaining this quality requires establishing an organisational structure, a QA infrastructure and a ready supply of non-production data. Building quality thinking into the setup benefits development directly: preparing for and ingraining a quality architecture throughout the SDLC results in a better product, faster.

Such an approach can be expected to challenge a culture of business and functional silos. And the approach has to be built into individual attitudes; as one Director of DevOps at a global bank observed to us ruefully, “Quality is not built in from the get-go. Teams are just delivering bad quality into production faster. Building quality is not yet something that is done well across the organisation.”

Legacy processes and infrastructure cannot be transformed without a strategic programme of change that addresses the obstructions of culture as well as of toolsets and controls. This requires buy-in to the programme and commitment to the process from business, development and QA, and the breaking down of the traditional barriers between them. True transformation requires an investment of time, money, resources and effort to establish a resilient framework that integrates quality at the heart of the organisation.

Fully integrating QA throughout the SDLC is integral to success. Quality provisioning must be resourced, budgeted and trained for. And it must be applied to the oversight, specification and integration of services provided by third parties.

Speaking at Bloomberg, Megan Butler, Financial Conduct Authority (FCA) Executive Director of Supervision – Investment, Wholesale and Specialists, raised her concerns: “Only 66% of large firms and 59% of smaller firms tell us that they understand the response and recovery plans of their third parties.”

In the year to October, the FCA saw a 138% increase in technology outages and an 18% increase in cyber incidents. The FCA is therefore paying closer attention to the management of controls and risks, advising that boards and senior management take responsibility for technology resilience. That implies attention to quality in development, in the oversight of suppliers and in the ongoing provisioning and delivery of technology-reliant services (which includes just about everything).

How, then, do you set up your development organisation, process and infrastructure to embed quality?

Centralise management and deployment of infrastructure and data

Non-production infrastructure and data, like their production counterparts, are heterogeneous, with multiple owners, touchpoints and domain experts. But each component user and piece of software is dependent on the others to test and deliver successfully.

QA Vector Research on non-production data has provided many illustrations of the problems caused by local responsibility for test data. Only a virtualised end-to-end service will permit reliable testing of a newly developed component; stub-API environments lack completeness in a manner that all too often rebounds when the solution moves to production. With centralised ownership and a clear view of infrastructure and data provision, production-like infrastructure and resources can be shared across teams, speeding the discovery and resolution of development challenges. Similarly, clearly defined, consistent group policies for security and compliance streamline security processes and accountability, and allow cross-pollination of concerns and ideas for improvement. Planning and building a robust infrastructure and communication structure will also play a role in benchmarking success and maximising RoI.

Of course, a common non-production infrastructure with test data, into which a new development component can be introduced, provides a natural test bed for a range of automated non-functional tests. Pre-defined and ready to run, the datasets required for benchmarking and interpreting the outcome of new tests will be on tap.
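
To make this concrete, here is a minimal Python sketch of such a test, assuming a central environment service exists; the helpers `provision_environment` and `load_benchmark_dataset`, the service name, dataset name and latency budget are all hypothetical stand-ins, not a real API.

```python
# Hypothetical sketch only: provision_environment and load_benchmark_dataset
# stand in for whatever central environment and data services an organisation
# runs; they are not a real API.
import time


def provision_environment(name: str) -> dict:
    """Stand-in for a request to the central non-production environment service."""
    return {"name": name, "endpoint": f"https://nonprod.example.internal/{name}"}


def load_benchmark_dataset(env: dict, dataset: str) -> list:
    """Stand-in for pulling a pre-defined, versioned benchmark dataset."""
    return [{"trade_id": i, "amount": 100 + i} for i in range(1000)]


def test_latency_under_load() -> None:
    env = provision_environment("payments-v2")
    trades = load_benchmark_dataset(env, "eod-trades-q1")
    start = time.perf_counter()
    processed = [t for t in trades if t["amount"] > 0]  # placeholder workload
    elapsed = time.perf_counter() - start
    # Non-functional assertions: correctness plus an agreed latency budget.
    assert len(processed) == len(trades)
    assert elapsed < 1.0, f"latency budget exceeded: {elapsed:.3f}s"


if __name__ == "__main__":
    test_latency_under_load()
    print("non-functional benchmark passed")
```

Because the environment and dataset are provisioned centrally rather than assembled by each team, the same benchmark can run unchanged wherever the component is tested.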

Create an interactive network

Agile thinking requires the people who make up the organisation to jettison their paradigm of departments separated by notional walls. Rather, they need to think in terms of a so-called interactive network. This means reaching across departments and functional lines. And it cannot be imposed top-down but rather requires buy-in from everyone, including QA, to transcend long-standing boundaries. This also entails effective integration of non-IT functions from the business lines to compliance and risk controls.

We are talking here about an approach to quality and development that needs to be embedded in the mindsets of business and technologists alike, and in the processes and organisation that facilitate delivery. With change requester, coder and tester close together – and on occasion being the same person – communication is faster and more accurate, so that testing can similarly be more on point and results addressed rapidly.

Using Continuous Integration (CI), Continuous Delivery (CD) and automation tools, structures should be put in place that improve the efficacy of team integration. In the new world of DevOps, QA can no longer exist as distinct from development. Developers must be empowered to test their own code under the direction of a quality lead. By establishing lines for sharing information and resources, teams will also be able to make use of tools such as service virtualisation more effectively, enabling continuous development and testing, and eliminating the need for some infrastructure. Embracing the opportunities provided by these new technologies, and by emerging ones such as machine learning, relies on the successful development of a Quality Process Architecture: one that creates an effective framework for relationships and solves the problems of data, legacy systems and culture that poor management, undirected fear of growing regulation and lack of investment have allowed to develop.
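
To make the service-virtualisation point concrete, here is a minimal sketch, assuming recorded responses are available to replay; the endpoint, canned rate data and port are invented for the example and do not reflect any particular virtualisation product.

```python
# Illustrative sketch only: a virtualised downstream service replays recorded
# responses so developer-written tests can run in any CI stage without the
# real system. All paths, data and the port are invented for this example.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from threading import Thread
from urllib.request import urlopen

RECORDED = {"/fx/rate/EURUSD": {"pair": "EURUSD", "rate": 1.1342}}  # canned data


class VirtualService(BaseHTTPRequestHandler):
    """Replays recorded responses in place of the real downstream system."""

    def do_GET(self):
        body = json.dumps(RECORDED.get(self.path, {"error": "not recorded"}))
        self.send_response(200 if self.path in RECORDED else 404)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body.encode())

    def log_message(self, *args):  # keep CI logs quiet
        pass


def start_virtual_service(port: int = 8099) -> HTTPServer:
    server = HTTPServer(("localhost", port), VirtualService)
    Thread(target=server.serve_forever, daemon=True).start()
    return server


if __name__ == "__main__":
    server = start_virtual_service()
    # A developer-written test hits the virtualised endpoint exactly as it
    # would the real service, so the same test runs in every pipeline stage.
    with urlopen("http://localhost:8099/fx/rate/EURUSD") as resp:
        rate = json.load(resp)
    assert rate["pair"] == "EURUSD"
    server.shutdown()
    print("test passed against virtualised service")
```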

Non-production data on tap: the essential life-blood of development

Having non-production data ready ahead of time is core to the Quality Process Architecture. As we concluded in our research with Delphix, “standardised, centralised, on-demand, well-organised and compliant data can no longer be a nice-to-have, but rather must be a must-have.” It is a requirement not only to build new developments, but also to test them. And the data (both clean and credibly dirty) needs to come from multiple business sources and owners. It must be maintained, updated, refreshed and ready on demand. Assembling it ad hoc, only when needed, will inevitably lead to delays and weaknesses in the testing process.
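
One possible shape for such a refresh process is sketched below, assuming masked copies are generated from source systems on a schedule; the field names, masking scheme and injected defects are assumptions for illustration only, not a prescribed design.

```python
# A minimal sketch, assuming masked copies are refreshed from source systems;
# field names, masking scheme and injected defects are illustrative only.
import copy
import hashlib
import random


def mask(value: str) -> str:
    """Deterministic pseudonym, so joins across tables still line up."""
    return hashlib.sha256(value.encode()).hexdigest()[:12]


def refresh_test_data(source_rows: list) -> dict:
    """Produce a clean (masked) dataset and a credibly dirty variant."""
    clean = [dict(row, account_holder=mask(row["account_holder"]),
                  iban=mask(row["iban"])) for row in source_rows]
    dirty = copy.deepcopy(clean)
    random.seed(42)  # reproducible defect injection for benchmarking
    for row in random.sample(dirty, k=max(1, len(dirty) // 10)):
        row["iban"] = ""  # the kind of missing mandatory field production yields
    return {"clean": clean, "dirty": dirty}


if __name__ == "__main__":
    source = [{"account_holder": f"Customer {i}", "iban": f"GB00TEST{i:08d}",
               "balance": 1000 + i} for i in range(50)]
    datasets = refresh_test_data(source)
    bad = sum(1 for r in datasets["dirty"] if not r["iban"])
    print(len(datasets["clean"]), "clean rows;", bad, "rows with injected defects")
```

Keeping the masking deterministic preserves referential integrity across refreshes, while the dirty variant ensures tests exercise the defects production actually produces, not just the happy path.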

In our research with leaders in QA and development on test data management, 35% of respondents found that issues with test data were leading to minor issues and bugs, 17% that it was impacting regulatory or project commitments for the business, and 13% that it was leading to severe business production incidents and outages (QA Financial Research, 2018).

Data preparation facilitates execution: “People sometimes see data governance as preventative, and think that they have to circumvent it to get the job done. What they miss is that it helps, not hinders. The risks would be flagged and the solutions would be in place; rather than being reactive, teams should be proactive,” according to a senior technologist at a UK bank.

An integrated framework for data management is a powerful enabler for resolving the bottlenecks created by poor data management and a lack of availability. While only 19% of respondents have such a framework in their organisation, 76% are confident it will have a role to play in the future. With only 17% anonymising and continuously refreshing data, against a backdrop of increasing regulatory interest in QA activities and the 2018 implementation of GDPR, the consideration and provision of non-production data must be addressed in the design of the development architecture.

Setting up quality to deliver better, sooner

Regulators are paying increasing attention to financial organisations’ management of known risks and to board-level involvement in developing technology resilience. This requires centralised oversight, careful pre-planning of resources, and the involvement of business, IT and QA across the SDLC. Addressing the challenges set out in this paper will not only lead to stronger, cleaner, more compliant data and infrastructure; the core culture change will enable delivery of a better quality, more resilient product.

Our next and final paper in this series will propose elements of effective benchmarking to demonstrate the value added by the Quality Process Architecture.
