Insourcing? First, know your software
Outsourcing has been the preferred approach to government IT project management for the past two decades, driven largely by cost efficiencies. This is changing, prompted in part by the opportunities offered by new technologies in automated software development and quality assurance, and in part by heightened security concerns and calls for accountability.
The GDPR becoming enforceable in May 2018 and the widely publicised NHS data leak in July (apparently due to a coding error) have coincided with an inquiry, launched in July by the UK Parliament’s Science and Technology Committee, into the progress of government digitisation initiatives.
Insourcing in the UK
The Home Office and Ofsted will be among the participants in a panel discussion on the changing role of third-party software vendors and the public sector, chaired by Paul Bentz as part of the Government and Public Sector Forum.
British government bodies are now leading the world in the amount of work they are insourcing, according to Paul Bentz, Director of Government and Industry Programs at the Consortium for IT Software Quality (CISQ), the international association that promotes software quality benchmarks. Research from GlobalData confirms the trend: according to the research firm, government outsourcing of IT functions peaked at £708m in 2012/13 before falling sharply to £535m in 2015/16.
“There can be many reasons for insourcing,” says Bentz, “but CISQ believes the first requirement is an understanding of the importance of universal benchmarks for software quality. These benchmarks help establish clear objectives and a focused approach when embarking on an organisation-wide IT transformation.”
CISQ was founded in 2010 as a joint effort between Carnegie Mellon University and the Object Management Group, a not-for-profit US technology standards body, to create standards for measuring the structural quality of software. According to CISQ, these are quantitative criteria that can be used to evaluate the performance of an IT system against four measures: reliability, security, performance efficiency, and maintainability.
The standards were created by compiling a list of known violations of good coding practice in each category. Software can then be ranked against these criteria according to the number of violations it contains.
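The violation-counting approach described above can be sketched in a few lines of Python. This is purely illustrative: the grading thresholds, the idea of normalising by lines of code, and the letter grades are all assumptions for the sake of the example, not CISQ's actual measures.

```python
# Hypothetical sketch of violation-based quality ranking.
# The four category names follow CISQ's measures, but the thresholds,
# letter grades and per-KLOC normalisation here are invented for illustration.

CATEGORIES = ["reliability", "security", "performance_efficiency", "maintainability"]

def rank(violations_per_kloc: dict) -> dict:
    """Map a violation density (violations per 1,000 lines of code)
    in each category to a simple letter grade."""
    def grade(density: float) -> str:
        if density < 1.0:
            return "A"
        if density < 5.0:
            return "B"
        if density < 10.0:
            return "C"
        return "D"
    return {cat: grade(violations_per_kloc.get(cat, 0.0)) for cat in CATEGORIES}

# Example: a system that is secure but hard to maintain.
scores = rank({"reliability": 0.4, "security": 2.1,
               "performance_efficiency": 7.8, "maintainability": 12.5})
print(scores)
# {'reliability': 'A', 'security': 'B', 'performance_efficiency': 'C', 'maintainability': 'D'}
```

The point of normalising by code size is that a raw violation count would unfairly penalise larger systems; density makes two applications of different sizes comparable against the same benchmark.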
CISQ works with various US government and regulatory bodies, including the Securities and Exchange Commission, the Department of Defense, and the Department of Science and Technology.
Prior to joining CISQ, Bentz spent 30 years running IT operations for large financial firms including Paribas and Allianz. In 2016, he took on his current role, promoting CISQ’s standards to regulators and government bodies, with a focus on their application in finance and healthcare.
The key challenge, says Bentz, comes when government organisations try to do too much too soon, for example moving IT operations in-house at the beginning of large-scale transformation projects. Assessing software quality and risk should be the first step in the process.
“If IT operations have been in the hands of a third-party SI [systems integrator], organisations know nothing about the quality of that software,” warns Bentz. “The first thing they need to focus on when moving operations in-house, therefore, is getting to know what they’re working with. They must investigate and measure how safe, well-written and documented the software is.”
“Organisations have to know how to allocate resources and time to getting to know their software and then use that information to define the scope of the transformation.”
The Fannie Mae Story
One example of a successful IT transformation done largely in-house is the Federal National Mortgage Association or Fannie Mae in the US.
In 2015, Fannie Mae recognised that it needed to transition to Agile-DevOps to keep pace with its competitors. The change would be difficult because of Fannie Mae’s sprawling ecosystem, comprising 461 apps in total.
As a first step, Fannie Mae performed a language-agnostic analysis of the entire system. It ran automated periodic evaluations of progress in quality, productivity, and delivery speed, and used automated analysis tools to detect flaws in application quality.
This allowed teams to address issues quickly. By aligning metrics across the organisation and automatically measuring productivity, the teams eventually made the improvements needed to justify an organisation-wide move to Agile-DevOps.
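A minimal sketch of how periodic automated evaluations can surface issues quickly is to compare successive scan results and flag any application whose quality worsened. The field names, the `ScanResult` structure, and the sample applications below are hypothetical; the article does not describe Fannie Mae's actual tooling.

```python
# Hedged sketch: flagging quality regressions between two automated scans.
# All names and figures here are invented for illustration.

from dataclasses import dataclass

@dataclass
class ScanResult:
    app: str
    violations: int   # coding-practice violations found by the analysis tool
    kloc: float       # application size in thousands of lines of code

def regressions(previous: list, current: list, tolerance: float = 0.0) -> list:
    """Return the apps whose violation density worsened between two scans."""
    prev = {r.app: r.violations / r.kloc for r in previous}
    flagged = []
    for r in current:
        density = r.violations / r.kloc
        if r.app in prev and density > prev[r.app] + tolerance:
            flagged.append(r.app)
    return flagged

before = [ScanResult("loan-servicing", 120, 80.0), ScanResult("underwriting", 40, 25.0)]
after = [ScanResult("loan-servicing", 150, 82.0), ScanResult("underwriting", 35, 26.0)]
print(regressions(before, after))  # ['loan-servicing']
```

Running such a comparison after every scan turns raw violation counts into an actionable worklist, which is what lets teams address issues quickly rather than discovering them at the end of a transformation.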
When undertaking such a comprehensive shift, clear lines of accountability are crucial. Often, this requires a complete cultural shift within the organisation.
“If we are contemplating a transformation, then risk is distributed throughout. A CIO might be in charge of the technical implementation, but the organisation has to take the risk of designing the blueprint – particularly within an Agile-DevOps context,” Bentz states, adding, “The culture shift that this requires means that digital internalisation and transformation initiatives are best undertaken separately and in stages.”
Paul Bentz will moderate a panel on the changing role of third-party software vendors and the public sector at QA Financial’s Government and Public Sector Forum on October 3. For more information and registration, click here.