Cobol Mainframes to GDPR: How financial services institutions are coping with the pace of change

Recently, I had occasion to address over a hundred executives from banks, NBFCs, hedge funds and other financial services firms in the UK at the QA Financial Forum London 2018. Several notable institutions from Europe and elsewhere were present, including Barclays, Lloyds, ABN Amro, UBS, Credit Suisse, BoA, Bloomberg, BNP Paribas and Goldman Sachs, to name a few, alongside hedge funds and investment banks. Attendees came from several areas, including compliance and risk management, quality assurance, and technology design and planning, as well as roles with broader architectural responsibilities.

To state that the financial services industry is seeing tremendous change is an understatement. There are the headwinds of the fintech disruption that we hear about daily, as well as regulatory and compliance requirements that continue to tighten. Customer expectations continue to grow, and managing them grows tougher every day. But these institutions are not sitting idle. There is a definite sense of urgency to transform both business and operational models, with a view towards both the top line and the bottom line. Consequently, digitization initiatives are top of mind, as technology is used both to enable and to fight disruption. I spoke on the value of optimizing automation principles as a pivot to balance the speed of digitization efforts while managing risk. There were some pointed questions from the audience, as they are embracing automation in various forms and taking the plunge into DevOps and cloud models. Subsequent conversations with attendees as well as other speakers were very insightful. Here are five nuggets I thought were worth sharing.

1. Data is the new oil, gold and everything else valuable put together. Ensuring complete data privacy is still a struggle. Data storage is also very distributed and complex to manage.

Data is a double-edged sword. On one hand, these financial institutions are privy to a lot of personal and financial data; on the other, they have a tremendous responsibility towards ensuring privacy and security. Handling data in fast-paced, dynamic environments is a challenge, as is ensuring it is ‘touched’ appropriately even for mundane testing operations. Accountability is difficult, particularly when public cloud is adopted or when external vendors are in the picture. Regulatory requirements like the General Data Protection Regulation (GDPR) have brought about significant overhead, without a lot of resource allocation to deal with it. There is recognition that cryptocurrencies will intersect with the mainstream very rapidly, and that back-end operational systems and regulations are still too muddled to accommodate them. Data storage is also quite distributed, non-uniform and spread across all forms of media. It is hard to know what is where, and there is active recognition that leakages could more readily occur from data sitting in backup drives than from someone hacking into a production server. Managing this is a complex task. Equifax and others are the poster children for these risks.
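One practical response that came up is masking production data before it ever reaches a test environment. Below is a minimal sketch of deterministic pseudonymization; the field names (name, iban, email) are hypothetical examples rather than any institution's real schema. Hashing with a per-environment salt keeps referential integrity across tables while keeping the raw values out of dev/test.

```python
# A minimal sketch of deterministic pseudonymization for test data.
# Field names are illustrative assumptions, not a real schema.
import hashlib

SALT = b"per-environment-secret"  # rotate per test environment

def pseudonymise(value: str) -> str:
    """Replace a PII value with a stable, irreversible token."""
    digest = hashlib.sha256(SALT + value.encode("utf-8")).hexdigest()
    return digest[:16]

def mask_record(record: dict) -> dict:
    """Mask the sensitive fields, pass everything else through."""
    sensitive = {"name", "iban", "email"}
    return {k: pseudonymise(v) if k in sensitive else v
            for k, v in record.items()}

if __name__ == "__main__":
    prod_row = {"customer_id": "42", "name": "Jane Doe",
                "iban": "GB33BUKB20201555555555", "balance": "1200.50"}
    print(mask_record(prod_row))
```

Because the same input always maps to the same token within an environment, joins across masked tables still work, which is what makes a scheme like this usable even for those mundane testing operations.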

2. Software investment is growing exponentially and software engineering headcount has grown 2-3X over the past few years. DevOps and Agile practices are being rapidly adopted, along with automation in pockets. Software lifecycle automation is ideal, but still fraught with potholes.

The popular adage that every company is a software company and software is eating the world is exemplified by one simple metric: headcount. These institutions have hired more software engineers than any other function within their organizations, whether locally or in offshore locations. Making these dev/test engineers productive and aligned with business transformation initiatives means adopting Agile methodologies and instituting DevOps practices (though most attendees didn't care for these labels). This is bringing shifts in organizational practices and influencing culture. While several execution elements have been automated, they aren't necessarily standardized across distributed organizations. Fragmented tool adoption across different teams contributes to further inefficiency. Building complex dev/test environments at scale in an expedited manner is still cumbersome, though not everyone feels the pain yet.
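One way to take the pain out of standing up those environments is to describe them declaratively, so every team builds from the same definition. The sketch below is illustrative Python, with made-up service names and images rather than any specific bank's stack; it shows the core idea of a blueprint that resolves service dependencies into a reproducible boot order.

```python
# A minimal sketch of a dev/test environment as code. Service names,
# images and versions are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Service:
    name: str
    image: str          # container image, pinned for reproducibility
    depends_on: list = field(default_factory=list)

@dataclass
class EnvironmentBlueprint:
    name: str
    services: list

    def boot_order(self):
        """Resolve dependencies so services start in a valid order."""
        ordered, seen = [], set()
        def visit(svc):
            for dep in svc.depends_on:
                visit(dep)
            if svc.name not in seen:
                seen.add(svc.name)
                ordered.append(svc.name)
        for svc in self.services:
            visit(svc)
        return ordered

db = Service("trades-db", "postgres:13")
api = Service("trades-api", "registry.example/trades-api:2.4", [db])
ui = Service("teller-ui", "registry.example/teller-ui:1.9", [api])

env = EnvironmentBlueprint("regression-test", [db, api, ui])
print(env.boot_order())   # ['trades-db', 'trades-api', 'teller-ui']
```

Pinning versions inside the blueprint is what makes the environment reproducible across distributed teams, which speaks directly to the fragmentation problem above.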

3. (Public) Cloud is still a risk, but they’re ready to take the plunge where regulations permit. Application migration will be selective. Mainframes running Cobol are still humming away!

One of the executives mentioned that he had finally secured the budget to go to the cloud, and wanted to “put something up there on AWS and see how things go and run things full steam till they shut it down”. There is considerable in-house intellectual property that would take significant time to replicate in the cloud. Different institutions are at various stages of their journey to the (public) cloud. They see hybrid deployment models, but don't necessarily see hybrid workloads for the same applications, nor any application being multi-cloud. They also don't think it is possible to lift-and-shift every application. Nor do they want to. For one, it isn't necessary: several systems are mainframes still reliably running Cobol applications, and nobody wants to touch them; the preference is to have them front-ended by something else that resides in the (public) cloud. Second, it would be quite expensive to do so. It is a fact that several of these Fortune 500 institutions have invested in a lot of technology over the past few decades and kept most of it even as they added new elements to the stack. We cannot dismiss these outright as legacy, as they are serving critical functions in many cases.
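That front-ending pattern is essentially a thin adapter: a modern, cloud-hosted interface that translates to and from the fixed-width records the mainframe already understands, leaving the Cobol system of record untouched. The copybook-style layout below is a made-up example for illustration, not a real record format.

```python
# A minimal sketch of fronting a Cobol mainframe with a cloud adapter.
# The fixed-width layout here is a hypothetical example: a 10-char
# account id followed by a 12-digit zero-padded amount in cents.
def to_copybook(account_id: str, amount_cents: int) -> bytes:
    """Render a request as the fixed-width record the batch job expects."""
    return f"{account_id:<10}{amount_cents:012d}".encode("ascii")

def from_copybook(record: bytes) -> dict:
    """Parse a fixed-width record (same layout, for this sketch) back
    into a JSON-friendly dict for the modern front end."""
    text = record.decode("ascii")
    return {"account_id": text[:10].strip(),
            "amount_cents": int(text[10:22])}

if __name__ == "__main__":
    request = to_copybook("ACC-001", 125000)
    print(request)                  # b'ACC-001   000000125000'
    print(from_copybook(request))   # round-trips the same layout
```

The point of the pattern is that all of the change happens in the adapter; the mainframe keeps processing the same records it has handled for decades.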

4. “Non-core” functions that were previously outsourced are becoming strategic, and vendor management in the new era has considerable challenges.

Several financial institutions have outsourced or off-shored many aspects of their software development, software maintenance, quality assurance and other non-core execution elements. This has given them distributed teams and allowed them to gain cost efficiencies. Vendors like Infosys, Capgemini, Accenture, Wipro and others have gained from this trend and built up commendable FSI practices. However, there are overheads to deal with as well. Many of these offshore vendors have good domain knowledge but poor business context. This has implications for testing, security and quality assurance. Since solution architects are most often co-located with the business and have the business context, they are most often the go-to people to ensure that nothing is “lost in translation”. An idea that appealed to many was to have the solution architects design the environment blueprints with the business context in mind and publish them into self-service catalogs to be used by distributed dev/test teams worldwide. A few of our global customers in the FSI sector, as well as large telcos, are doing this. It ensures standardization across teams through a non-disruptive workflow, and fewer elements are misunderstood. Productivity definitely increases.
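A minimal sketch of what such a catalog could look like, in illustrative Python with hypothetical blueprint names and policy notes: architects publish a vetted definition once, with the business context attached, and distributed teams check it out by name instead of hand-building their own variants.

```python
# A minimal sketch of a self-service blueprint catalog. Blueprint
# names, services and policy notes are illustrative assumptions.
CATALOG: dict = {}

def publish(name: str, blueprint: dict, context: str) -> None:
    """Architect-side: register a vetted blueprint with its context."""
    CATALOG[name] = {"blueprint": blueprint, "context": context}

def checkout(name: str) -> dict:
    """Team-side: fetch the standard definition, never a local variant."""
    entry = CATALOG[name]
    print(f"Business context: {entry['context']}")
    return entry["blueprint"]

publish(
    "eu-payments-regression",
    {"services": ["payments-api", "sanctions-screen", "audit-log"],
     "data_policy": "masked production subset only"},
    context="Screening must run before settlement; test data must be masked.",
)

env = checkout("eu-payments-regression")   # what an offshore team would run
print(env["services"])
```

Attaching the business context to the blueprint itself is what keeps it from being “lost in translation” between the architect and a distributed team.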

5. Security and compliance hang over everything like the sword of Damocles, yet several black holes persist in the organization. Managing certification and compliance with visibility isn't easy.

I could write a book about this. I deliberately put it last, else I would have ended up writing a lot about it upfront. In reality, this is a perennial issue for the financial services sector. But especially in Europe, with the GDPR deadlines looming, it is certainly top of mind for a lot of people. And it is not something they are thrilled about. Most view it as a necessary evil, but are pragmatic enough to accept that no regulation by itself will make things foolproof. I see the same posture among several US-based companies doing business in Europe, which have dedicated significant resources to getting ahead of this initiative.

Regulations aside, security is also top of mind. Without fancy names like DevSecOps or continuous security, these institutions have to put processes in place to work through the security checklist. However, the sheer velocity of code changes, as well as environment complexity, has made even simple tasks like applying a software patch pretty cumbersome. Whether it is certifying infrastructure, applications, or both, the system is under pressure to do so at the same pace as before. Standing up to auditors is another issue. Sometimes you cannot protect what you don't know. And there is certainly a lot of “legacy”. They do hire consultants and auditors to certify and provide reports. However, these dynamic environments change fast, and it is quite a challenge to ensure that even something recently audited isn't compromised. How do you perform quality assurance or security assurance in such dynamic environments? Many are employing automation here as well, bringing continuous security assurance in alongside the principles of continuous testing. Cyber ranges are getting attention because they help train employees and contractors in best practices in authentic environments.

Money as we know it, and our perception of it, is itself changing rapidly. The currencies of yesteryear could end up as mere commodity metals or plain paper. What then can these noble banking institutions, the gatekeepers and citadels of money, do to keep themselves relevant? While there will be some dinosaurs, my money is that most of these institutions will be successful in their transformation and will come out ahead.

Shashi Kiran, Chief Marketing Officer at Quali