By Stephen Fontanella

COVID-19 accelerated the adoption of emerging technologies, from automation to machine learning, and the banking industry was no exception. Having to close their physical branches, traditional banks rushed to adopt cloud solutions to accommodate remote customers, maintain services, and introduce new ones.
The transformation of banking technology was already taking place, with peer-to-peer payment apps like Venmo, mobile check deposits and digital wallets like Apple Pay and Google Pay becoming widely accepted. In many ways, the pandemic accelerated the plans of banks to enhance relationships with customers with technology and realize the time savings to be gained through automation.
All of that means handling more data in more ways from more locations than ever before, and 89 percent of banks have confirmed implementing, or at least planning to implement, hybrid cloud strategies for business continuity and resilience. JPMorgan Chase is a prime example, allocating $11 billion for investment in the cloud, along with artificial intelligence, big data and machine learning.
Given the cloud’s benefits of enhanced agility, efficiency and scalability, the horizon seems more promising, with 58 percent of financial services professionals foreseeing cloud usage increasing over the next year, according to the latest State of Database DevOps survey from Redgate Software. However, before diving into the cloud, banks may wish to consider four key factors to ensure a seamless migration.
Legacy infrastructure is the adversary of cloud innovation
COBOL has been a staple in the banking industry since the 1960s, with 43 percent of banks still leveraging that programming language today. COBOL is even involved in several critical financial processes, such as managing 95 percent of ATM card swipes and enabling 80 percent of in-person credit card transactions.
As an integral part of our purchases and transactions, it’s understandable why this technology remains in place after all these years. However, the demands of today’s customers are vastly different from the demands of customers from the 1960s, and COBOL-based systems lack one critical characteristic: real-time operational capability.
Nearly half of financial services professionals in Redgate’s survey—47 percent—revealed that residual legacy code created challenges to improving software delivery, along with the risk of disruption to existing services. To make matters worse, legacy code—and by extension, legacy software like COBOL—hinders banks from upgrading systems and implementing new technologies such as cloud applications.
While the industry is seeing a resurgence in learning COBOL due to the impact of COVID-19, this is firefighting rather than innovation. Old COBOL programs simply can’t keep up, and the goal is to make do for now, until digital transformation initiatives such as cloud and automation come in to replace them.
Embracing compliance in the cloud
Cloud migrations soared in the global banking industry due to COVID-19 in 2020, and industry professionals expect investments to double over the next five years. Banking is a global business, and with regulations like the General Data Protection Regulation in the European Union and the California Consumer Privacy Act in the U.S., there are several compliance hurdles banks must consider in their move to the cloud.
Both the GDPR and CCPA require businesses to identify and categorize the personally identifiable information they hold to ensure sensitive data is protected, so banks need to consider how they classify and anonymize data using methods like masking when moving to the cloud.
Many databases and servers hold a bank’s data, and classification helps to quickly identify what data lives where, along with who has access to it in particular parts of the business. Using that information, banks can easily determine which data needs to be masked before it goes into the cloud, is used for development and testing, or is moved. Remember that data is not static—it is refreshed all the time, and this should be an ongoing process rather than a one-time exercise.
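To make the masking step concrete, here is a minimal sketch of static data masking in Python. The field names (`ssn`, `account_number`, `email`) and the rule of keeping only the last four characters visible are illustrative assumptions, not a description of any particular bank’s classification scheme:

```python
import re

# Hypothetical sensitive fields; real schemas and classification rules differ.
SENSITIVE_FIELDS = {"ssn", "account_number", "email"}

def mask_value(field, value):
    """Replace sensitive characters while preserving length and format,
    so masked copies remain usable for development and testing."""
    if field == "email":
        local, _, domain = value.partition("@")
        return "x" * len(local) + "@" + domain
    # Keep the last four characters visible, mask the rest.
    return re.sub(r"\w", "x", value[:-4]) + value[-4:]

def mask_record(record):
    """Mask only the fields that classification flagged as sensitive."""
    return {
        field: mask_value(field, value) if field in SENSITIVE_FIELDS else value
        for field, value in record.items()
    }

customer = {"name": "Jane Doe", "ssn": "123-45-6789", "email": "jane@example.com"}
print(mask_record(customer))
# {'name': 'Jane Doe', 'ssn': 'xxx-xx-6789', 'email': 'xxxx@example.com'}
```

Because data is refreshed constantly, a routine like this would run as part of every copy or provisioning step, not as a one-off cleanup.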
Investing in tools that can help banks to automatically classify data and help determine where sensitivities lie not only ensures banks remain compliant, but helps ease the transition to cloud-based solutions.
When database deployments go awry
Database deployments are crucial, and the Redgate survey shows that 51 percent of banking professionals deploy changes to their databases once per week or more, compared to 47 percent across other industries. That’s due in part to their desire to provide, maintain and enhance a convenient and customized experience for their customers.
The high frequency of database deployments can also be attributed to the banking industry’s increasing shift to the cloud. To an outsider, a growing number of database deployments may not seem like a problem, but experienced practitioners know that more frequent deployments increase the chance of errors reaching production, and with them the need for hotfixes.
As the name implies, a hotfix is an urgent fix to an error in code that has been released to the production database during a deployment. Though hotfixes work for a while, a bank having to issue several hotfixes per day could mean that it is making changes to its production databases directly, not storing those changes in version control, and that its database deployment processes are not fully mature.
In addition, banks highlighted the following as the three major challenges facing database deployments in the survey:
- Synchronizing application and database changes.
- Overcoming different development approaches.
- Preserving and protecting business-critical data.
To address these challenges, avoid multiple hotfixes and better manage their database estates, banks should consider how they address database changes wherever that data lives. High performing IT teams typically implement version control for their database code and introduce DevOps processes for their database estates through the adoption of tooling that enables automation.
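As one hedged sketch of what version-controlled, automated database change can look like, here is a minimal migration runner in Python using SQLite from the standard library. The `schema_migrations` tracking table and the two example migrations are illustrative assumptions; in practice each migration would live as a separate versioned file in the same repository as the application code:

```python
import sqlite3

# Ordered, versioned schema changes; each entry stands in for a file
# committed to version control alongside application code.
MIGRATIONS = [
    ("001_create_accounts",
     "CREATE TABLE accounts (id INTEGER PRIMARY KEY, owner TEXT NOT NULL)"),
    ("002_add_balance",
     "ALTER TABLE accounts ADD COLUMN balance REAL NOT NULL DEFAULT 0"),
]

def migrate(conn):
    """Apply any migrations not yet recorded, so every environment
    (dev, test, production) reaches the same schema the same way."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS schema_migrations (version TEXT PRIMARY KEY)"
    )
    applied = {row[0] for row in conn.execute("SELECT version FROM schema_migrations")}
    for version, sql in MIGRATIONS:
        if version not in applied:
            conn.execute(sql)
            conn.execute(
                "INSERT INTO schema_migrations (version) VALUES (?)", (version,)
            )
    conn.commit()

conn = sqlite3.connect(":memory:")
migrate(conn)  # applies both migrations
migrate(conn)  # idempotent: already-applied migrations are skipped
```

The key design point is that a change applied this way is recorded and repeatable, which is the opposite of editing a production database directly and hoping a hotfix holds.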
It’s about ‘who’ and ‘what’ when it comes to cloud adoption
It’s understandable that banks have adopted cloud-based solutions to navigate COVID-19’s disruption. However, the act of adopting multiple solutions, or “cloud bandwagoning,” is bound to create friction within a bank’s systems, resulting in unpredictable consequences. Instead of adopting multiple solutions for their unique capabilities, it is more prudent for banks to select one cloud provider that holistically complements their needs and systems.
In addition, there are benefits to migrating a single application to the cloud as a test instead of moving all applications, or data, into a cloud-hosted database at once. Once complete, it is best to monitor how that application performs in the cloud-hosted database and determine whether the provider’s solution meets the bank’s requirements for agility, efficiency and scalability. If those requirements are met, the bank can then carefully and incrementally shift the remaining applications over.
The cloud is a boon, but beta-testing and proceeding with caution will allow banks to circumvent any future errors or headaches that stem from cloud migrations.