Where next for banks on their rock-strewn cloud journey?

The banking sector was not quick to take the cloud model to its heart. A combination of heavy investment in legacy systems and an in-built cultural reticence about letting go of on-premises control may have been to blame. But those early faltering phases of the cloud journey are now over, largely due to a change in risk appetite among CISOs.

The latest What’s Going On In Banking study from Cornerstone Advisors suggests that around two-thirds of US banks and credit unions now have applications running in the cloud. A comparable story is no doubt playing out in other parts of the world as financial services players look to modernise infrastructure and digitise processes to bring them into line with a more automated and on-demand global economy.

The problem has shifted from debating how and when the move should be made to assessing the actual, measurable value of what has been achieved so far. It is fair to say there is a degree of uncertainty here, perhaps amounting to outright scepticism, about the propaganda of the past few years claiming that all compute belongs on a public cloud platform. Cloud has been touted as the answer to every problem. There are many banking CIOs (and CFOs!) who would beg to differ.

We are now seeing a counter-offensive from the on-prem community seeking to capitalise on these doubts. Non-SaaS vendors are busily pointing to the uncomfortable truth that, in some circumstances, compute can cost more to run in the cloud than on your own servers. Indeed, one of the parallel strategies for most incumbent financial organisations has been to cut the cost of their on-premises environments, particularly the mainframe estate, through improved operational set-ups with virtualised workloads and the use of Linux. Combined with investment in the remaining on-premises infrastructure through upgrades, the growth of data mesh/fabric strategies and application modernisation, this has produced a significant shift in workload execution options, just as many of the internal business cases signed three years ago for cloud execution (cost or refactoring to cloud infrastructure) are failing to meet expectations.

The most rational approach, and one that is dawning on banks, is surely a hybrid mix of on-prem and cloud. Banks should stop listening to evangelists of a cloud-only world, and accept that such a model is dead in the water. It’s time instead to move on from the whole infrastructure debate and start thinking about what really matters: workflow execution, workloads and developing applications that deliver the goods for customers. 

Those banks that have seen the hybrid light are already enjoying a much more elastic and highly optimised compute capability. This elasticity is giving them the opportunity to accelerate new areas like digital twinning and next gen customer analytics. In short, it gives them choices based on efficiency, urgency or security in place of rigidity.

The hybrid business case is much more granular and needs-driven, and delivers a faster ROI. It gives you the controls to scale as you wish, bursting to meet sudden demand with additional cloud power while keeping certain essentials in-house. It certainly helps to rein in costs too: you're not writing a massive cheque to a hyperscale provider for the month when you forgot to turn a dormant service off.
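To make that concrete, here is a minimal sketch of the kind of placement policy a hybrid set-up implies: sensitive work stays in-house, everything else uses spare on-prem capacity first and bursts to the cloud only for the overflow. The workload names, capacity figures and cost rates are purely illustrative assumptions, not any particular bank's numbers.

```python
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    cpu_hours: float   # estimated compute demand
    sensitive: bool    # must stay on-prem (e.g. regulated data)

# Illustrative assumptions, not real figures.
ON_PREM_FREE_CPU_HOURS = 500   # spare on-prem capacity in this window
ON_PREM_RATE = 0.02            # assumed fully loaded cost per CPU-hour on-prem
CLOUD_RATE = 0.09              # assumed on-demand cloud cost per CPU-hour

def place(workload: Workload, free_on_prem: float) -> str:
    """Simple hybrid policy: keep sensitive work in-house and use spare
    on-prem capacity first; burst to the cloud only for the overflow."""
    if workload.sensitive or workload.cpu_hours <= free_on_prem:
        return "on-prem"
    return "cloud-burst"

if __name__ == "__main__":
    jobs = [
        Workload("overnight-credit-risk", cpu_hours=800, sensitive=False),
        Workload("pii-reconciliation", cpu_hours=120, sensitive=True),
        Workload("intraday-pricing", cpu_hours=60, sensitive=False),
    ]
    free = ON_PREM_FREE_CPU_HOURS
    for job in jobs:
        target = place(job, free)
        if target == "on-prem":
            free -= job.cpu_hours
        rate = ON_PREM_RATE if target == "on-prem" else CLOUD_RATE
        print(f"{job.name}: {target}, est. cost ${job.cpu_hours * rate:,.2f}")
```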

With added elasticity comes the chance to start looking to the future rather than just battling today's difficulties. A hybridised approach gives banks the flexibility, for example, to enter the decentralised finance (DeFi) market, with its low transaction values and high volumes. Cost controls really matter here, given the need to dial compute up or down according to the job in hand. What was once a threat from digital rivals becomes a playground of growth and opportunity.
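As a rough illustration of that dial-up, dial-down point, the sketch below sizes a worker pool to the incoming transaction rate, with an always-on on-prem floor and a capped cloud burst. The throughput and capacity figures are invented for the example.

```python
import math

# Illustrative capacity figures; real values depend on the workload.
ON_PREM_WORKERS = 20           # baseline always-on capacity
MAX_CLOUD_WORKERS = 200        # ceiling on the cloud burst
TXNS_PER_WORKER_PER_SEC = 50   # assumed per-worker throughput

def desired_workers(txns_per_sec: float) -> dict:
    """Size the pool to demand: fill on-prem capacity first,
    burst to the cloud for the overflow, capped at a ceiling."""
    needed = math.ceil(txns_per_sec / TXNS_PER_WORKER_PER_SEC)
    on_prem = min(needed, ON_PREM_WORKERS)
    cloud = min(max(needed - ON_PREM_WORKERS, 0), MAX_CLOUD_WORKERS)
    return {"on_prem": on_prem, "cloud": cloud}

if __name__ == "__main__":
    for rate in (200, 1_500, 12_000):   # quiet, busy and peak periods (txns/sec)
        print(rate, desired_workers(rate))
```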

But a hybrid approach to compute power will not unlock this sort of opportunity unless it is backed by the right underlying capabilities. Banks have been using high performance computing (HPC) in-house for decades for risk analysis, market positioning, pricing and customer engagement across various asset classes, from FX to equities. But traditional use of HPC is hitting performance ceilings in use cases such as intraday pricing and overnight credit risk, and the problem is only going to get worse. All the indicators are that HPC usage will grow over the next five years, driven by enterprise operational AI and the need to enhance the customer experience while improving fraud detection. Regulations like Basel IV (FRTB) also loom, with their inherent demands and increased frequency of credit, market and operational risk reporting.
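The kind of workload in question is easy to picture. The sketch below estimates a one-day 99% value-at-risk by Monte Carlo simulation, splitting the paths across worker processes in the way a compute grid farms tasks out to nodes. The portfolio figures are invented, and Python's process pool stands in for a real grid scheduler.

```python
import random
from concurrent.futures import ProcessPoolExecutor

# Illustrative single-position portfolio: the figures below are invented.
PORTFOLIO_VALUE = 10_000_000   # current market value
DAILY_MU = 0.0002              # assumed mean daily return
DAILY_SIGMA = 0.012            # assumed daily volatility

def simulate_losses(n_paths: int, seed: int) -> list[float]:
    """Worker task: simulate n_paths one-day P&L outcomes.
    On a real grid each call would be a task dispatched to a node,
    on-prem or cloud, by the scheduler."""
    rng = random.Random(seed)
    return [-PORTFOLIO_VALUE * rng.gauss(DAILY_MU, DAILY_SIGMA)
            for _ in range(n_paths)]

def value_at_risk(losses: list[float], confidence: float = 0.99) -> float:
    """99% one-day VaR: the loss exceeded in only 1% of simulated paths."""
    losses = sorted(losses)
    return losses[int(confidence * len(losses))]

if __name__ == "__main__":
    # Split 200,000 paths across 4 workers, as a grid would split tasks.
    with ProcessPoolExecutor(max_workers=4) as pool:
        chunks = pool.map(simulate_losses, [50_000] * 4, range(4))
        all_losses = [loss for chunk in chunks for loss in chunk]
    print(f"99% one-day VaR: ${value_at_risk(all_losses):,.0f}")
```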

Tomorrow's data access and data speed challenges within a hybridised environment require solutions to match; otherwise you risk simply replacing your legacy data silos with new digital ones. Banks looking to future-proof themselves need a similarly flexible architecture, delivered at a predictable cost and aligned with tomorrow's growth plans.

Many are now discovering that this can be achieved with the right grid platform, backed by enhanced grid analytics. Grid monitoring gives banks the power to execute their transformational strategies in a way that a dedicated 'cloud only' approach never could, and the analytical power they need to stay smart in a digitised world where data is king.
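By way of illustration, grid analytics can be as simple as summarising per-node utilisation to spot idle in-house capacity before any work is burst to the cloud. The node names and utilisation snapshot below are entirely made up.

```python
from statistics import mean

# Invented snapshot of per-node CPU utilisation (0.0-1.0) across a hybrid grid.
GRID_SNAPSHOT = {
    "onprem-node-01": 0.92,
    "onprem-node-02": 0.35,
    "onprem-node-03": 0.18,
    "cloud-node-01": 0.76,
    "cloud-node-02": 0.81,
}

def grid_report(snapshot: dict[str, float], idle_threshold: float = 0.40) -> dict:
    """Summarise utilisation and list under-used nodes that could absorb
    more work before a cloud burst is needed."""
    idle = [node for node, util in snapshot.items() if util < idle_threshold]
    cloud_nodes = sum(1 for node in snapshot if node.startswith("cloud"))
    return {
        "average_utilisation": round(mean(snapshot.values()), 2),
        "idle_nodes": idle,
        "cloud_share": round(cloud_nodes / len(snapshot), 2),
    }

if __name__ == "__main__":
    print(grid_report(GRID_SNAPSHOT))
```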
