Banking IT Innovation and Competition Mean “Good Enough” Is Not Enough for Modern Banking APIs
That open banking thing: done and dusted, right? You sorted it a couple of years ago, did all that API stuff just like the challenger banks do, and moved on.
Sorry to be the bearer of bad news, but that work isn’t complete yet. You may have had a good first stab at it, but going back and doing it properly is crucial, and once done it will set you up for more interesting technical work.
Open banking, which started, really, as a UK regulatory push (led by the Competition and Markets Authority) to promote more competition in financial services, has totally changed the banking market.
The UK’s nine largest banks and building societies are now making customer data available through open banking, and the whole idea of ‘open finance’ (extending open banking-like data sharing and third-party access to more financial sectors and products) is on the horizon.
Data on the Open Banking site shows how dynamic this fledgling market is. We’re already up to version 3.1.11 of the Standard, so it’s hard to argue with the claim that this UK-initiated standard has fired up a fintech renaissance in the UK, and inspired similar innovation all over the world!
The raw sex appeal of the API for the customer
According to the Open Banking Implementation Entity, use of APIs as the lingua franca of finance in the 2020s opens secure connections between apps and banks to enable innovation across financial services and e-commerce. This means that customers are starting to gain better control of their transaction data and benefit from low-friction, high-security financial interactions. An attractive combination.
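To make that less abstract, here’s a minimal sketch of what the third-party side of that conversation can look like: an account-information request in the general shape of the Open Banking Read/Write API. The bank URL and token are placeholders, and I’ve skipped the consent and OAuth2 steps that come first, so treat it as an illustration rather than a reference implementation.

```python
# Minimal sketch: a third-party provider reading account data over an open
# banking API. Assumes an access token already obtained via the standard
# consent/OAuth2 flow; the sandbox base URL below is purely illustrative.
import uuid
import requests

BASE_URL = "https://sandbox.examplebank.co.uk/open-banking/v3.1/aisp"  # illustrative
ACCESS_TOKEN = "<token-from-consent-flow>"  # placeholder


def list_accounts():
    """Fetch the accounts the customer has consented to share."""
    response = requests.get(
        f"{BASE_URL}/accounts",
        headers={
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            # Interaction ID lets bank and third party correlate this request in their logs.
            "x-fapi-interaction-id": str(uuid.uuid4()),
            "Accept": "application/json",
        },
        timeout=10,
    )
    response.raise_for_status()
    # The standard wraps results in a Data/Account envelope.
    return response.json()["Data"]["Account"]


if __name__ == "__main__":
    for account in list_accounts():
        print(account["AccountId"], account.get("Nickname", ""))
```

A few dozen lines like these are all it takes for a fintech to start building on top of a bank that exposes its data properly, which is exactly why the new entrants move so fast.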
But, as I said earlier, many incumbent banks feel they have already addressed this issue and that it doesn’t need revisiting (or any more budget spent on it).
I’ve noticed that none of the many challenger banks (or the newer payments and digital financial services firms picking up the open banking ball and running with it) are even contemplating backfill. For them, working in the cloud has been the norm since day one. Very few of them share the traditional banks’ problem of trying to coax more modernity out of legacy database and mainframe farms.
If you were around before the advent of the cloud, you had to make heritage systems somehow open to APIs. But the way most banks did it was the IT equivalent of sticking on a plaster: you kludged your way around what you had spent years building, namely a banking stack assembled as one monolithic piece.
All these workarounds meant that you ended up with problems. These included extracting data across existing data silos and integrating APIs with legacy and core systems, cited as big open banking problems by 65% and 60%, respectively, of companies recently surveyed by Omdia (formerly Ovum).
There is also a real danger of becoming irrelevant to SMEs, especially those run by more tech-savvy owners. For example, Mastercard tells us that 9 out of 10 SMEs accept some form of digital payment from their customers, and a similar percentage use open banking-backed methods to accept payments.
Ultimately, open banking will oblige you to break up that comfortable (but sadly dated) vertical, tightly coupled stack. Mostly because it was anti-competitive by design! You wanted to eat the other guys’ profit margins by keeping everything in-house and ready to take to market.
‘Sticking plaster approach’
Basically, you can have the best API sticking plaster in the world and bodge it so that enough of your interfaces are exposed to other parties to deliver more or less what the regulation obliges you to provide. But unless you deal with the underlying mess, you are never going to be as nimble as the new players in the market.
Don’t take the daunting prospect of a clean-up as an excuse to do nothing. True database modernisation is the way forward and no mountain is too high to climb. I know this first-hand because many key players in the City are contemplating (or in the process of) completely reengineering their data layer to ensure they are well-positioned for success in the years ahead.
Yes, it’s risky: there will be some rocking of the boat, and moving data from where it has lived for years is rarely straightforward. But there may be more risk in relying on a plaster than in rebuilding things correctly to stand the test of time. Even better, the nature of this new world is that you can do it incrementally and avoid the dreaded ‘simultaneous heart and lung bypass’ that deterred previous generations from attempting back-office bank re-engineering. That gives you a compelling reason to do it, a less painful process for doing it, and a way to convince an increasingly sceptical younger generation that you are serious about giving them decent service.
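If ‘incrementally’ sounds hand-wavy, here’s a deliberately simplified sketch of one way to stage the cutover: customers whose data has already been moved are served from the new data layer, while everyone else stays on the legacy store until their tranche migrates. Every name in it is a hypothetical stand-in, not a recipe.

```python
# Illustrative sketch only: an incremental, per-customer cutover from a legacy
# store to a new data layer. All class and store names are hypothetical.

class AccountRepository:
    """Routes reads to the new data layer for customers already migrated,
    and to the legacy store for everyone else."""

    def __init__(self, legacy_store, modern_store, migrated_customers):
        self.legacy = legacy_store          # existing mainframe/RDBMS facade
        self.modern = modern_store          # new cloud-native SQL layer
        self.migrated = migrated_customers  # customer IDs already moved

    def get_accounts(self, customer_id):
        source = self.modern if customer_id in self.migrated else self.legacy
        return source.get_accounts(customer_id)


# Toy in-memory stand-ins so the routing can be exercised end to end.
class DictStore:
    def __init__(self, data):
        self.data = data

    def get_accounts(self, customer_id):
        return self.data.get(customer_id, [])


legacy = DictStore({"cust-1": ["legacy-acct-1"], "cust-2": ["legacy-acct-2"]})
modern = DictStore({"cust-2": ["migrated-acct-2"]})
repo = AccountRepository(legacy, modern, migrated_customers={"cust-2"})

print(repo.get_accounts("cust-1"))  # still served from the legacy store
print(repo.get_accounts("cust-2"))  # already served from the new data layer
```

Migrate a tranche, add those customers to the routed set, watch it in production, then move on to the next tranche: no heart and lung bypass required.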
More and more banking CIOs tell me that their current data layer and database are struggling. In some cases, they’re actually coming apart at the seams, and they are certainly not fit for the future.
Overdue infrastructure modernisation happened 15 years ago with the move from dedicated on-premises machines to the cloud. Application modernisation in banking happened five years ago with the rise of containers, Kubernetes and Docker. It’s now time for the data layer to be updated and enhanced.
I am convinced that open banking will be the catalyst for data layer change. Cloud-native distributed SQL is an ideal database technology choice for companies that want to update and future-proof their data layer, as users of YugabyteDB are finding.
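Part of what makes that switch less scary than it sounds is compatibility: YugabyteDB’s YSQL layer speaks the PostgreSQL wire protocol, so an existing application can talk to it with a standard PostgreSQL driver. The minimal sketch below shows just that; the host, credentials and query are placeholders for illustration.

```python
# Minimal sketch: connecting to YugabyteDB with a standard PostgreSQL driver,
# since the YSQL API is PostgreSQL wire-compatible. Connection details are
# placeholders, not a real deployment.
import psycopg2

conn = psycopg2.connect(
    host="yb-cluster.example.internal",  # placeholder address of a YugabyteDB node
    port=5433,                           # YSQL's default port
    dbname="yugabyte",
    user="yugabyte",
    password="<password>",
)

with conn, conn.cursor() as cur:
    cur.execute("SELECT version();")  # returns a PostgreSQL-compatible version string
    print(cur.fetchone()[0])
```

In other words, the application-facing surface stays familiar SQL while the distribution, resilience and scale-out happen underneath, which is what makes the incremental approach above realistic.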
Could you avoid doing this extra work? If you feel you’ve “done enough” with APIs for now, fair enough. But you know in your heart of hearts that what you have is less agile, less flexible, and definitely less price-performant than a data layer engineered to make the most of the cloud.
And so, I fear, do your customers and partners. That means it’s time to start seeing APIs less as a chore and more as a genuinely useful, positive step towards data layer modernisation and business success.