Blog

Accelerate Open Banking Innovation with These 7 Data Capabilities

The inability to automate data delivery slows time to market for open APIs and new banking features at over half of Europe’s banks, according to new research.

Sixty years before Elon Musk launched a Tesla Roadster into space, Luna 1 became the first spacecraft to achieve “escape velocity.” Until then, the enormous amount of fuel needed to break away from Earth’s gravitational pull was the biggest impediment to space travel. All that fuel added so much mass that the rockets needed even more fuel to lift off, compounding their weight problem. The data stockpiled in banking institutions is a lot like rocket fuel: enormous amounts of sensitive customer and transaction data locked in legacy systems are essential to compete and innovate, but they weigh banks down.

With the adoption of new digital habits, consumers expect greater convenience, choice, and flexibility from their banking institutions. In the first six months of 2020, the number of UK users of open banking–enabled apps or products doubled, and by February 2021 it had grown to over three million. To maximize open banking’s value and benefits, banks need to quickly weave open APIs into their business models and deliver innovative software faster, yet they are struggling to speed up. Meanwhile, a constant stream of new data exacerbates the privacy, productivity, and latency risks already slowing them down.

One Delphix customer has 30,000 applications dispersed across a vast multi-generational estate of modern apps cohabiting with mainframe and client-server systems. That’s the scale and complexity banks must deal with as rapid change, fueled by market forces and government regulation, unlocks a wave of innovation. They must harness all that data to compete in the rapidly evolving global financial services ecosystem. At the same time, Amazon, Apple, and Google have set a new pace of innovation and provide technology platforms for nimble fintechs and startups to follow suit.

So What Exactly Causes These Hold-Ups for Dev/Test Teams?

Sixty percent of IT leaders say data protection. A new Pulse study found that the biggest challenge preventing banks from opening their data to third parties and meeting the new regulations is the time and effort required to maintain and preserve data integrity.

[Infographic: Delphix open banking research, figure 1]

With 90% of the data risk surface in pre-production, it’s all too easy for an insecure dataset to slip out, so data must be made safe before it is copied to non-production environments. However, InfoSec teams lack visibility into data dispersed across a vast multi-generational architecture and struggle to mask it, so they limit access to production data, curtailing innovation.
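To illustrate the principle (this is a generic sketch, not Delphix’s masking engine; the field names and hashing scheme are hypothetical), irreversible masking replaces sensitive values before a dataset ever leaves production, while leaving non-sensitive fields intact:

```python
import hashlib

# Hypothetical customer record; field names are illustrative only.
PII_FIELDS = {"name", "iban", "email"}

def mask_value(value: str, salt: str = "per-env-secret") -> str:
    """Irreversibly mask a value: a one-way hash keeps referential
    integrity (same input -> same token) without exposing the original."""
    digest = hashlib.sha256((salt + value).encode()).hexdigest()
    return f"MASKED-{digest[:12]}"

def mask_record(record: dict) -> dict:
    """Return a copy of the record that is safe for non-production use."""
    return {k: mask_value(v) if k in PII_FIELDS else v
            for k, v in record.items()}

prod_row = {"name": "Alice Smith", "iban": "DE89370400440532013000",
            "balance": 1042.17}
safe_row = mask_record(prod_row)
assert safe_row["balance"] == prod_row["balance"]  # non-sensitive data untouched
assert safe_row["iban"] != prod_row["iban"]        # PII replaced irreversibly
```

Because the masking is deterministic, the same customer masks to the same token across tables, so joins in downstream test environments still work.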

The research also found that 29% of developers and testers rely on Ops teams to manually provision and refresh data environments—which takes an average of 4.5 days but can stretch to months. 

[Infographic: Delphix open banking research, figure 2]

Back when waterfall development projects were the norm, access to a handful of test environments refreshed once a quarter was acceptable. But today, manual processes to protect, secure, provision, and refresh data environments severely impact delivery timelines and productivity. The result is a bottleneck that derails open API development and testing, along with a raft of other projects that are essential if banks are to compete on an even footing with new market entrants in a rapidly changing market.

All of that is compounded by the 42% of dev/test teams that have little option but to use synthetic or subsetted data to plug data availability gaps. Such workarounds introduce hidden dangers, including quality and stability risks from stale data and the potential to severely limit test coverage. Because open APIs depend on assured interoperability between the composite applications and microservices that underpin complex processes and hand-offs between service providers, these risks are especially problematic.
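A toy example (the tables and keys are hypothetical) shows how naive subsetting silently breaks referential integrity, hiding exactly the joins an open API integration test would need to exercise:

```python
# Hypothetical tables; column names are illustrative only.
customers = [{"id": 1}, {"id": 2}, {"id": 3}]
transactions = [
    {"tx": "t1", "customer_id": 1},
    {"tx": "t2", "customer_id": 3},
]

# Naive subset: keep the first two customers without following foreign keys.
subset_ids = {c["id"] for c in customers[:2]}

# Any transaction pointing at a dropped customer is now an orphan:
# tests that join transactions to customers will miss it entirely.
orphans = [t for t in transactions if t["customer_id"] not in subset_ids]
assert orphans == [{"tx": "t2", "customer_id": 3}]
```

A production-grade subsetter would have to walk every foreign-key chain before dropping rows, which is precisely the effort that makes these workarounds costly.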

Data Capabilities to Accelerate Open Banking Innovation

Here are 7 essential capabilities required in a modern enterprise data platform to accelerate API development and testing. 

  1. Ingest data from all apps. Sync data from source applications and databases, capturing a granular, continuous record of data changes from mainframe to cloud-native and everything in between. 
  2. Automate complex data operations. Manage changes to data copies over time and take advantage of API-driven data operations such as bookmarking data copies, updating virtual copies to the latest data, reverting to earlier versions, and splitting copies for sharing.
  3. Assure data compliance. Find and protect sensitive data and personally identifiable information. Change it irreversibly and deliver realistic data to downstream environments to comply with data privacy laws such as GDPR and CCPA and protect financial data (PCI).
  4. Achieve near-zero data refresh times. Deliver production-like environments faster, slash processing overheads, maximize test time, and improve quality with self-service, on-demand access to integrated, data-ready environments.
  5. Data immutability. Maintain an immutable record across source applications and virtual data environments, so users can provision precise, granular data for various use cases by “time travelling” data to a specific point in history.
  6. Data versioning. Create multiple, independent versions and copies of data, then easily change, revert, and update the versions in use. Radically improve destructive testing, alpha/beta experiments, parallel application development, and data sharing, without impacting other pre-prod environment users.
  7. Space efficiency. Deliver data-ready environments faster and more efficiently than traditional means. Fully read-writable virtual data copies, which dev/test teams and data scientists can use just like regular physical copies, consume a fraction of the time and storage and arrive in minutes, not months.
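Several of the capabilities above (bookmarking, immutability, versioning) can be pictured with a toy snapshot model. This is an illustrative sketch only; the class and method names are hypothetical, not a Delphix API, and real platforms use copy-on-write storage blocks rather than deep copies:

```python
import copy

class VirtualDataCopy:
    """Toy model of a bookmarkable, revertible data copy."""

    def __init__(self, data: dict):
        self.data = data
        self._bookmarks: dict = {}

    def bookmark(self, name: str) -> None:
        """Capture the current state under a named bookmark."""
        self._bookmarks[name] = copy.deepcopy(self.data)

    def revert(self, name: str) -> None:
        """Roll the copy back to a bookmarked state."""
        self.data = copy.deepcopy(self._bookmarks[name])

    def branch(self) -> "VirtualDataCopy":
        """Split off an independent copy for another user."""
        return VirtualDataCopy(copy.deepcopy(self.data))

env = VirtualDataCopy({"accounts": {"A1": 100}})
env.bookmark("before-destructive-test")
env.data["accounts"]["A1"] = -999        # destructive test corrupts the data
env.revert("before-destructive-test")    # near-instant rollback, no Ops ticket
assert env.data["accounts"]["A1"] == 100
```

The `branch` method hints at why this matters for parallel development: each tester gets an independent copy whose changes never leak into anyone else’s environment.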

Bringing Data Agility to Accelerate Open Banking Initiatives

Combining data compliance with on-demand delivery helps leading banks like BNP Paribas stay ahead. BNP Paribas CIO Bernard Gavgani runs one of the world’s biggest tech organizations.

He believes data is part of a company’s DNA and focuses on using it innovatively to increase productivity and performance. At the centre of his operation, one team rationalizes and secures retail and corporate payments data to keep BNPP’s instant payment innovation program and rollout on track. The combination of data compliance and on-demand delivery has improved software quality, reduced downtime, and slashed the time to launch its open API marketplace.

BNP Paribas assured data compliance across all pre-production environments and radically accelerated the delivery of those environments, enabling development and testing teams across the globe to triple the number of AI projects going into production and accelerate cloud adoption.

“Our objective is to facilitate the use of data to increase productivity and performance,” Bernard Gavgani, Global CIO, BNPP said. “BNP Paribas is accelerating its digital transformation journey to build the European bank of reference.” 

To learn more, dive into the full research report, “The Future of Banking is Open and Regulated, but Few are Prepared.” 

Suggested reading

- The Biggest Blocker to Open Banking Success? Slow, Risky Data: new Pulse Q&A research shows less than 5% of European banks are fully prepared for open banking.
- Inside BNP Paribas’ Digital Banking Innovation: Cloud, Data, AI, featuring BNP Paribas Global CIO Bernard Gavgani.
- Make Data Compliance Easier Across Your Enterprise: Delphix’s new extensible masking connectors and algorithms can transform data for more data sources, including MongoDB Atlas, Salesforce, HANA, and Snowflake.