
Driving at the limits – pristine data for compliance reporting

In the last blog we discussed how to deliver complete and current data to reporting applications to establish a faster compliance journey. However, obtaining data is only part of the problem; maintaining data quality is just as important. Compliance deadlines are often very short, and more than ever banks are being asked to run ad hoc reports by regulators such as the ECB or FCA. This can force a trade-off between delivery times and the quality of reporting data.

This blog discusses how Data Virtualization's ability to refresh and reset datasets at will prevents data from becoming stale, which compromises reporting quality and risks the bank missing regulatory deadlines.

Data degradation

As discussed in previous blogs, Data Virtualization can provision data from various, disparate systems in minutes. This reduces the time and cost of building environments for reporting applications and ultimately enables more testing and reporting.

Although some banks can now access the full range of data needed for testing, they still need to keep that data consistent. Even a small change in a reporting application can take weeks to complete, and even good-quality data naturally decays over time: testing manipulates and changes the data, so it must be reset before the same tests can be repeated.

Because data refreshes and resets take so long, banks are forced to press on while data quality gradually deteriorates throughout the process. The longer the process runs, the further the data drifts out of sync with production and the more errors creep in.

The answer for many banks is to throw more resources at the problem, whether people or infrastructure. Yet more often than not, IT teams are being asked to do extra testing with fewer resources; one in three European banks has told Delphix it faces exactly this squeeze during its compliance journey.

The Need for Self-Service

To overcome these problems, data agility needs to be more than an operational function. Developers, testers and analysts need self-service capabilities so they can refresh and reset data at will. Data Virtualization does exactly this, giving testers and analysts their own individual copies of real-time and archived data, which they can refresh or rewind at the click of a button, as sketched below.
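
To make the idea concrete, here is a minimal sketch of what self-service refresh and rewind might look like from a tester's point of view. The class, method names and dataset name below are hypothetical illustrations of the concept, not Delphix's actual API.

```python
# Hypothetical self-service data operations (illustrative only; not the Delphix API).
# Each tester owns a lightweight virtual copy that can be re-synced with
# production or rolled back to an earlier point in time on demand.

from dataclasses import dataclass, field
from datetime import datetime


@dataclass
class VirtualCopy:
    """A tester's private virtual copy of a source dataset."""
    source: str
    snapshots: list = field(default_factory=list)

    def refresh(self):
        """Re-sync the virtual copy with the latest production state."""
        self.snapshots.append(datetime.now())
        print(f"{self.source}: refreshed at {self.snapshots[-1]:%Y-%m-%d %H:%M}")

    def rewind(self, steps: int = 1):
        """Discard recent test changes by rolling back to an earlier snapshot."""
        if len(self.snapshots) > steps:
            self.snapshots = self.snapshots[:-steps]
        print(f"{self.source}: rewound to {self.snapshots[-1]:%Y-%m-%d %H:%M}")


# A tester refreshes before a destructive test run, then rewinds to repeat it.
copy = VirtualCopy(source="trade_reporting_db")
copy.refresh()
# ... run tests that mutate the data ...
copy.refresh()
copy.rewind()
```

The point of the sketch is the workflow, not the mechanics: the reset happens in the tester's own copy, at their own initiative, without a ticket to an operations team.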

Because Data Virtualization doesn't rely on physical copies or consume additional staff time, it reduces IT spend while speeding up the compliance journey. Advanced features such as the ability to bookmark and branch data give testers fine-grained control, helping them identify errors faster and deliver quality reporting.
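
As a rough illustration of bookmarking and branching, the workflow might look like the following sketch. Again, the class, methods and the "emir-edge" scenario are hypothetical, invented for this example rather than taken from any product.

```python
# Hypothetical bookmark-and-branch workflow (illustrative only).
# A bookmark labels a known-good data state; a branch lets a tester
# investigate an error scenario without disturbing the main test line.

from copy import deepcopy


class DataBranch:
    """A named line of data states with labeled bookmarks."""

    def __init__(self, name, state=None):
        self.name = name
        self.state = state or {}
        self.bookmarks = {}

    def bookmark(self, label):
        """Label the current state so it can be restored later."""
        self.bookmarks[label] = deepcopy(self.state)

    def restore(self, label):
        """Return to a bookmarked state after a failed or destructive test."""
        self.state = deepcopy(self.bookmarks[label])

    def branch(self, name):
        """Fork an independent copy for a parallel investigation."""
        return DataBranch(name, deepcopy(self.state))


main = DataBranch("main", {"trades": 1_000_000, "validated": True})
main.bookmark("pre-release")        # known-good state before a test cycle
probe = main.branch("emir-edge")    # isolate an edge-case investigation
probe.state["validated"] = False    # mutate freely without touching main
main.restore("pre-release")         # reset main instantly for the next run
```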

This is the final blog in the series. To read more about how a large European bank handled the testing requirements of the Dodd-Frank Act and the European Market Infrastructure Regulation (EMIR), click here to download the eBook.