Legacy approaches to test data management can be a huge blocker for modern QA practices that promise higher quality and faster releases. Hear what testers have to say about why test data is holding QA teams back.
If you hoard test environments, you’re not alone—we see it all the time, and we get it.
It happens for several reasons. When testing is destructive and changes the data, QA typically keeps multiple environments on hand to test different configurations and database versions, so everything can be restored to its previous state for another run. It’s also tempting to keep an old environment around for nearly every testing occasion, just to squeeze test runs into a compressed project cycle.
With so many parallel development cycles, test environments quickly stack up with old data and divergent schema versions, eating up storage budgets with redundant, inefficient database copies (we’re talking 10X+ data duplication in many cases).
Environment proliferation not only drains QA productivity; it also heightens the risk of a data breach as old environments and databases linger. On top of that, the data many teams test with is often subpar: either not realistic enough to reflect real customer usage, or missing entire fields altogether.
QA teams also hesitate to refresh their environments with the latest data because reprovisioning typically takes days or weeks. The delays generally stem from: 1) reluctance to impact production; 2) reliance on other teams through a slow request-and-fulfill process; and/or 3) the mechanics of copying, masking, and importing data.
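To make the "masking" step above concrete, here is a minimal sketch of masking sensitive fields before data is imported into a test environment. The field names and hashing approach are assumptions for illustration, not any particular tool's method; the key idea is that deterministic masking keeps the same source value mapping to the same token, which preserves referential integrity across masked copies.

```python
import hashlib

# Fields assumed sensitive for this sketch (hypothetical list).
PII_FIELDS = {"name", "email", "ssn"}

def mask_record(record: dict) -> dict:
    """Return a copy of the record with PII fields replaced by tokens."""
    masked = {}
    for field, value in record.items():
        if field in PII_FIELDS:
            # Deterministic hash: the same source value always masks to
            # the same token, so joins across tables still line up.
            digest = hashlib.sha256(str(value).encode()).hexdigest()[:12]
            masked[field] = f"{field}_{digest}"
        else:
            masked[field] = value
    return masked
```

In a real pipeline this step would sit between the copy and the import, so unmasked production data never lands in a non-production environment.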
Long refresh times also reduce QA’s relevance in tight-turnaround situations. Customer hotfixes, for instance, are often delivered with minimal testing because the delivery window is narrow (sometimes less than 24 hours) and involves coordinating multiple teams, applications, and release versions.
But the result is always the same—old data compromises testing quality.
This is why it’s more important than ever for testing teams to take a smarter approach to test data management (TDM) using techniques, such as data virtualization and a self-service portal, to refresh data faster, cut storage costs and manage data environments more easily.
For example, a leading UK-based retail bank wanted to accelerate its pace of innovation. It was the first in its industry to let customers open a paperless account in 20 minutes. To sustain that momentum and support its growth plans, the bank looked to shorten development times and deliver new features and improvements continuously.
However, data provisioning created a significant bottleneck: its testing and development teams ran more than 30 projects in parallel across more than 100 environments. Each environment required multiple data refreshes, so DevTest teams spent a significant amount of time provisioning data.
By deploying an innovative platform that aligns data management to modern DevOps and cloud infrastructure tooling, the bank was able to dramatically accelerate project delivery, cutting data provisioning times by 80 percent. Dev and testing teams were able to branch and version data and deliver data to teams in a fraction of the time.
What’s the takeaway? QA needs the ability to provision the environments they need, on demand, in minutes. A self-service portal can deliver fresh data across multiple environments and provision synchronized copies to non-production environments, automating the whole process from database initialization to configuration and validation, so everything can be set up in minutes.
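The initialize-configure-validate flow above can be sketched as a simple orchestration loop. The step bodies here are placeholders, assumptions standing in for whatever snapshot, masking, and smoke-test tooling a team actually uses; the point is that the whole refresh becomes one repeatable, self-service operation rather than a multi-team request queue.

```python
from datetime import datetime, timezone

def refresh_environment(env_name: str, source_snapshot: str) -> dict:
    """Sketch of one automated, self-service environment refresh."""
    steps = []
    # 1. Initialize: provision a virtual copy from a shared snapshot
    #    instead of a full physical restore (this is what keeps it to minutes).
    steps.append(f"clone {source_snapshot} -> {env_name}")
    # 2. Configure: apply environment-specific settings.
    steps.append(f"configure {env_name}")
    # 3. Validate: run a smoke check before handing the environment to QA.
    steps.append(f"validate {env_name}")
    return {
        "env": env_name,
        "steps": steps,
        "refreshed_at": datetime.now(timezone.utc).isoformat(),
    }

# Provision synchronized copies across several QA environments in one pass.
snapshot = "prod-snapshot-latest"  # hypothetical snapshot label
results = [refresh_environment(env, snapshot)
           for env in ("qa-1", "qa-2", "perf")]
```

Because each refresh is just a function call, a self-service portal can expose it as a button or an API endpoint, letting QA pull fresh data without filing a ticket.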
Read more about how the Delphix Data Platform can make data fast and secure for access across your organization.