IT teams struggling to access high-quality data, time-consuming data masking: managing test data through traditional, manual approaches often raises numerous challenges. In this blog, we explain how to build a comprehensive DevOps test data management (TDM) strategy to overcome these obstacles and transform application delivery.
The financial services and insurance (FSI) sector has seen a slew of changes in recent years.
Changing customer expectations are being met by new FinTech market entrants creating fierce competition. Meanwhile, traditional and heritage firms are hamstrung by complicated legacy systems – and data that’s absolutely mission critical – racking up ever-greater maintenance costs. Compliance risks, data challenges and speed-to-market constraints are hindering their move to becoming truly digital-first businesses.
A key bottleneck to innovation for everyone, however, is getting test data to development teams when they need it. As the pace of software development increases, and as automation becomes widespread in DevOps, a comprehensive DevOps test data management (TDM) strategy has become critical.
Test data management isn’t new. Large organisations have already centralised their test data functions using some form of internal process. The efficiencies achieved, however, have given way to a whole new set of challenges created by ever-faster, iterative release cycles.
Firstly, setting up data-ready testing environments using legacy approaches is a slow, manual and high-touch process. It can take days or weeks to provision test data, partly because most organisations operate queue-based request systems.
In legacy setups, software development teams also often don’t have access to good quality test data anyway. All too often, they have to work with stale copies of production data due to the complexity of setting up a new test bed. The result is lost productivity due to time spent resolving data-related issues such as accuracy, compliance and lack of visibility.
Moreover, to enable continuous testing cycles, companies need to distribute the data to their teams effectively and efficiently. So there needs to be a way of continually provisioning that test data effectively, such as via API.
Also, processes for securing test data and enabling compliance traditionally add major operational overheads. This is essential if your data contains financial or credit card information. Yet an end-to-end masking process using homegrown scripts or legacy data masking tools may take an entire week, which can prolong test cycles and delay time to market.
Finally, test data storage and cloud service provider costs are rising continually. IT organisations almost always create multiple, redundant copies of test data, resulting in poor use of infrastructure. The very act of standing up dedicated test systems that will not be highly utilised is costly and inefficient. Operations teams often have to prioritise who gets access to what and when, resulting in further delays to critical projects.
IT organisations therefore need to adopt a DevOps TDM approach to make the right test data available to their project teams. That’s the only way of ensuring they will be able to test at anything like the speed of development. A comprehensive approach should seek to improve TDM in each of the following areas:
Test data speed: reducing the time to provision test data.
Data quality: ensuring developers get high-fidelity test data.
Security and compliance: maximising security without compromising agility.
Sustainable business value: reducing the costs of storing and archiving test data.
FSI companies need to streamline the process of making a copy of production data available to downstream testing environments. They need to create a path towards fast, repeatable data delivery and optimal digital experiences, featuring:
Automation – making it effortless to deliver copies of test data and providing a low-touch approach to standing up new test data environments.
Toolset integration – to improve automation and reduce handoffs between teams with key DevOps tools such as masking, subsetting, synthetic data creation and more.
API-first approaches – giving developers interfaces that are purpose-built for their needs, enabling efficient delivery, effective control, and no need for queueing.
Ease of use – from a well-orchestrated approach, eliminating wait times, enabling earlier testing in the software development lifecycle (SDLC), and so reducing costs.
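To make the API-first idea above concrete, here is a minimal sketch of how a developer's CI pipeline might request a fresh, masked dataset programmatically instead of filing a ticket. The endpoint URL, payload fields and workflow are all hypothetical illustrations, not the API of any specific TDM product:

```python
import json

# Hypothetical endpoint -- a real TDM platform exposes its own API;
# this only illustrates the API-first, queue-free provisioning pattern.
TDM_API = "https://tdm.example.com/api/v1/datasets"


def build_provision_request(source: str, masked: bool, point_in_time: str) -> str:
    """Build the JSON body a pipeline would POST to request a test dataset.

    All field names here are illustrative assumptions.
    """
    return json.dumps({
        "source": source,               # which production dataset to clone
        "masked": masked,               # require masking before delivery
        "pointInTime": point_in_time,   # snapshot to provision from
    })


# A CI job would send: POST {TDM_API} with this body, then poll until the
# environment is ready -- no waiting in a request queue.
body = build_provision_request("crm-prod", True, "2024-01-15T00:00:00Z")
```

Because the request is just code, it can run on every pipeline trigger, which is what makes continuous, self-service provisioning possible.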
Development teams need access to high-quality data that meets their needs across three dimensions:
Data age – so that data doesn’t become stale and cause costly errors, the latest production data needs to be readily available in minutes.
Data accuracy – to enable multiple datasets to be provisioned to the same point in time and simultaneously reset to quickly validate complicated testing scenarios.
Data completeness – cutting operational costs by eliminating the practice of data subsetting by provisioning fully complete test data sets that take up a fraction of the space.
Masking tools irreversibly replace sensitive data with fictitious yet realistic values. Data masking can therefore all but eliminate the risk of a data breach in test environments, supporting regulatory compliance. It is now the de facto standard for protecting test data, enabling developers to focus on accelerating delivery, and should offer:
End-to-end repeatability – to automate and speed up the masking process – identifying sensitive data, applying masking to that data, and then auditing the resulting test dataset.
No need for development expertise – with modern, lightweight masking tools that can be set up using a low/no-code approach without scripting or specialised development expertise.
Integrated masking and distribution – where the process is tightly coupled with a data delivery mechanism to help ensure the masked data can be delivered wherever it’s needed.
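As a rough illustration of the irreversible-yet-consistent masking described above, the sketch below deterministically maps a real name to a fictitious one via a one-way hash. The replacement pool and salt are stand-ins; a production masking tool ships large dictionaries of realistic values and far richer rules:

```python
import hashlib

# Illustrative pool of fictitious values -- real tools use large,
# domain-specific dictionaries (names, addresses, card numbers, etc.).
FAKE_NAMES = ["Alice Martin", "Bob Okafor", "Carol Diaz", "Dan Novak"]

# A secret salt keeps the mapping non-guessable; it is not stored
# alongside the masked data, so the mapping cannot be reversed.
SALT = b"rotate-me-per-masking-run"


def mask_name(real_name: str) -> str:
    """Replace a real name with a fictitious one, deterministically.

    The same input always yields the same output, which preserves
    referential integrity across tables, while the one-way hash means
    the original value cannot be recovered from the result.
    """
    digest = hashlib.sha256(SALT + real_name.encode("utf-8")).digest()
    index = int.from_bytes(digest[:4], "big") % len(FAKE_NAMES)
    return FAKE_NAMES[index]


# Every occurrence of "Jane Smith" masks to the same fictitious name,
# so joins between masked tables still line up.
rows = [{"customer": "Jane Smith"}, {"customer": "Jane Smith"}]
masked = [{"customer": mask_name(r["customer"])} for r in rows]
```

Deterministic substitution is one common masking strategy among several (others include format-preserving tokenisation and synthetic data generation); the key property shared by all is that the transformation cannot be undone from the test data alone.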
A DevOps TDM platform can enable organisations to make the most efficient use of infrastructure resources, meeting the following criteria:
Data consolidation – to reduce redundant data and curb storage costs associated with legacy approaches.
Data archiving – to optimise storage use and enable fast retrieval, so libraries of test data can be made easily available.
Environment utilisation – to maximise returns on investment on IT through improved timesharing and the ability to leverage ephemeral test data with fast spin up and spin down.
Reading through all these capabilities might seem daunting, and you may be wondering where to start. Yet you can easily gain access to a comprehensive test data management platform that provides everything described here.
Delphix is the industry leader for DevOps test data management.
Businesses need to transform application delivery but struggle to balance speed with data security and compliance. Our DevOps Data Platform automates data security while rapidly deploying test data to accelerate application releases. With Delphix, customers modernize applications, adopt multi-cloud, achieve CI/CD, and recover from downtime events such as ransomware up to 2x faster.
Leading companies, including Choice Hotels, Banco Carrefour, and Fannie Mae, use Delphix to accelerate digital transformation and enable zero trust data management. Visit us at www.delphix.com. Follow us on LinkedIn, Twitter, and Facebook.