Platform

Unleashing the Flow of Data to Drive Digital Transformation

Do you have the agility your business needs to thrive in a data-driven economy? Learn the 3 common challenges IT teams need to overcome to accelerate the flow of their data.

Alberto Sigismondi

Jan 14, 2019

One hot topic that every IT organization has on its radar is business agility. In fact, it’s one of the most important discussion points when I engage with our customers. IT organizations are dealing with a significant number of roadblocks when it comes to aligning their strategic initiatives to support business objectives. Many of these challenges typically have to do with what I define as the lack of data agility, which is an organization’s ability to efficiently provide the right data to the right people, when and where they need it, all in a fast and secure way.

For me, data agility for the business is what water is for humans - without water (data), we (organizations) wouldn’t exist. With slow and inefficient access to water (data), we’d be dehydrated, slow and weakened. Getting the most out of data begins with analyzing and eliminating the roadblocks between the data source and its consumers. That sounds easy, but in practice it rarely is.

While this article doesn’t address every data agility challenge, here are 3 common data-related issues that lie at the root of bigger problems down the road.

1. Inefficient processes when provisioning data to non-prod

No matter what software development methodology you use today, your primary applications need to be developed and tested to ensure quality and customer satisfaction. Test data should flow without friction to the Dev teams as needed in a secure environment that never compromises the organization or its customers. This should be a smooth process, but oftentimes, it isn’t. Here’s why.

Most organizations spend a significant amount of time and resources (both human and capital) to move a single database (DB) from production to non-production, and as the number and size of the databases grow, the task becomes much more difficult. For larger organizations, moving data from production to multiple non-prod environments can be significantly harder because of complex network topologies and isolated non-prod environments on-premises, in the cloud or in hybrid IT architectures.

In the worst cases, a developer has to wait weeks before he or she gets a DB refreshed, and the entire process from request to delivery touches resources from DBAs (copying and masking data), backup admins (backups and restores), storage admins (snapshotting, cloning, provisioning), network admins as well as operations personnel (approvals, process orchestration). This process is extremely inefficient and costly, preventing the entire organization from keeping up with market demands.

Not to mention that highly paid, skilled talent is forced to spend valuable time managing data provisioning requests rather than providing real value. The worst-case scenario is losing those talented employees and replacing them with lower-skilled personnel to carry out the same inefficient processes, which amounts to a decrease in the value of human capital.

But it doesn't have to be this way. Taking a platform-based approach that integrates with your existing DevOps tools and workflows can make it possible to stand up a complete development environment in minutes and automate testing as part of a CI/CD pipeline, ultimately enabling you to deliver software applications faster.
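As a rough illustration, here is a minimal sketch of what that integration could look like: a CI job that requests a masked virtual copy of a production database before the test stage runs. The endpoint, payload fields and job-polling flow are hypothetical placeholders, not any specific vendor's API; substitute the calls your own data platform actually exposes.

```python
# Minimal sketch: provision a test database from a data platform inside a CI job.
# The URL, token, payload fields and job states below are assumptions for
# illustration only, not a real vendor API.
import time
import requests

PLATFORM_URL = "https://data-platform.example.com/api"  # hypothetical endpoint
HEADERS = {"Authorization": "Bearer <api-token>"}       # supplied by the CI secret store

def provision_test_db(source_db: str, env: str) -> str:
    """Request a masked, virtual copy of a production DB for a test environment."""
    resp = requests.post(
        f"{PLATFORM_URL}/provision",
        headers=HEADERS,
        json={"source": source_db, "target_env": env, "masked": True},
        timeout=30,
    )
    resp.raise_for_status()
    job_id = resp.json()["job_id"]

    # Poll until the copy is ready, then hand its connection string to the test suite.
    while True:
        status = requests.get(f"{PLATFORM_URL}/jobs/{job_id}", headers=HEADERS, timeout=30).json()
        if status["state"] == "COMPLETED":
            return status["connection_string"]
        if status["state"] == "FAILED":
            raise RuntimeError(f"Provisioning failed: {status.get('error')}")
        time.sleep(10)

if __name__ == "__main__":
    conn = provision_test_db("orders_prod", "ci-integration")
    print(f"Test database ready at {conn}")  # passed on to the CI test stage
```

The point of the sketch is the shape of the workflow, not the specific calls: the request for data becomes one automated step in the pipeline instead of a ticket that passes through several teams.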

With data analytics and now machine learning applications thirsty for more data to uncover new business opportunities and improve customer satisfaction, IT shops will generate a significant amount of additional data in the form of copies distributed across business functions. That isn't necessarily a problem from a capital expenditure perspective, but it can become one if organizations aren't smart and agile enough to store this data and feed these systems at the speed they demand.

One way to solve for this is through data virtualization. When highly compressed, virtualized data copies can be produced from any data source (databases or file data) and made instantly available to applications or end-users, storage efficiency goes up while overall cost goes down. Update streams from source systems, continuously captured and stored by the platform, keep virtual data copies in sync at all times, minimizing manual intervention and allowing end-users to focus on what really matters.
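To make the economics concrete, here is a toy model of why virtual copies are cheap: each copy stores only the blocks it changes and shares everything else with the captured source image. This is a conceptual copy-on-write illustration, not how any particular platform actually stores data.

```python
# Toy copy-on-write model: virtual copies share the source snapshot and keep
# only their own deltas. Conceptual illustration only.

class VirtualCopy:
    def __init__(self, base_blocks: dict):
        self._base = base_blocks      # shared, read-only source snapshot
        self._delta = {}              # blocks this copy has modified

    def read(self, block_id: str) -> bytes:
        return self._delta.get(block_id, self._base[block_id])

    def write(self, block_id: str, data: bytes) -> None:
        self._delta[block_id] = data  # copy-on-write: only the change is stored

source = {"b0": b"customers", "b1": b"orders", "b2": b"invoices"}
dev_copy = VirtualCopy(source)
qa_copy = VirtualCopy(source)

dev_copy.write("b1", b"orders-with-test-rows")
print(dev_copy.read("b1"))   # the dev copy's private delta
print(qa_copy.read("b1"))    # still shared with the source, unchanged
print(len(dev_copy._delta), "changed block(s) stored for the dev copy")
```

Ten teams working from the same source therefore cost little more storage than one, which is what makes it feasible to feed analytics, ML and development environments in parallel.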

2. Complexities and costs associated with data security

A DBA knows well how complex it can be to maintain masking scripts for the multiple database technologies in-house. If lucky, a DBA will have specialized in one DB technology (say, Oracle), but nowadays DBAs are expected to be generalists and support multiple database technologies. That makes the task of securing sensitive data harder. Masking scripts can be extremely complex and require a solid understanding of the DB structure itself.

As a result, many organizations will accept the risk of sharing sensitive data because of the costs and complexities of securing it the right way. That stance became inexcusable in Europe with the arrival of GDPR and will continue to have repercussions across all other regions as more regulations follow.

The other aspect of security has to do with what happens after a DBA provisions copies of databases to other functions. Once those copies have been handed off to other IT teams, the DBA has no visibility into who holds them, how many exist and who has access to them, which puts the secure flow of data at risk.

What I recommend is an automated approach to data masking. Your process to keep data secure shouldn’t slow down development. Teams need to be able to seamlessly integrate masking with non-prod data distribution and ensure the security of sensitive data before that data is made available for development and testing, whether it’s on-prem or in the cloud. Once a masking policy is associated with a database source (the production DB), you can rest assured that every single copy provisioned by the DBA or the end-user will be masked.
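As a very rough sketch of the idea, the policy is declared once against the source and applied mechanically whenever a copy is produced, rather than being re-scripted per database. The column names, rule types and hashing scheme below are purely illustrative assumptions; production masking tools also preserve formats and referential integrity across tables, which this toy example does not.

```python
# Minimal sketch: a masking policy tied to the source, applied to every
# provisioned copy. Illustrative only; not a real masking engine.
import hashlib

# Declared once for the source DB; every provisioned copy inherits it.
MASKING_POLICY = {
    "customers.email": "hash",
    "customers.ssn": "redact",
}

def mask_value(rule: str, value: str) -> str:
    if rule == "hash":
        # Deterministic replacement so joins on the column still line up.
        return hashlib.sha256(value.encode()).hexdigest()[:12]
    if rule == "redact":
        return "***-**-****"
    return value  # no rule: pass through unchanged

def mask_row(table: str, row: dict) -> dict:
    return {
        col: mask_value(MASKING_POLICY.get(f"{table}.{col}", "none"), val)
        for col, val in row.items()
    }

row = {"email": "jane@example.com", "ssn": "123-45-6789", "city": "Austin"}
print(mask_row("customers", row))  # email hashed, ssn redacted, city untouched
```

Because the policy lives with the source rather than with each script, the DBA no longer has to remember to mask each copy; the platform does it on the way out.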

3. With new business needs come new applications, and with new applications come modern technology

Modern business requirements demand true digital transformation. That is hard to achieve for large, established organizations whose business-critical applications run on-premises. There is no easy way to work around building a modern application in traditional on-premises data centers, but there is a faster way: start from scratch, without any infrastructure baggage, and leverage the cloud with a SaaS or IaaS model.

Organizations that understand the flexibility and cost benefits of doing software development in the cloud should expect the same benefits when working on applications that live in on-premises data centers. But a lack of data agility can become a barrier to that.

For example, if you move application development to modern infrastructure and adopt Agile and DevOps best practices, only to slow things down because data flows inefficiently to and from the development and testing environments in your on-premises infrastructure, it's a bit like building an F1 car to win the race against your competitors, then realizing you still lose because it's running on Prius wheels and tires (no offense to Prius owners).

Final Thoughts

We have to find a way to make data work for us, not against us. This is not a simple task, and there isn’t a one-size-fits-all approach to solving the problem of data agility.

While there is a lot of buzz about how and who can best solve for digital transformation, I believe there is a foundational change that has to do with leveraging data efficiently and enabling the secure flow of data across the enterprise.

Data is the key to today’s businesses and to modern software development. Leveraging data to drive innovation, identify new business models and deliver value-added services and offerings creates meaningful value for customers. Solving for data agility can dramatically reduce the time and complexity of your digital transformation journey and help unlock success for your enterprise.

Learn how to bring data agility, accelerate innovation and secure data with the Delphix Data Platform.