You Have Purchased Delphix: Now What?
You have purchased Delphix and are ready to take advantage of the agility and flexibility the sales team promised. Or perhaps you are evaluating the product for a data migration project that needs efficient DataOps, and you don't quite see where the technology fits into the data flow. I was at that very juncture several years ago, when my company looked to data virtualization to solve lengthy data provisioning processes and decided to purchase the Delphix Dynamic Data Platform (DDP). As the lead system architect, I was tasked with evaluating, planning, implementing, integrating, and operationalizing this new data virtualization software into established DataOps processes, infrastructure technology, and development operations. To succeed, I had to understand not only the Delphix platform and its benefits, but also where in my company's data infrastructure, transaction flow, and architecture the tool should integrate to deliver the most DataOps agility it had to offer.
I suspect many new Delphix clients, channel partners, and global system integrators find themselves in the same position I describe. I have since matured in my understanding of Delphix and its many capabilities across infrastructure provisioning and data integration points. Even so, implementing the technology in a new organization is still a daunting task. I still remember the early days of introducing data virtualization and feeling overwhelmed at bringing my company into the next generation of copy data management. To help newcomers to this technology, and to help system integrators understand their role in virtual data integration plans, I decided to write this blog to highlight Delphix implementations from project planning to transition, best practices, operations, and continued sustainability and growth.
In this initial series I will walk through a Delphix implementation project from start to finish. I will focus on a fictitious organization that has recently purchased Delphix licenses. The organization purchased its initial DDP licenses to support the healthcare IT operations line of business, including a high-availability engine for data protection. After that series, I will jump into a project that adds Delphix to a hybrid cloud solution; further focus areas will come from followers requesting a specific project or industry vertical rollout.
Delphix in a Nutshell
The Delphix Dynamic Data Platform (DDP) virtualizes databases and application file systems to provide complete, fully functional copies that occupy a fraction of the space a physical copy would consume, with improved mobility and agility, simplified manageability, and equal performance. The Delphix software is a self-contained operating environment and application packaged as an OVA to run as a virtual appliance, or as an image in a public cloud instance.
Enterprise-level storage is attached to the Delphix engine. That storage houses the ingested source copy, absorbs updates as the source changes, and serves as the source storage for the downstream non-production copies of that ingested source. Figure 1 provides a bird's-eye view of the Delphix data flow in a nutshell.
The Delphix DDP links to source physical databases or application file systems via standard application programming interfaces (APIs) and asks the source databases to send copies of their entire file and log blocks to it. Delphix uses intelligent filtering and compression to reduce the copy of the source database to as little as 25% of its original size. The copy of the source database and/or application stored in the Delphix engine, along with all incremental updates, is referred to as the dSource in Delphix terminology.
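To make the space savings concrete, here is a small back-of-the-envelope calculation. This is illustrative arithmetic only, not a Delphix tool: the ~25% compression ratio comes from the figure above, and the per-copy overhead is a hypothetical placeholder I chose, since virtual copies share the dSource's blocks and store only their own changes.

```python
# Illustrative arithmetic: physical full-size clones vs. one compressed
# dSource shared by several virtual copies. The compression ratio and the
# per-copy overhead are assumptions for the sake of the example.

def physical_footprint_gb(source_gb: float, num_copies: int) -> float:
    """Each traditional copy is a full-size clone of the source."""
    return source_gb * num_copies

def virtual_footprint_gb(source_gb: float, num_copies: int,
                         compression_ratio: float = 0.25,
                         per_copy_overhead_gb: float = 5.0) -> float:
    """One compressed dSource plus a small changed-block overhead per virtual copy."""
    return source_gb * compression_ratio + per_copy_overhead_gb * num_copies

if __name__ == "__main__":
    src, copies = 1000.0, 5  # a 1 TB source and five non-prod copies
    print(f"Physical copies: {physical_footprint_gb(src, copies):,.0f} GB")
    print(f"Virtual copies:  {virtual_footprint_gb(src, copies):,.0f} GB")
```

With those assumed numbers, five full physical copies would need 5,000 GB, while the virtualized equivalent fits in roughly 275 GB.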
When Delphix is linked to a source, it collects a backup copy of the database or file system, compresses it, and stores it on its own storage. Delphix then keeps a history (called a TimeFlow) of the state of the source system by periodically taking incremental snapshots of the source and optionally collecting its log files. From this history, Delphix can create a virtual copy of the source that is identical to the source at any point in its TimeFlow. Delphix provisions the virtual copy to a target host in as little as 5 to 10 minutes using the NFS or iSCSI protocol. The virtual copy can then be made available to downstream users as Datapods, which give end users self-service functionality to stop, start, rewind, refresh, bookmark, or share their copy without additional support from a database or application administrator. Figure 2 shows the data flow process within the Delphix DDP.
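As a rough sketch of what automating a provision might look like, the helper below assembles a JSON request and posts it to an engine. Be warned: the endpoint path, field names, and object names here are illustrative assumptions of mine, not the documented Delphix API (which also requires an authenticated session); consult the official API reference before scripting against a real engine.

```python
import json
from urllib import request

# HYPOTHETICAL sketch of provisioning a virtual copy over an engine's REST
# interface. Endpoint and payload fields are assumptions for illustration;
# the real Delphix API differs and needs session authentication first.

def build_provision_payload(dsource: str, target_host: str, vdb_name: str,
                            timeflow_point: str = "LATEST") -> dict:
    """Assemble an illustrative provision request for a virtual database."""
    return {
        "source": dsource,              # the linked dSource to clone from
        "targetHost": target_host,      # host that mounts the copy via NFS/iSCSI
        "name": vdb_name,               # name for the new virtual copy
        "timeflowPoint": timeflow_point,  # any point in the dSource's TimeFlow
    }

def provision(engine_url: str, payload: dict) -> None:
    """POST the payload to a (placeholder) provision endpoint."""
    req = request.Request(
        f"{engine_url}/provision",  # placeholder path, not the real API route
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    request.urlopen(req)  # raises on HTTP errors

if __name__ == "__main__":
    payload = build_provision_payload("erp_prod", "dev-host-01", "erp_dev1")
    print(json.dumps(payload, indent=2))
```

The point of the sketch is the shape of the operation: pick a dSource, pick a point in its TimeFlow, name a target host, and the platform does the rest.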
Delphix Operating Considerations
The Delphix technology manipulates ingested sources as single objects at the storage level. Delphix Datapods give data users and developers the ability to roll back, rewind, share, branch, and bookmark at the virtual-object level. However, there are caveats to consider. First and foremost, the self-service features of Datapods apply to objects as a whole, which means internal object components such as schemas, tables, or stored procedures cannot be individually targeted for these operations.
Another consideration is standard resource capacity planning. As with any technology, the Delphix platform relies on adequate server CPU and memory allocation. System engineers must monitor these resources closely to ensure that rolling forward, rolling back, and provisioning additional virtual environments or dSources remain within acceptable control limits and the performance needs of the resources.

With that, you should have enough background to follow this blog series. If you need more information about the Delphix technology first, please visit http://www.delphix.com or simply search for Delphix online. The next post will discuss creating a solution design document to help you architect the deployment and operating plan.