The Missing Piece of CI/CD: Automating Data Flow
The advent of cloud technology has had a profound impact on the modern software development life cycle (SDLC). Developers have moved away from large, monolithic applications running on a single server and toward delivering Software as a Service (SaaS) applications, taking advantage of cheaper, ubiquitous infrastructure that scales horizontally with demand.
But with these major advancements come challenges. Cloud-based workflows require every data environment to run on the appropriate infrastructure version and to have the proper schema and data state to operate seamlessly throughout development.
For this reason, automation tools have become critical in enabling CI/CD for different application states as well as managing different parts of the application stack, using concepts such as infrastructure as code and schema as code. But while the tools for managing code versions across environments have matured over the years, there has been no equivalent solution for managing data states across environments.
Enter the Delphix Automation Framework (DAF). DAF automates calls to the Delphix platform's APIs, customized for each stage of CI/CD, to manage the data states your application development requires. Simply put, DAF provides elasticity, self-service, and automation for the data side of SDLC workflows.
Here are three steps to get started:
- Connect your production database
- Build a self-service template to represent your project
- Create individual data pods for dev/test environments
By adding a configuration file (delphix.yaml) to your application's source code repository, DAF ensures that each of your non-production environments is in the proper data state during your deployment automation.
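As a rough sketch of what such a configuration might contain, a delphix.yaml could map each non-production environment to the data state it needs. The keys below are illustrative assumptions for this example, not DAF's actual schema:

```yaml
# delphix.yaml -- hypothetical sketch; keys are illustrative, not DAF's real schema.
source:
  database: patients-prod        # production Postgres RDS instance
  masking: enabled               # sync masked data only into non-production
template:
  name: patients-app             # self-service template representing the project
environments:
  develop:
    pod: patients-develop        # data pod backing the Develop environment
    refresh: on-deploy           # refresh from masked production on each build
  test:
    pod: patients-test
    refresh: manual
```

The point of keeping this file in the application repository is that data-state configuration is versioned and reviewed alongside the code it supports.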
We created a demo application to illustrate some of the main features in AWS, a cloud-native environment. The application has the following:
- Java Spring Boot API backend with a Postgres RDS database
- Single page application frontend with Angular 6
- Code repository hosted with GitHub
- Webhooks set up to push events to Jenkins for CI/CD automation
- Infrastructure as code managed with Terraform and Packer
- Schema as code using Datical
- Different data states managed with the Delphix Automation Framework
In the sample workflow, we add a new field called “Notes” to the patient records, then update the API and UI so that the application exposes the new feature.
For the deployment workflow, when we merge the new feature pull-request into the “Develop” branch in GitHub, a webhook is sent to Jenkins to trigger the develop build. It applies any infrastructure changes with Terraform.
To make sure that our schema change won’t have any issues when it reaches production, the “Develop” data pod is updated with the latest masked sync from the production database, and the schema changes are applied with Datical.
We want to catch data issues earlier in the life cycle, so it’s important to maintain the relationship between the “Develop” data pod and the production database with the Delphix platform. That way we can rely on Delphix to keep our “Develop” data pod up-to-date with production-like data.
After the schema has successfully migrated, Ansible will build the new application stack and deploy it to the “Develop” web server.
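The stages above could be wired together in a declarative Jenkinsfile along these lines. The job structure and CLI invocations are placeholders for illustration, not the demo's actual pipeline:

```groovy
// Hypothetical Jenkinsfile sketch for the Develop branch build.
// All shell commands are illustrative placeholders.
pipeline {
    agent any
    stages {
        stage('Infrastructure') {
            steps {
                // Apply any infrastructure changes with Terraform
                sh 'terraform init && terraform apply -auto-approve'
            }
        }
        stage('Refresh data pod') {
            steps {
                // Update the Develop data pod with the latest masked
                // sync from production (placeholder DAF invocation)
                sh 'daf refresh --config delphix.yaml --env develop'
            }
        }
        stage('Schema migration') {
            steps {
                // Apply schema-as-code changes with Datical
                sh 'datical deploy develop'
            }
        }
        stage('Deploy') {
            steps {
                // Build the application stack and deploy it with Ansible
                sh 'ansible-playbook deploy.yml -l develop'
            }
        }
    }
}
```

Ordering the data-pod refresh before the schema migration is what lets the pipeline catch schema/data conflicts against production-like data rather than in production itself.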
The Delphix Automation Framework gives development teams the ability to manage their non-production data sets with configuration code that follows a Given->When->Then convention. Developers specify the state a data set should be in at a particular point in the workflow by defining events and chaining actions to those events.
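As a hedged illustration of the Given->When->Then convention, an event might be declared like this. The keys and action names are invented for this sketch, not DAF's documented syntax:

```yaml
# Hypothetical event definition -- illustrative keys only.
events:
  - given: develop-pod            # the data pod this event applies to
    when: pull-request-merged     # trigger: merge into the Develop branch
    then:                         # actions chained for this event
      - refresh-from-masked-production
      - apply-schema-changes
      - run-integration-tests
```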
Watch the full demo video above for more information or download the latest version of the Delphix Automation Framework and demo source code to get started today.