Richard Seroter, Director of Developer Relations at Google Cloud, wrote up this great blog post and shared it earlier.
In the post, he breaks down four ways to streamline your process by automating BigQuery changes with Liquibase:
- Use the Liquibase CLI locally to add columns to a BigQuery table. This is an easy way to get started (a rough sketch of what the changelog and update command might look like follows this list).
- Use the Liquibase Docker image to add columns to a BigQuery table. See how to deploy changes through a Docker container, which makes later automation easier.
- Use the Liquibase Docker image within Cloud Build to automate deployment of a BigQuery table change. Bring in Google Cloud Build, a continuous integration and general automation service, to invoke the Liquibase container and push BigQuery changes (a sample Cloud Build step is sketched after this list).
- Use Cloud Build and Cloud Deploy to automate the build and deployment of the app to GKE along with a BigQuery table change. This feels like the ideal state, where Cloud Build handles app packaging and then hands off to Cloud Deploy to push BigQuery changes (using the Docker image) and the web app through dev/test/prod.
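To picture the Liquibase side of all of this, here's a rough, hypothetical sketch (not taken from Richard's post) of a YAML changelog that adds one column; the table, column, and changeset names are placeholders:

```yaml
# changelog.yaml — hypothetical example of an "add column" change for a BigQuery table
databaseChangeLog:
  - changeSet:
      id: add-loyalty-status-column
      author: example-author
      changes:
        - addColumn:
            tableName: customers          # placeholder table in your BigQuery dataset
            columns:
              - column:
                  name: loyalty_status    # placeholder new column
                  type: STRING
```

For the first two methods, you'd apply this with something like `liquibase update --changelog-file=changelog.yaml --url=<BigQuery JDBC URL>`, either from a local CLI install or through the `liquibase/liquibase` Docker image with the changelog directory mounted in; the exact JDBC URL and driver setup come from the Liquibase BigQuery extension docs.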
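And for the third method, a minimal Cloud Build config might look roughly like this (again my own assumption, not Richard's exact setup): it assumes a Liquibase image that bundles the BigQuery extension and JDBC driver, and the project, dataset, and auth settings in the URL are placeholders.

```yaml
# cloudbuild.yaml — hypothetical single-step build that pushes the BigQuery change
steps:
  - name: 'liquibase/liquibase'   # assumes an image with the BigQuery extension/driver included
    args:
      - 'update'
      - '--changelog-file=changelog.yaml'
      - '--url=jdbc:bigquery://https://www.googleapis.com/bigquery/v2:443;ProjectId=$PROJECT_ID;DefaultDataset=my_dataset;OAuthType=3;'
```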
Here’s the link to the post:
Continuously deploy your apps AND data? Let’s try to use Liquibase for BigQuery changes.
What do you think about the processes he outlined? What would you do differently?