Deploy Airflow DAGs to Cloud Composer using Cloud Build

By WikiExplain on 2023-03-14 · 1 min read
Airflow


[Image: Cloud Build trigger]

Note: Cloud Build allows you to use available images from Docker Hub to run your tasks

Requirements:

gcr.io/cloud-builders/docker

  • Use the args field to specify the commands that you want to run within the image. Example:
# cloudbuild.yml
steps:
- name: 'ubuntu'
  args: ['echo', 'hello world']
  • Test it locally on your machine; this shortens the development cycle and lets you dry-run the build to lint the YAML file. Install the Google Cloud SDK local builder (gcloud components install cloud-build-local) and Docker, and make sure you have Docker credentials: gcloud components install docker-credential-gcr, then gcloud auth configure-docker. Run the test with cloud-build-local --config=path-to-cloudbuild.yaml --dryrun=false . (see the command sketch after the deploy configuration below).
  • Add a step to run your deploy commands from within your container. We can use gcloud:
steps:
# Run the gcloud deploy command inside the custom Airflow image, using the Docker builder
- name: 'gcr.io/cloud-builders/docker'
  entrypoint: 'bash'
  args:
  - '-c'
  - >
    docker run -i
    -v "$(pwd)"/src/DAGS:/DAGS
    gcr.io/$PROJECT_ID/airflow-composer:latest
    gcloud composer environments storage dags import
    --environment YOUR-ENVIRONMENT_NAME
    --location YOUR-REGION-LOCATION
    --source YOUR-DAG-LOCAL_FILE_TO_UPLOAD
substitutions:
  _ENV: local
images:
- 'gcr.io/$PROJECT_ID/airflow-composer'
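
As referenced in the local-testing step above, a minimal command sketch for trying this configuration on your machine, assuming the Cloud SDK and Docker are already installed (the file name cloudbuild.yaml is illustrative):

# Install the local builder and the Docker credential helper
gcloud components install cloud-build-local docker-credential-gcr
gcloud auth configure-docker

# Dry run: validates the config without executing the steps
cloud-build-local --config=cloudbuild.yaml --dryrun=true .

# Execute the build locally
cloud-build-local --config=cloudbuild.yaml --dryrun=false .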

Note: Use substitution variables as needed. In the configuration above, $PROJECT_ID is simply replaced at build time with the ID of your Cloud project.
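
For example, a minimal sketch of submitting this build manually while overriding the _ENV substitution (the value dev is only illustrative); built-in substitutions such as $PROJECT_ID are filled in automatically:

gcloud builds submit --config=cloudbuild.yaml --substitutions=_ENV=dev .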

The -c argument is:

Read commands from the command_string operand instead of from the standard input. Special parameter 0 will be set from the command_name operand and the positional parameters ($1, $2, etc.) set from the remaining argument operands. Example: $ sh -c "echo This is a test string"
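
As a small illustration of how the command_name and the remaining operands populate $0 and the positional parameters (the names and values here are arbitrary):

$ sh -c 'echo "$0 got $1 and $2"' myscript foo bar
myscript got foo and bar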