Deploy Airflow DAGs to Google Cloud Composer using Cloud Build
Note: Cloud Build allows you to use available images from Docker Hub to run your tasks
Requirements:
- Make sure you have built a Container Registry image (see the build-and-push example below)
- Install gcloud inside your container. Check the simple sample Dockerfile (a minimal sketch follows below) or a full repo
- Configure gcloud inside your Docker container
- Configure your Google Cloud Composer environment
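For reference, here is a minimal sketch of a Dockerfile that installs gcloud. The base image, the install directory, and the use of the official install script are assumptions for illustration, not a prescribed setup:

# Dockerfile (minimal sketch; base image and install path are assumptions)
FROM python:3.8-slim

# Install curl, then the Google Cloud SDK via the official install script
RUN apt-get update && apt-get install -y curl && \
    curl -sSL https://sdk.cloud.google.com > /tmp/install_gcloud.sh && \
    bash /tmp/install_gcloud.sh --disable-prompts --install-dir=/usr/local && \
    rm /tmp/install_gcloud.sh && \
    rm -rf /var/lib/apt/lists/*

# Make gcloud available on the PATH
ENV PATH="/usr/local/google-cloud-sdk/bin:${PATH}"

You can then build the image and push it to Container Registry, using the same image name the deploy step below expects:

docker build -t gcr.io/YOUR_PROJECT_ID/airflow-composer:latest .
docker push gcr.io/YOUR_PROJECT_ID/airflow-composer:latest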
- Create a YAML file, typically in the same path as the folder you plan to run your tasks against
- Specify the image URL in the name field of your YAML file. Example:
gcr.io/cloud-builders/docker
- Use the args field to specify the commands you want to run within the image. Example:
# cloudbuild.yml
steps:
- name: 'ubuntu'
  args: ['echo', 'hello world']
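To try this out end to end, you could submit the configuration to Cloud Build; the file name cloudbuild.yml and the current directory as the build context are assumptions here:

gcloud builds submit --config=cloudbuild.yml .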
- Test it locally on your machine; this way you shorten the development cycle, and a dry run lets you lint the YAML file. Install the Google Cloud SDK, the local builder component (gcloud components install cloud-build-local), and Docker.
Make sure you have Docker credentials:
gcloud components install docker-credential-gcr
gcloud auth configure-docker
Run the test:
cloud-build-local --config=path-to-cloudbuild.yaml --dryrun=false .
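Before running the full build above, you can lint the configuration first with a dry run, which only validates the YAML and executes no steps:

# Validate the config only; no build steps are executed
cloud-build-local --config=path-to-cloudbuild.yaml --dryrun=true .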
- Add a step to run your deploy commands from within your container. We can use gcloud
steps:
- name: 'gcr.io/cloud-builders/docker'
  entrypoint: 'bash'
  args:
  - '-c'
  - >
    docker run -i
    -v "$(pwd)"/src/DAGS:/DAGS
    gcr.io/$PROJECT_ID/airflow-composer:latest
    gcloud composer environments storage dags import
    --environment YOUR-ENVIRONMENT_NAME
    --location YOUR-REGION-LOCATION
    --source YOUR-DAG-LOCAL_FILE_TO_UPLOAD
substitutions:
  _ENV: local
images:
- 'gcr.io/$PROJECT_ID/airflow-composer'
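For reference, the step above ultimately runs a command of the following shape inside your image; the environment name, region, and DAG path are placeholder values for illustration only:

# Placeholder values; /DAGS is the volume mounted in the build step above
gcloud composer environments storage dags import \
  --environment my-composer-environment \
  --location us-central1 \
  --source /DAGS/my_dag.py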
Note: Use substitution variables as needed. $PROJECT_ID here will simply be replaced at build time with the ID of your Cloud project.
The -c argument tells bash to read and execute the commands from the string that follows it, which in this case is the folded docker run command.
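As a minimal illustration (the echoed string is arbitrary), bash -c executes whatever command string follows it, just like the folded docker run string in the step above:

bash -c 'echo hello world'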