This page applies to Cloud Composer 1 and Cloud Composer 2.
This page describes different ways to trigger DAGs in Cloud Composer environments.
Airflow provides the following ways to trigger a DAG:
Trigger on a schedule. When you create a DAG, you specify a schedule for it. Airflow triggers the DAG automatically based on the specified scheduling parameters.
Trigger manually. You can trigger a DAG manually from the Airflow UI, or by running an Airflow CLI command from `gcloud`.

Trigger in response to events. The standard way to trigger a DAG in response to events is to use a sensor.
Other ways to trigger DAGs:
Trigger programmatically. You can trigger a DAG using the Airflow REST API, for example, from a Python script.
Trigger programmatically in response to events. You can trigger DAGs in response to events by using Cloud Functions and the Airflow REST API.
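For example, a programmatic trigger through the stable Airflow 2 REST API is an HTTP POST to the `dagRuns` endpoint. The following sketch only builds the request; the web server URL and DAG ID are hypothetical placeholders, and authentication (Cloud Composer web servers require Google credentials) is intentionally omitted:

```python
import json
from urllib import request


def build_trigger_request(base_url, dag_id, conf=None):
    # POST /api/v1/dags/{dag_id}/dagRuns creates a new DAG run
    # (Airflow 2 stable REST API).
    url = f"{base_url}/api/v1/dags/{dag_id}/dagRuns"
    body = json.dumps({"conf": conf or {}}).encode()
    return request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# 'https://example-airflow-webserver' stands in for your environment's
# Airflow web server URL.
req = build_trigger_request("https://example-airflow-webserver", "example_dag")
```

Sending the built request with `urllib.request.urlopen(req)`, together with valid credentials, creates a DAG run the same way a manual trigger does.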
Trigger a DAG on a schedule
To trigger a DAG on a schedule:
- Specify the `start_date` and `schedule_interval` parameters in the DAG file, as described later in this section.
- Upload the DAG file to your environment.
Specify scheduling parameters
When you define a DAG, in the `schedule_interval` parameter, you specify how often you want to run the DAG. In the `start_date` parameter, you specify when you want Airflow to start scheduling your DAG. Tasks in your DAG can have individual start dates, or you can specify a single start date for all tasks. Based on the minimum start date for tasks in your DAG and on the schedule interval, Airflow schedules DAG runs.
Scheduling works in the following way. After the `start_date` passes, Airflow waits for the following occurrence of `schedule_interval`. Then it schedules the first DAG run to happen at the end of this schedule interval. For example, if a DAG is scheduled to run every hour (`schedule_interval` is 1 hour) and the start date is at 12:00 today, the first DAG run happens at 13:00 today.
The following example shows a DAG that runs every hour starting from 15:00 onApril 5, 2021. With the parameters used in the example, Airflow schedules thefirst DAG run to happen at 16:00 on April 5, 2021.
```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

with DAG(
    dag_id='example_dag_schedule',
    # At 15:00 on 5 April, 2021
    start_date=datetime(2021, 4, 5, 15, 0),
    # At minute 0 of every hour
    schedule_interval='0 * * * *') as dag:

    # Output the current date and time
    t1 = BashOperator(
        task_id='date',
        bash_command='date',
        dag=dag)

    t1
```
For more information about the scheduling parameters, see DAG Runs in the Airflow documentation.
More scheduling parameter examples
The following scheduling parameter examples illustrate how scheduling works with different combinations of parameters:
- If `start_date` is `datetime(2021, 4, 4, 16, 25)` and `schedule_interval` is `30 16 * * *`, then the first DAG run happens at 16:30 on 5 April, 2021.
- If `start_date` is `datetime(2021, 4, 4, 16, 35)` and `schedule_interval` is `30 16 * * *`, then the first DAG run happens at 16:30 on 6 April, 2021. Because the start date is after the scheduled time on 4 April, 2021, the DAG run does not happen on 5 April, 2021. Instead, the first schedule interval begins at 16:30 on 5 April, 2021, so the DAG run is scheduled for the end of that interval, at 16:30 on the following day.
- If `start_date` is `datetime(2021, 4, 4)` and `schedule_interval` is `@daily`, then the first DAG run is scheduled for 00:00 on 5 April, 2021.
- If `start_date` is `datetime(2021, 4, 4, 16, 30)` and `schedule_interval` is `0 * * * *`, then the first DAG run is scheduled for 18:00 on 4 April, 2021. After the specified date and time passes, Airflow schedules a DAG run to happen at minute 0 of every hour. The nearest point in time when this happens is 17:00. At this time, Airflow schedules a DAG run to happen at the end of the schedule interval, that is, at 18:00.
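The arithmetic in these examples can be sketched for the hourly case (`schedule_interval` of `0 * * * *`). This is a plain standard-library illustration of the rule "the first run happens at the end of the first schedule interval", not an Airflow API; `first_hourly_run` is a hypothetical helper:

```python
from datetime import datetime, timedelta


def first_hourly_run(start_date):
    # First tick of '0 * * * *' at or after start_date: the top of the hour.
    tick = start_date.replace(minute=0, second=0, microsecond=0)
    if tick < start_date:
        tick += timedelta(hours=1)
    # Airflow runs the DAG at the *end* of that first schedule interval.
    return tick + timedelta(hours=1)


# start_date at 16:30 on 4 April: the nearest tick is 17:00,
# so the first run happens at 18:00.
first_hourly_run(datetime(2021, 4, 4, 16, 30))  # datetime(2021, 4, 4, 18, 0)
```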
Trigger a DAG manually
When you trigger a DAG manually, Airflow performs a DAG run. For example, if you have a DAG that already runs on a schedule, and you trigger this DAG manually, then Airflow executes your DAG once, independently of the actual schedule specified for the DAG.
Console
DAG UI is supported in Cloud Composer 1.17.8 and later versions.
To trigger a DAG from the Google Cloud console:
- In the Google Cloud console, go to the Environments page.

  Go to Environments

- Select an environment to view its details.
- On the Environment details page, go to the DAGs tab.
- Click the name of a DAG.
- On the DAG details page, click Trigger DAG. A new DAG run is created.
Airflow UI
To trigger a DAG from the Airflow web interface:
- In the Google Cloud console, go to the Environments page.

  Go to Environments

- In the Airflow webserver column, follow the Airflow link for your environment.
- Log in with the Google account that has the appropriate permissions.
- In the Airflow web interface, on the DAGs page, in the Links column for your DAG, click the Trigger Dag button.
- (Optional) Specify the DAG run configuration.
- Click Trigger.
gcloud
In Airflow 1.10.12 or earlier, run the `trigger_dag` Airflow CLI command:

```
gcloud composer environments run ENVIRONMENT_NAME \
    --location LOCATION \
    trigger_dag -- DAG_ID
```
In Airflow 1.10.14 or later, including Airflow 2, run the `dags trigger` Airflow CLI command:

```
gcloud composer environments run ENVIRONMENT_NAME \
    --location LOCATION \
    dags trigger -- DAG_ID
```
Replace:

- `ENVIRONMENT_NAME` with the name of the environment.
- `LOCATION` with the region where the environment is located.
- `DAG_ID` with the name of the DAG.
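If you want to issue the same CLI call from a script, one option is to build the command for Python's `subprocess` module. A minimal sketch, assuming Airflow 2 (the `dags trigger` form); the environment name, location, and DAG ID passed at the bottom are hypothetical placeholders:

```python
import subprocess


def build_trigger_command(environment_name, location, dag_id):
    # Mirrors: gcloud composer environments run ENVIRONMENT_NAME \
    #   --location LOCATION dags trigger -- DAG_ID
    return [
        "gcloud", "composer", "environments", "run", environment_name,
        "--location", location,
        "dags", "trigger", "--", dag_id,
    ]


cmd = build_trigger_command("example-environment", "us-central1", "example_dag")
# subprocess.run(cmd, check=True) would execute the trigger.
```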
For more information about running Airflow CLI commands in Cloud Composer environments, see Running Airflow CLI commands.
For more information about the available Airflow CLI commands, see the gcloud composer environments run command reference.
What's next
- Managing DAGs
- Writing DAGs
- Access the Airflow UI