How to trigger an Airflow DAG from another DAG
I have wondered how to use the TriggerDagRunOperator ever since I learned that it exists. I had a few ideas.
Airflow already has the concept of SubDAGs for extracting a part of a DAG into a separate one, so doing that with the TriggerDagRunOperator does not look like the intended usage.
My next idea was using it to trigger a compensation action in case of a DAG failure. We can use the BranchPythonOperator to define two execution paths: the first one followed during regular operation, the other in case of an error. In the error branch, we can trigger another DAG using the trigger operator. However, that does not make much sense either. I could put all of the compensation tasks in the error branch and not bother with the trigger operator and a separate DAG.
On the other hand, if I had a few DAGs that required the same compensation actions in case of failure, I could extract the common code to a separate DAG and add only the BranchPythonOperator and the TriggerDagRunOperator to every DAG that must fix something after a failure, as sketched below.
The next idea I had was extracting an expensive computation that does not need to run every time into a separate DAG and triggering it only when necessary, for example, when the input data contains certain values.
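For that conditional case, a ShortCircuitOperator could guard the trigger: when its callable returns False, everything downstream is skipped. This is a sketch under the same Airflow 2.x assumption, with a made-up condition and task names:

```python
from airflow.operators.python import ShortCircuitOperator
from airflow.operators.trigger_dagrun import TriggerDagRunOperator


def _input_needs_recomputation(**context):
    # Hypothetical check; when it returns False, downstream tasks are skipped
    rows = context["ti"].xcom_pull(task_ids="load_input") or []
    return any(row.get("needs_recalculation") for row in rows)


check_input = ShortCircuitOperator(
    task_id="check_input",
    python_callable=_input_needs_recomputation,
    dag=dag,
)

trigger_expensive = TriggerDagRunOperator(
    task_id="trigger_expensive_dag",
    trigger_dag_id="expensive_computation_dag",
    dag=dag,
)

check_input >> trigger_expensive
```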
Still, all of those ideas feel a little exaggerated and overstretched. Perhaps, most of the time, the TriggerDagRunOperator is just overkill.
The usage of the TriggerDagRunOperator is quite simple. All we need is this code (the import path below is the Airflow 2.x one; in 1.x the operator lives in airflow.operators.dagrun_operator):

```python
from airflow.operators.trigger_dagrun import TriggerDagRunOperator

trigger = TriggerDagRunOperator(
    task_id="trigger_id",
    trigger_dag_id="the_id_of_another_dag",
    dag=dag,
)
```
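If the triggered DAG needs input from the triggering one, the operator also accepts a conf dictionary (in Airflow 2.x), which the target DAG can read from its run context; the payload below is made up for illustration:

```python
trigger_with_payload = TriggerDagRunOperator(
    task_id="trigger_with_payload",
    trigger_dag_id="the_id_of_another_dag",
    # Hypothetical payload; the triggered DAG reads it via dag_run.conf
    conf={"triggered_by": "parent_dag", "reprocess": True},
    dag=dag,
)
```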