Use LatestOnlyOperator to skip some tasks while running a backfill in Airflow
Occasionally, in Airflow, we have a DAG that always works on the most recent snapshot of data, even when we run a backfill for days in the past. This happens when at least one of the tasks downloads the current state from an external API, uploads data to another service (via an API, an FTP upload, events sent to a message queue, or anything else), or pushes information to users.
The last case is the worst. Certainly, we don’t want to send old versions of newsletters to all of the subscribers just because we had to backfill some values in a report.
How do we make sure that such time-sensitive tasks are executed only in the most recent DAG run and don't start when we run a backfill? We can do that by using the LatestOnlyOperator in Airflow!
Using it is simple. All we need to do is import the operator, create a new instance, and add it to the DAG:
from airflow.operators.latest_only_operator import LatestOnlyOperator
# Note: in Airflow 2+, the import path is airflow.operators.latest_only

is_this_latest_dag_run = LatestOnlyOperator(task_id="dont_send_newsletters_during_backfills", dag=dag)

upstream_task >> is_this_latest_dag_run >> send_newsletters >> other_downstream_task
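For context, here is a minimal end-to-end sketch, assuming the Airflow 1.10-style imports used above; the DAG and task names are made up for illustration:

from datetime import datetime

from airflow import DAG
from airflow.operators.dummy_operator import DummyOperator
from airflow.operators.latest_only_operator import LatestOnlyOperator

# Hypothetical daily DAG: the report is rebuilt during backfills,
# but newsletters go out only in the most recent DAG run.
dag = DAG(
    dag_id="newsletter_pipeline",
    start_date=datetime(2021, 1, 1),
    schedule_interval="@daily",
)

build_report = DummyOperator(task_id="build_report", dag=dag)
is_this_latest_dag_run = LatestOnlyOperator(task_id="latest_only", dag=dag)
send_newsletters = DummyOperator(task_id="send_newsletters", dag=dag)

# During a backfill, build_report still runs; send_newsletters is skipped.
build_report >> is_this_latest_dag_run >> send_newsletters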
When the operator runs, it checks whether the current time falls between the execution time of the current DAG run and the next scheduled execution time. If it does, this is the latest run, and the downstream tasks are allowed to proceed. If it does not, we are running a backfill, and the downstream tasks get skipped. (Externally triggered DAG runs are never skipped.)
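Conceptually, the decision boils down to a window comparison. Here is a simplified sketch, not the actual Airflow source; window_start and window_end delimit the period in which this run is the most recent scheduled one:

from datetime import datetime

# Simplified sketch of the LatestOnlyOperator decision, not the real
# Airflow implementation: let downstream tasks run only if "now" falls
# inside the window in which this DAG run is the latest scheduled one.
def is_latest_run(window_start: datetime, window_end: datetime) -> bool:
    now = datetime.utcnow()
    return window_start < now <= window_end

During a backfill, the current time is far past window_end, so the comparison fails and the downstream tasks are marked as skipped.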