Use a custom function in Airflow templates
This article is a part of my "100 data engineering tutorials in 100 days" challenge. (47/100)
Adding a custom function to Airflow is quite simple. First, we have to define a function in Python, for example, this one:
def do_something_with_execution_date(execution_date):
    # Imagine that there is some useful code ;)
    ...
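For illustration, here is one possible body for that function. The logic below is an assumption for the sake of the example, not something required by Airflow: it formats the execution date as a partition-style path segment, which is a common use for such a macro.

```python
from datetime import datetime


def do_something_with_execution_date(execution_date):
    # Hypothetical example logic: turn the execution date into
    # a partition-style string (YYYY/MM/DD), e.g. for building S3 keys.
    return execution_date.strftime("%Y/%m/%d")


print(do_something_with_execution_date(datetime(2021, 3, 1)))  # 2021/03/01
```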
When the function is ready, we use the user_defined_macros parameter of the DAG object to pass a dictionary of custom functions:
dag = DAG(
    ...,
    user_defined_macros={
        'custom_function': do_something_with_execution_date,
    }
)
Now we can use the custom function in any place that supports Airflow templates, but only in the DAGs that received it via user_defined_macros.
{{ custom_function(execution_date) }}
Note that we can pass parameters to the function and expose it under a different name by choosing a different dictionary key.
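Both points follow from the fact that user_defined_macros is an ordinary dictionary mapping template names to Python callables: the template engine looks the function up by its key and calls it with whatever arguments the template passes. A stdlib-only sketch of that lookup (the function name and the extra days parameter here are assumptions made up for the example):

```python
from datetime import datetime, timedelta


def shift_days(execution_date, days):
    # Hypothetical macro that takes an extra parameter.
    return execution_date + timedelta(days=days)


# The dictionary key, not the function name, becomes the name
# available inside the template.
user_defined_macros = {"custom_function": shift_days}

# Roughly what happens when Airflow renders
# {{ custom_function(execution_date, 7) }}:
macro = user_defined_macros["custom_function"]
result = macro(datetime(2021, 3, 1), 7)
print(result)  # 2021-03-08 00:00:00
```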
Bartosz Mikulski