How to set Airflow variables while creating a dev environment

This article is part of my "100 data engineering tutorials in 100 days" challenge. (65/100)

When we set up a development environment with Airflow, copying the variables and connections from production by hand is quite annoying. Of course, we cannot copy all of them verbatim. We will have to modify some values, but that should not stop us from copying everything else.

This article shows how to use the Airflow command line to export variables from the production environment and import them in the script that builds your development Airflow instance.

Export Airflow variables

First, we have to use the airflow variables export command to get a JSON file with the production variables. We must run this command on the server that runs the production Airflow environment!

airflow variables export prod.json
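For reference, the exported file is a flat JSON object that maps variable names to their values. A hypothetical prod.json (the variable names and values below are made up for illustration) could look like this:

```json
{
  "s3_bucket": "prod-data-bucket",
  "api_endpoint": "https://api.example.com",
  "retry_limit": "3"
}
```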

Prepare dev configuration

After that, we can use jq to modify the values that must be different in the dev environment. For example, the following code replaces the "some_field" property with "the_new_value":

contents="$(jq '.some_field = "the_new_value"' prod.json)" && \
echo "${contents}" > dev.json
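When more than one value differs between environments, the same technique works with a single jq filter that chains several assignments. Here is a minimal sketch; the variable names and values are made up for illustration, and a sample prod.json is created inline so the snippet is self-contained (in practice, the file comes from airflow variables export):

```shell
# Create a sample prod.json so the sketch is self-contained.
cat > prod.json <<'EOF'
{
  "s3_bucket": "prod-data-bucket",
  "api_endpoint": "https://api.example.com",
  "retry_limit": "3"
}
EOF

# Override several variables in one jq pass; "|" chains the assignments.
# Variables we do not touch (retry_limit) are copied over unchanged.
contents="$(jq '.s3_bucket = "dev-data-bucket"
  | .api_endpoint = "https://api-staging.example.com"' prod.json)" \
  && echo "${contents}" > dev.json

cat dev.json
```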


Import dev configuration

When we have the development configuration ready, we can load it using airflow variables import. To do that, we have to start the Airflow instance and run this command on the server that runs the development Airflow environment. Of course, we also have to copy the file between servers, but how you do that depends on your setup.
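As a minimal sketch of that copy step, assuming SSH access to the dev server (the host name "dev-airflow-host" and the /tmp path are placeholders), we could write the commands to a small script and review it before running it for real:

```shell
# Write the transfer-and-import commands to deploy.sh for review.
# "dev-airflow-host" is a made-up placeholder; replace it with your server.
cat > deploy.sh <<'EOF'
scp dev.json dev-airflow-host:/tmp/dev.json
ssh dev-airflow-host 'airflow variables import /tmp/dev.json'
EOF

cat deploy.sh
```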

airflow variables import dev.json


Bartosz Mikulski
Bartosz Mikulski * MLOps Engineer / data engineer * conference speaker * co-founder of Software Craft Poznan & Poznan Scala User Group

Subscribe to the newsletter and get access to my free email course on building trustworthy data pipelines.