How to set Airflow variables while creating a dev environment
When we set up a development environment with Airflow, it is quite annoying to copy-paste the variables and connections from production by hand. Of course, we cannot copy all of them verbatim; we will have to modify some values, but that should not stop us from copying everything else.
This article shows how to use the Airflow command-line interface to export variables from the production environment and import them in the script that builds your development Airflow instance.
Export Airflow variables
First, we have to use the airflow variables export command to get a JSON file with the production parameters. We must run this command on the server that runs the production Airflow environment!

airflow variables export prod.json
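As a quick sanity check after the export, we can count how many variables landed in the file; jq exits with an error if the file is not valid JSON. A minimal sketch, assuming jq is installed and using a toy prod.json with hypothetical variable names for illustration:

```shell
# Toy export file with hypothetical variable names; in practice,
# prod.json comes from the airflow variables export command above.
cat > prod.json <<'EOF'
{"var_a": "1", "var_b": "2"}
EOF

# Count the top-level keys; fails loudly if the file is not valid JSON.
jq 'keys | length' prod.json
```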
Prepare dev configuration
After that, we can use jq to modify the values that must be different in the dev environment. For example, the following code replaces the “some_field” property with “the_new_value”:

contents="$(jq '.some_field = "the_new_value"' prod.json)" && \
echo "${contents}" > dev.json
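The same approach scales to several changes at once: jq filters can be chained with a pipe to overwrite multiple values and to delete variables that should never reach the dev environment. A sketch, assuming jq is installed; the db_host and secret_token keys are hypothetical examples, not something the export is guaranteed to contain:

```shell
# Toy production export with hypothetical keys; in practice, prod.json
# comes from the airflow variables export command.
cat > prod.json <<'EOF'
{"some_field": "old", "db_host": "prod-db.internal", "secret_token": "abc"}
EOF

# Overwrite two values and delete a production-only secret in one pass.
contents="$(jq '.some_field = "the_new_value" | .db_host = "localhost" | del(.secret_token)' prod.json)" && \
echo "${contents}" > dev.json
```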
Import dev configuration
When we have the development configuration ready, we can load it using airflow variables import. To do that, we have to start the Airflow instance and run this command on the server that runs the development Airflow environment. Of course, we also have to copy the files between servers, but that depends on your setup.

airflow variables import dev.json
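Before running the import, it can help to double-check which variables actually differ between the two files. A small sketch, assuming jq is installed; the toy files below stand in for the real prod.json and dev.json produced by the earlier steps:

```shell
# Toy files for illustration; in practice these come from the export step.
cat > prod.json <<'EOF'
{"some_field": "old", "shared_var": "same"}
EOF
cat > dev.json <<'EOF'
{"some_field": "the_new_value", "shared_var": "same"}
EOF

# Print the names of variables whose values differ between prod and dev.
# prints "some_field"
jq -s '.[0] as $prod | .[1] as $dev | $prod | keys[] | select($prod[.] != $dev[.])' prod.json dev.json
```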
Bartosz Mikulski
- MLOps engineer by day
- AI and data engineering consultant by night
- Python and data engineering trainer
- Conference speaker
- Contributed a chapter to the book "97 Things Every Data Engineer Should Know"
- Twitter: @mikulskibartosz
- Mastodon: @mikulskibartosz@mathstodon.xyz