How to use Virtualenv to prepare a separate environment for a Python function running in Airflow

This article is a part of my "100 data engineering tutorials in 100 days" challenge. (56/100)

It is not difficult to turn your Python environment into a mess. Soon, the libraries become incompatible with one another, start producing weird results, or suddenly crash in the middle of a computation.

Fortunately, we can create separate environments using Virtualenv or Conda. Airflow also supports this, but out of the box we have access only to Virtualenv (unless we write a custom operator).

First, we have to define the Python function we want to run. Note that we must define ALL imports inside the function, and the function cannot reference anything defined outside it, not even a global variable. We must pass all such values as arguments to the PythonVirtualenvOperator.

def some_python_function():
    import pandas as pd

    # do something with Pandas

    return "some value"
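To see why the function must be self-contained: Airflow extracts the function's source code and runs it in a fresh interpreter inside the virtualenv, so names defined at module level simply do not exist there. A rough illustration of the idea (this is not Airflow's actual implementation):

```python
# The function's source code, as it would be shipped to the new interpreter:
source = '''
def self_contained():
    import json  # imports must live inside the function
    return json.dumps({"answer": 6 * 7})
'''

GLOBAL_VALUE = 42  # defined here, but absent from the fresh namespace below

fresh_namespace = {}           # like the clean interpreter in the virtualenv
exec(source, fresh_namespace)  # only the function itself gets defined there
result = fresh_namespace["self_contained"]()

print(result)                             # {"answer": 42}
print("GLOBAL_VALUE" in fresh_namespace)  # False - globals don't travel along
```

Anything the function needs, therefore, has to travel either as an import inside its body or as an explicit argument.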

The returned value is available in the Airflow XCom, and we can reference it in the subsequent tasks.

There is one issue concerning returned values (and input parameters). If the Python version used in the Virtualenv environment differs from the Python version used by Airflow, we cannot pass arguments or return values between them. In that case, we can pass data only through the string_args parameter.
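The reason string_args survives a version mismatch is that the values are exchanged as plain text rather than serialized Python objects: every value is written out as a string, one per line, and read back inside the virtualenv. A minimal sketch of that round trip (an illustration of the mechanism, not Airflow's actual code):

```python
import tempfile

def write_string_args(args, path):
    # every value is coerced to str, one per line - this is why only
    # string data survives the round trip
    with open(path, "w") as f:
        f.write("\n".join(str(arg) for arg in args))

def read_string_args(path):
    with open(path) as f:
        return [line.strip() for line in f]

with tempfile.NamedTemporaryFile(mode="w", suffix=".txt", delete=False) as tmp:
    path = tmp.name

write_string_args([1, "two", 3.0], path)
print(read_string_args(path))  # ['1', 'two', '3.0'] - everything is a string now
```

Note that the integer and the float come back as strings, so the function running in the virtualenv must parse them itself if it needs the original types.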


Use PythonVirtualenvOperator

Now, I can configure the Airflow operator. I pass the required libraries in the requirements parameter. It supports the same syntax as a requirements.txt file, so I can also pin a version:

virtualenv_task = PythonVirtualenvOperator(
    task_id="virtualenv_task",
    python_callable=some_python_function,
    requirements=["pandas==1.1.5"],  # requirements.txt syntax, so pins work
    system_site_packages=False,
    dag=dag,
)


Bartosz Mikulski * MLOps Engineer / data engineer * conference speaker * co-founder of Software Craft Poznan & Poznan Scala User Group
