How to add a manual step to an Airflow DAG using the JiraOperator

How can we add a human action in the middle of an Airflow DAG? This is not an everyday use case, but it is not far-fetched either. Occasionally, we may need human confirmation before executing code that could destroy data or our reputation.

For example, we may have a DAG that prepares a newsletter. The DAG’s last task sends it to the subscribers, but we want to wait until a manager approves the content before we send anything.

A manual step is also useful when we implement personal data deletion. Our DAG may gather all of the data to be removed, make a list of affected datasets, and send it to a person for final approval before everything gets deleted.

In all of those situations, we can use the JiraOperator to create a Jira ticket and the JiraSensor to wait until the ticket’s status changes to whatever value we use as confirmation.

Creating a new issue

First, we have to create a new ticket. For this, we import the JiraOperator, which lets us call any method of the Jira Python SDK client.

# Airflow 1.10 import path; in Airflow 2+ the operator moved to the Jira provider package
from airflow.contrib.operators.jira_operator import JiraOperator

issue_dict = {
    'project': {'id': 123},
    'summary': 'Confirmation required',
    'description': 'Some description',
    'issuetype': {'name': 'Request'},
}

# assuming that your version of the API returns an Issue object, not a dictionary;
# the Issue's `key` attribute holds the human-readable identifier (e.g. PROJ-123)
extract_issue_key = lambda issue: issue.key

create_jira_issue = JiraOperator(
    task_id='get_human_approval',
    jira_conn_id='connection_id',
    jira_method='create_issue',
    jira_method_args=issue_dict,
    result_processor=extract_issue_key
)

The issue key extracted by our result_processor function ends up in this operator's XCom, under the default return_value key.
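Any downstream task can pull that value from XCom. Below is a minimal sketch of a callable that builds a notification message from the pulled key; the function name and message are illustrative, while 'get_human_approval' is the task_id of the operator above:

```python
def build_approval_message(task_instance):
    # pull the value pushed by the JiraOperator's result_processor;
    # it is stored under the default 'return_value' key
    issue_key = task_instance.xcom_pull(
        task_ids='get_human_approval', key='return_value')
    return f'Waiting for manual approval of Jira ticket {issue_key}'
```

Inside a PythonOperator callable, the task_instance object is available through the template context.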

Waiting for the status

In the next step, we wait until the issue reaches the desired status. The contrib package provides a JiraTicketSensor for this:

from airflow.contrib.sensors.jira_sensor import JiraTicketSensor

sensor = JiraTicketSensor(
    task_id='check_if_approved',
    jira_conn_id='connection_id',
    ticket_id="{{ task_instance.xcom_pull('get_human_approval', key='return_value') }}",
    field='status',
    expected_value='Done'
)
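The sensor keeps poking Jira and compares the ticket's field to expected_value. If a single value is not enough — say a manager may resolve the ticket as either "Done" or "Approved" — the contrib sensor also accepts a field_checker_func. Its exact callback signature depends on the Airflow version, but the core of such a check operates on the JSON that the Jira REST API returns; a sketch (the accepted statuses are an assumption for this example):

```python
ACCEPTED_STATUSES = {'Done', 'Approved'}

def is_approved(issue):
    # in a Jira REST API issue response, the status name is nested
    # under fields -> status -> name
    status = issue.get('fields', {}).get('status', {}).get('name')
    return status in ACCEPTED_STATUSES
```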