Branching in Airflow

When designing your data pipelines, you may encounter use cases that require more complex task flows than “Task A > Task B > Task C”. For example, you may need to decide between multiple tasks to execute based on the results of an upstream task, or part of your pipeline may need to run only under certain external conditions. Fortunately, Airflow offers multiple options for building conditional logic and branching into your DAGs.

In this guide, you’ll learn how to use @task.branch (BranchPythonOperator) and @task.short_circuit (ShortCircuitOperator) to implement conditional logic in your Airflow DAGs, and which other branching operators are available for more specialized use cases.

Assumed knowledge

To get the most out of this guide, you should have an understanding of:

@task.branch (BranchPythonOperator)

One of the simplest ways to implement branching in Airflow is to use the @task.branch decorator, which is a decorated version of the BranchPythonOperator. @task.branch accepts any Python function, as long as that function returns the ID of a valid Airflow task (or a list of task IDs) that the DAG should run after the function completes.

In the following example, the choose_branch function returns one set of task IDs if result is greater than 0.5 and a different set if it is less than or equal to 0.5:

Taskflow

```python
# from airflow.sdk import task

result = 1

@task.branch
def choose_branch(result):
    if result > 0.5:
        return ['task_a', 'task_b']
    return ['task_c']

choose_branch(result)
```
Traditional

```python
# from airflow.providers.standard.operators.python import BranchPythonOperator

result = 1

def choose_branch(result):
    if result > 0.5:
        return ['task_a', 'task_b']
    return ['task_c']

branching = BranchPythonOperator(
    task_id='branching',
    python_callable=choose_branch,
    op_args=[result],
)
```

In general, the @task.branch decorator is a good choice if your branching logic can be easily implemented in a simple Python function. Whether you want to use the decorated version or the traditional operator is a question of personal preference.

The code below shows a full example of how to use @task.branch in a DAG:

Taskflow

```python
"""Example DAG demonstrating the usage of the `@task.branch`
TaskFlow API decorator."""

from airflow.sdk import dag, Label, task
from airflow.providers.standard.operators.empty import EmptyOperator

import random


@dag
def branch_python_operator_decorator_example():

    run_this_first = EmptyOperator(task_id="run_this_first")

    options = ["branch_a", "branch_b", "branch_c", "branch_d"]

    @task.branch(task_id="branching")
    def random_choice(choices):
        return random.choice(choices)

    random_choice_instance = random_choice(choices=options)

    run_this_first >> random_choice_instance

    join = EmptyOperator(
        task_id="join",
        trigger_rule="none_failed_min_one_success",
    )

    for option in options:

        t = EmptyOperator(task_id=option)

        empty_follow = EmptyOperator(task_id="follow_" + option)

        # Label is optional here, but it can help identify more complex branches
        random_choice_instance >> Label(option) >> t >> empty_follow >> join


branch_python_operator_decorator_example()
```
Traditional

```python
"""Example DAG demonstrating the usage of the BranchPythonOperator."""

from airflow.sdk import DAG, Label
from airflow.providers.standard.operators.empty import EmptyOperator
from airflow.providers.standard.operators.python import BranchPythonOperator
import random

with DAG(
    dag_id='branch_python_operator_example'
) as dag:

    run_this_first = EmptyOperator(
        task_id='run_this_first',
    )

    options = ['branch_a', 'branch_b', 'branch_c', 'branch_d']

    branching = BranchPythonOperator(
        task_id='branching',
        python_callable=lambda: random.choice(options),
    )

    run_this_first >> branching

    join = EmptyOperator(
        task_id='join',
        trigger_rule="none_failed_min_one_success",
    )

    for option in options:

        t = EmptyOperator(
            task_id=option,
        )

        empty_follow = EmptyOperator(
            task_id='follow_' + option,
        )

        # Label is optional here, but it can help identify more complex branches
        branching >> Label(option) >> t >> empty_follow >> join
```

In this DAG, random.choice() returns one random option out of a list of four branches. In the following screenshot, where branch_b was randomly chosen, the two tasks in branch_b were successfully run while the others were skipped.

Branching Graph View

If you have downstream tasks that need to run regardless of which branch is taken, like the join task in the previous example, you need to update the trigger rule. The default trigger rule in Airflow is all_success, which means that if any upstream task is skipped, the downstream task does not run. In the previous example, none_failed_min_one_success is specified to indicate that the task should run as long as at least one upstream task succeeded and no upstream tasks failed.
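The difference between the two trigger rules can be sketched in plain Python. This is an illustrative model of how the rules evaluate upstream task states, not Airflow's actual scheduler code; the helper functions are hypothetical:

```python
# Hypothetical helpers modeling two Airflow trigger rules.

def all_success(upstream_states):
    # Default rule: every upstream task must have succeeded.
    return all(state == "success" for state in upstream_states)

def none_failed_min_one_success(upstream_states):
    # Run as long as no upstream task failed and at least one succeeded.
    no_failures = all(
        state not in ("failed", "upstream_failed") for state in upstream_states
    )
    one_success = any(state == "success" for state in upstream_states)
    return no_failures and one_success

# After branching, the chosen branch succeeds and the other branches are skipped:
states = ["success", "skipped", "skipped", "skipped"]
print(all_success(states))                  # with the default rule, the join would not run
print(none_failed_min_one_success(states))  # with the updated rule, the join runs
```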

You can also set a task group as the direct downstream element of a branching task by returning its task_group_id in your decorated function or python_callable instead of a task_id. All root tasks of the task group run if the branching task returns the task_group_id.

Click to view sample DAG code and a corresponding task graph.

```python
from airflow.decorators import dag, task_group, task
from airflow.models.baseoperator import chain
from pendulum import datetime


@dag(
    dag_display_name="Task Group Branching",
    start_date=datetime(2024, 8, 1),
    schedule=None,
    catchup=False,
    tags=["Branching"],
)
def task_group_branching():

    @task.branch
    def upstream_task():
        return "my_task_group"

    @task_group
    def my_task_group():

        @task
        def t1():
            return "hi"

        t1()

        @task
        def t2():
            return "hi"

        t2()

    @task
    def outside_task():
        return "hi"

    chain(upstream_task(), [my_task_group(), outside_task()])


task_group_branching()
```

Screenshot of graph in UI of DAG using task grouping.

Finally, note that with the @task.branch decorator your Python function must return at least one task ID for whichever branch is chosen (in other words, it can’t return nothing). If one of the paths in your branching should do nothing, you can use an EmptyOperator in that branch.
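For example, a branch callable for a pipeline with an optional cleanup step could route to the ID of an EmptyOperator when there is nothing to do, rather than returning nothing. The task IDs below are illustrative, not part of any example above:

```python
# Illustrative branch callable; "delete_files" and "no_cleanup_needed"
# are hypothetical task IDs. "no_cleanup_needed" would belong to an
# EmptyOperator that serves as a no-op branch.

def choose_cleanup_branch(files_to_delete):
    if files_to_delete:
        return "delete_files"       # branch that does real work
    return "no_cleanup_needed"      # EmptyOperator branch that does nothing

print(choose_cleanup_branch(["tmp/a.csv"]))  # delete_files
print(choose_cleanup_branch([]))             # no_cleanup_needed
```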

@task.short_circuit (ShortCircuitOperator)

Another option for implementing conditional logic in your DAGs is the @task.short_circuit decorator, which is a decorated version of the ShortCircuitOperator. This operator takes a Python function that returns True or False based on logic implemented for your use case. If True is returned, the DAG continues, and if False is returned, all downstream tasks are skipped.

@task.short_circuit is useful when you know that some tasks in your DAG should run only occasionally. For example, maybe your DAG runs daily, but some tasks should only run on Sundays. Or maybe your DAG orchestrates a machine learning model, and tasks that publish the model should only be run if a certain accuracy is reached after training. This type of logic can also be implemented with @task.branch, but that operator requires a task ID to be returned. Using the @task.short_circuit decorator can be cleaner in cases where the conditional logic equates to “run or not” as opposed to “run this or that”.
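The Sunday check described above could be implemented as a plain callable and decorated with @task.short_circuit. This is a minimal sketch; the function name is illustrative:

```python
from datetime import date

def is_sunday(run_date: date) -> bool:
    # Python's weekday() numbers Monday as 0 and Sunday as 6.
    return run_date.weekday() == 6

# Decorated with @task.short_circuit, a False return value would skip
# all downstream tasks for that DAG run.
print(is_sunday(date(2024, 8, 4)))  # a Sunday -> True
print(is_sunday(date(2024, 8, 5)))  # a Monday -> False
```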

The following DAG shows an example of how to implement @task.short_circuit:

Taskflow

```python
"""Example DAG demonstrating the usage of the @task.short_circuit decorator."""

from airflow.sdk import dag, task, chain
from airflow.providers.standard.operators.empty import EmptyOperator

@dag
def short_circuit_operator_decorator_example():

    @task.short_circuit
    def condition_is_True():
        return True

    @task.short_circuit
    def condition_is_False():
        return False

    ds_true = [EmptyOperator(task_id='true_' + str(i)) for i in [1, 2]]
    ds_false = [EmptyOperator(task_id='false_' + str(i)) for i in [1, 2]]

    chain(condition_is_True(), *ds_true)
    chain(condition_is_False(), *ds_false)

short_circuit_operator_decorator_example()
```
Traditional

```python
"""Example DAG demonstrating the usage of the ShortCircuitOperator."""

from airflow.sdk import DAG, chain
from airflow.providers.standard.operators.empty import EmptyOperator
from airflow.providers.standard.operators.python import ShortCircuitOperator

with DAG(
    dag_id='short_circuit_operator_example'
) as dag:

    cond_true = ShortCircuitOperator(
        task_id='condition_is_True',
        python_callable=lambda: True,
    )

    cond_false = ShortCircuitOperator(
        task_id='condition_is_False',
        python_callable=lambda: False,
    )

    ds_true = [EmptyOperator(task_id='true_' + str(i)) for i in [1, 2]]
    ds_false = [EmptyOperator(task_id='false_' + str(i)) for i in [1, 2]]

    chain(cond_true, *ds_true)
    chain(cond_false, *ds_false)
```

In this DAG there are two short circuits, one which always returns True and one which always returns False. When you run the DAG, you can see that tasks downstream of the True condition operator ran, while tasks downstream of the False condition operator were skipped.

Short Circuit

Other branch operators

Airflow offers a few other branching operators that work similarly to the BranchPythonOperator but for more specific contexts, for example:

- BranchDateTimeOperator: branches based on whether the current time falls between two target datetimes.
- BranchDayOfWeekOperator: branches based on whether the current day of the week matches a given day.
- BranchSQLOperator: branches based on the result of a SQL query.

All of these operators take follow_task_ids_if_true and follow_task_ids_if_false parameters which provide the list of task(s) to include in the branch based on the logic returned by the operator.
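The selection behavior these two parameters control can be sketched in plain Python. This is a hypothetical model of the decision step, not the operators' internal implementation:

```python
# Hypothetical model of a branch-by-condition operator's decision step.

def resolve_branch(condition, follow_task_ids_if_true, follow_task_ids_if_false):
    # The operator evaluates its condition (a datetime window, a weekday,
    # a SQL result, ...) and follows the matching list of task IDs.
    return follow_task_ids_if_true if condition else follow_task_ids_if_false

print(resolve_branch(True, ["weekend_task"], ["weekday_task"]))   # ['weekend_task']
print(resolve_branch(False, ["weekend_task"], ["weekday_task"]))  # ['weekday_task']
```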