Cross-deployment dependencies

Cross-Dag dependencies serve a common use case: configuring a Dag to run when a separate Dag or a task in another Dag completes or updates an asset. But what about situations in which an asset or Dag you monitor exists in a separate deployment? For example, you might want to make a task dependent on an asset update in a Dag that is owned by a different team and located in a separate deployment. Astro also supports the orchestration of tasks using this kind of relationship, which is referred to as a cross-deployment dependency.

This guide uses the following terms to describe cross-deployment dependencies:

  • Upstream deployment: A deployment where a Dag must reach a specified state before a Dag in another deployment can run.
  • Downstream deployment: A deployment in which a Dag cannot run until a Dag in an upstream deployment reaches a specified state.

Cross-deployment dependencies require special implementation because some methods, like the TriggerDagRunOperator, ExternalTaskSensor, and direct Airflow Asset dependencies, are only designed for Dags in the same deployment. On Astro, there are two recommended methods available for implementing cross-deployment dependencies: Astro Alerts and triggering updates to Airflow Assets using the Airflow REST API.

Feature overview

In this guide, you’ll learn when to use the following Astro and Airflow features to create dependencies across Astro deployments. Astro supports cross-deployment dependencies in any Workspace or cluster.

  • Astro Alerts. Recommended for most Astro use cases as no code modification is required.
  • Airflow Assets. Can be used to trigger a Dag after a task in another Dag updates an asset.

Best practice guidance

Astro Alerts and Airflow Assets are the best methods for implementing cross-deployment dependencies. The dagRuns endpoint of the Airflow REST API can also be used for this purpose and might be appropriate when you want tasks that don’t update assets to trigger Dags. That method isn’t covered here, but you can implement it by following the guidance in Make requests to the Airflow REST API.
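As a rough sketch of that REST API approach, an upstream task could POST to the dagRuns endpoint of the downstream Deployment. The Deployment URL, Dag ID, and token below are hypothetical placeholders, and the request shape assumes the Airflow REST API v2, where a null logical_date requests a manually triggered run:

```python
import requests


def build_dagrun_request(deployment_url: str, dag_id: str) -> tuple[str, dict]:
    """Build the URL and body for POST /api/v2/dags/{dag_id}/dagRuns.

    A null logical_date requests a manually triggered run.
    """
    url = f"https://{deployment_url}/api/v2/dags/{dag_id}/dagRuns"
    return url, {"logical_date": None}


def trigger_downstream_dag(deployment_url: str, dag_id: str, token: str) -> dict:
    # deployment_url, dag_id, and token are placeholders for your own values.
    url, payload = build_dagrun_request(deployment_url, dag_id)
    response = requests.post(
        url,
        headers={"Authorization": f"Bearer {token}"},
        json=payload,
    )
    response.raise_for_status()
    return response.json()
```

Calling trigger_downstream_dag from a task in the upstream Dag creates a run of the named Dag in the downstream Deployment, with no asset involved.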

To determine whether an Astro Alert or the Assets feature is the right solution for your use case, consider the following guidance.

Astro Alerts:

You can use Astro Alerts with the Dag Trigger communication channel to implement cross-deployment Dag dependencies. Alerts are simple to implement and are the preferred method in the following situations:

  • If you need to implement a dependency trigger based on any Dag state other than success, such as a Dag failure, a task taking longer than expected, or a Dag not completing by a certain time.
  • If you need to implement a simple one-to-one cross-deployment dependency (one upstream Dag triggers one downstream Dag) and do not want to update your Dag code.
  • When your Dags don’t already use the Assets feature and it is easy to identify the relevant dependent Dags, which isn’t always the case in larger organizations.

Airflow Assets:

Assets represent a significant evolution in the way Airflow can be used to define dependencies and, for some, offer a more natural way of expressing pipelines than traditional Dags. Assets, which offer more flexibility for cross-deployment dependencies than Astro Alerts, are the preferred method in the following scenarios:

  • You need to implement dependencies in a many-to-one pattern, so you can make a Dag dependent on the completion of multiple other Dags or tasks. This is not possible using Astro Alerts.
  • You need to implement dependencies in a one-to-many pattern, so one Dag or task triggers multiple Dags or tasks. While this is also possible using Astro Alerts, it requires a separate alert for each dependency.

In Airflow 3, Datasets were renamed to Assets. You can update an asset with a POST request to the assets endpoint of the Airflow REST API, which supports implementation across deployments. This guide uses the Airflow REST API v2 (/api/v2/assets/events) available in Airflow 3.

Astro Alerts example

Assumed knowledge

To use Astro Alerts to create cross-deployment dependencies, you should have an understanding of:

Implementation

This example shows you how to create dependencies between Dags in different Astro deployments using Astro Alerts.

Prerequisites

Process

Create a dependency between Dags in separate deployments with an alert trigger on Astro.

  1. Create a Deployment API token for the downstream Deployment. The alert uses this token to trigger the Dag in that Deployment.
  2. Click Alerts in the Workspace menu, and create a new alert.
  3. Enter an Alert name that you’ll remember, select the Alert type (like Dag Success), and then select Dag Trigger as the Communication Channel.
  4. In the Deployment menu for the Dag Trigger communication channel, select the downstream deployment from the list.
  5. In the Dag Name menu, select the Dag you want to trigger.
  6. Paste your API token in the Deployment API Token field.
  7. Run the upstream Dag, verify that the alert triggers, and confirm the downstream Dag runs as expected.

Assets example

Assumed knowledge

To use Airflow Assets to create cross-deployment dependencies, you should have an understanding of:

Implementation

This section explains how to use the Airflow REST API v2 create_asset_event endpoint to trigger the downstream Dag in another deployment when an asset is updated. Typical asset implementation only works for Dags in the same Airflow deployment, but by using the Airflow REST API, you can implement this pattern across deployments.

The create_asset_event endpoint identifies the target asset by its integer asset_id, which Airflow assigns automatically when the asset is first referenced in the downstream deployment. This example looks up the asset_id at runtime by querying the assets endpoint with the asset’s URI, so you don’t need to hard-code the identifier.
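The lookup step can be isolated into a small helper. This is a sketch that assumes the v2 list response shape, {"assets": [...], "total_entries": n}, and filters for an exact URI match, since uri_pattern can match more than one asset:

```python
def extract_asset_id(api_response: dict, uri: str) -> int:
    """Return the id of the asset whose URI exactly matches `uri`.

    `api_response` is the parsed JSON body from GET /api/v2/assets.
    """
    matches = [
        asset for asset in api_response.get("assets", [])
        if asset.get("uri") == uri
    ]
    if not matches:
        raise ValueError(f"No asset with URI {uri} in the response.")
    return matches[0]["id"]


# Example response with hypothetical asset ids: uri_pattern matched two assets,
# so the exact-match filter is what picks the right one.
sample = {
    "assets": [
        {"id": 7, "uri": "file://include/bears"},
        {"id": 9, "uri": "file://include/bears_archive"},
    ],
    "total_entries": 2,
}
print(extract_asset_id(sample, "file://include/bears"))  # prints 7
```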

Prerequisites

Process

  1. In your upstream Deployment (the Deployment for which you did not create an API token), open the Environment Variables tab in the Deployment’s Environment settings and create an environment variable that stores your API token, using API_TOKEN for the key.
  2. For your downstream Deployment, follow the guidance in Make requests to the Airflow REST API - Step 2 to obtain the Deployment URL for your downstream Deployment. The Deployment URL should be in the format of clq52ag32000108i8e3v3acml.astronomer.run/dz3uu847.
  3. In your upstream Deployment, use Variables in the Astro UI to create an environment variable where you can store your downstream Deployment URL, using DEPLOYMENT_URL for the key.
  4. Add the following Dag to the Astro project running in your upstream Deployment. The get_bear task declares MY_ASSET as an outlet, which produces an asset event for MY_ASSET in the same Airflow deployment when the task completes successfully. The dependent update_asset_via_api task then produces an asset event for the same-named asset in a different Airflow deployment by calling the Airflow REST API v2 create_asset_event endpoint.
```python
from airflow.sdk import Asset, dag, task
from pendulum import datetime
import os
import requests

URI = "file://include/bears"
MY_ASSET = Asset(URI)
TOKEN = os.environ.get("API_TOKEN")
DEPLOYMENT_URL = os.environ.get("DEPLOYMENT_URL")


@dag(
    start_date=datetime(2023, 12, 1),
    schedule="0 0 * * 0",
    catchup=False,
    doc_md=__doc__,
)
def producer_dag():
    @task(outlets=[MY_ASSET])
    def get_bear():
        print("Update the bears asset")

    @task
    def update_asset_via_api():
        headers = {
            "Authorization": f"Bearer {TOKEN}",
            "Content-Type": "application/json",
            "Accept": "application/json",
        }

        # Look up the asset_id in the downstream deployment by URI.
        # Airflow assigns the integer asset_id when the asset is first
        # referenced in that deployment.
        lookup = requests.get(
            url=f"https://{DEPLOYMENT_URL}/api/v2/assets",
            headers=headers,
            params={"uri_pattern": URI},
        )
        lookup.raise_for_status()
        assets = lookup.json().get("assets", [])
        if not assets:
            raise RuntimeError(
                f"No asset with URI {URI} found in the downstream deployment."
            )
        asset_id = assets[0]["id"]

        # Create an asset event in the downstream deployment.
        payload = {
            "asset_id": asset_id,
            "extra": {},
        }
        response = requests.post(
            url=f"https://{DEPLOYMENT_URL}/api/v2/assets/events",
            headers=headers,
            json=payload,
        )
        response.raise_for_status()
        print(response.json())

    get_bear() >> update_asset_via_api()


producer_dag()
```
  5. Deploy the project to Astro.
  6. Deploy a Dag of any kind to your downstream Deployment. In this Dag, use a dag_id of consumer_dag and schedule it on the same asset as the producer_dag. Deploying this Dag registers the asset in the downstream deployment and assigns it an asset_id that the upstream producer can look up at runtime.

For example:

```python
from airflow.sdk import Asset, dag, task
from datetime import datetime

URI = "file://include/bears"


@dag(
    dag_id="consumer_dag",
    start_date=datetime(2023, 12, 1),
    schedule=[Asset(URI)],
    catchup=False,
    doc_md=__doc__,
)
def consumer_dag():
    @task
    def wait_for_bears():
        print("The bears are here!")

    wait_for_bears()


consumer_dag()
```

After deploying both projects to their respective Deployments on Astro, runs of your producer_dag trigger your consumer_dag automatically. If the downstream Dag isn’t triggered, verify that the upstream Deployment’s environment variables (API_TOKEN and DEPLOYMENT_URL) match the downstream Deployment’s API token and Deployment URL. A successful request to the Airflow REST API returns a payload containing the newly created asset event. Check the task logs for output similar to the following:

[2025-06-02, 11:21:06 UTC] {logging_mixin.py:188} INFO - {'id': 4, 'asset_id': 1, 'uri': 'file://include/bears', 'extra': {'from_rest_api': True}, 'source_task_id': None, 'source_dag_id': None, 'source_run_id': None, 'source_map_index': -1, 'created_dagruns': [], 'timestamp': '2025-06-02T11:21:06.252976+00:00'}
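If you want the upstream task to fail loudly when the event references the wrong asset, you can validate the response body, whose shape matches the log line above. The event dict below is abbreviated from that sample output:

```python
def check_event(event: dict, expected_uri: str) -> None:
    """Raise if the created asset event does not reference the expected URI."""
    if event.get("uri") != expected_uri:
        raise RuntimeError(
            f"Asset event targeted {event.get('uri')!r}, expected {expected_uri!r}."
        )


# Abbreviated from the sample REST API response logged above.
event = {
    "id": 4,
    "asset_id": 1,
    "uri": "file://include/bears",
    "extra": {"from_rest_api": True},
}
check_event(event, "file://include/bears")  # passes silently
```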

See also