In Airflow 2.4 the Datasets feature was introduced. This enables data-aware scheduling:
- The DAG author (you) can tell Airflow that a task updates a Dataset: `outlets=[Dataset("s3://my_bucket")]`
- DAGs can be scheduled to run whenever those Datasets are updated: `schedule=[Dataset("s3://my_bucket")]`
You can find all the needed resources in this GitHub repository.