Write and run DAGs on Astro
Astro includes several features that enhance the Apache Airflow development experience, from DAG writing to testing. To use these features, you might need to modify how you write your DAGs and manage the rest of your code.
Use this documentation to learn about the key differences between managing DAGs on Astro versus on other platforms.
Project structure
To develop and run DAGs on Astronomer products, your DAGs must belong to an Astro project. An Astro project contains all files required to run your DAGs both locally and on Astronomer. In addition to DAG files, an Astro project includes Python and OS-level packages, your Astro Runtime version, and any other files your workers need to access when you run tasks.
See Create an Astro project to learn more about how to create and run Astro projects.
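If you scaffold a project with `astro dev init`, the layout looks roughly like the following (exact files vary by Astro CLI version):

```text
.
├── .astro/                 # Project settings used by the Astro CLI
├── dags/                   # Your DAG files
├── include/                # Additional files your tasks need
├── plugins/                # Custom or community Airflow plugins
├── tests/                  # Unit tests run by `astro dev pytest`
├── .env                    # Local environment variables
├── airflow_settings.yaml   # Local-only connections, variables, and pools
├── Dockerfile              # Pins your Astro Runtime version
├── packages.txt            # OS-level packages
└── requirements.txt        # Python packages
```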
Airflow and Astro Runtime versioning
When you migrate to Astro from another Apache Airflow service, there are a few differences to note in how Astro handles versioning, upgrades, and runtime builds.
- Astro Runtime versioning scheme: Each Astro project uses a specific version of Astro Runtime, which is Astronomer's version of Apache Airflow that includes additional observability and performance features. Each Astro Runtime version corresponds to one Apache Airflow version, but the versioning scheme is different. For example, Astro Runtime 11.3.0 corresponds to Airflow 2.9.1. See Astro Runtime maintenance policy.
- Upgrading Airflow: As you continue to develop within an Astro project, you'll need to upgrade your Astro Runtime version to take advantage of new Astro and Apache Airflow features and fixes. Your Astro Runtime version is defined in the Dockerfile of your Astro project. Unlike with open source Airflow, upgrading Astro Runtime does not require you to manually migrate your metadata database. To upgrade your version of Airflow, you only have to change the Astro Runtime version listed in your project's Dockerfile and rebuild your project. See Upgrade Astro Runtime for instructions on how to upgrade.
- Runtime arguments: Your Dockerfile is also where you can define additional runtime arguments that trigger whenever your project builds. You can use these arguments to mount resources, such as API tokens, to your Airflow environment without including the specific resources in your project files. See Customize your Dockerfile for more details.
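For example, a project's Dockerfile pins the Runtime image by tag, so upgrading Airflow is a one-line change (the tag below is illustrative):

```dockerfile
# Astro Runtime 11.3.0 packages Apache Airflow 2.9.1 (per the mapping above).
# To upgrade, bump this tag and rebuild the project.
FROM quay.io/astronomer/astro-runtime:11.3.0
```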
Testing environments
There are two ways to run DAGs within the Astro ecosystem: either in a local Airflow environment or on a hosted Astro Deployment. Each has its own purpose in the development lifecycle, and the CLI sketch after this list shows how they fit together.
- Local Airflow environment: You can run DAGs on your local machine using the Astro CLI. This development method is most useful if you need to quickly iterate and test changes, such as when fixing a bug, or you're just getting started with Airflow. Testing locally is free and open source.
- Deployment: When you deploy your DAGs to Astro, they run on a managed Deployment. Use Deployments to run production code, or create a development Deployment to test changes over a longer period of time than you could in a local Airflow environment. To test code on a Deployment, you must have an Astro account and an administrator on your team must grant you access to the Deployment. For more information about how to structure Deployments for specific development workflows, see Connections and branch-based deploys.
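As a rough sketch, assuming the Astro CLI is installed and you have deploy access, a typical flow between the two environments looks like this:

```sh
astro dev init     # scaffold a new Astro project
astro dev start    # run Airflow locally in Docker for quick iteration
astro dev stop     # tear down the local environment
astro deploy       # push the project to a hosted Astro Deployment
```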
Unit tests
Whether you're building your project locally or deploying to Astro, you can run unit tests with the Astro CLI to ensure that your code meets basic standards before you run your DAGs. The Astro CLI includes a default set of unit tests which you can run alongside your own tests in a single sequence. See Test your DAGs for more information.
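Custom tests live in the `tests` directory of your Astro project and run with `astro dev pytest`. As a minimal sketch (file path illustrative), a common DAG integrity test checks that every DAG in the project imports cleanly:

```python
# tests/dags/test_dag_integrity.py (illustrative path)
from airflow.models import DagBag

def test_no_import_errors():
    # Parse every DAG in the project and fail on any import error.
    dag_bag = DagBag(include_examples=False)
    assert not dag_bag.import_errors, f"DAG import errors: {dag_bag.import_errors}"
```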
Airflow feature enhancements
Astro includes several features that enhance open source Apache Airflow functionality.
- On Astro, the Astro UI renders Airflow tags defined in your DAGs. Use tags to filter DAGs across all Deployments from a single screen (see the tagging sketch after this list).
- The Astro Environment Manager allows you to create and manage Airflow connections directly from the Astro UI. Instead of being limited to defining connections in the Airflow UI or with a secrets manager, you can create connections from the Environment Manager on Astro and use the connections in your local Airflow environment or across multiple Deployments and Workspaces.
- Astro has built-in infrastructure to run the KubernetesPodOperator and Kubernetes executor, such as default Pod limits and requests. Task-level resource limits and requests are set by default on Astro Deployments, which means that tasks running in Kubernetes Pods never request more resources than expected. See Run the Kubernetes executor and Run the KubernetesPodOperator for more specific instructions and examples.
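For example, tags are standard DAG parameters, so no Astro-specific code is required. A minimal sketch (names illustrative):

```python
from pendulum import datetime
from airflow.decorators import dag, task

# The "etl" and "finance" tags become filters in both the Airflow UI and the Astro UI.
@dag(start_date=datetime(2024, 1, 1), schedule=None, tags=["etl", "finance"])
def tagged_dag():
    @task
    def extract():
        return {"rows": 42}

    extract()

tagged_dag()
```

Similarly, while Astro sets sensible Pod defaults for you, you can still declare explicit requests and limits on a task. A sketch assuming the `cncf.kubernetes` provider is installed (image and values illustrative):

```python
from airflow.providers.cncf.kubernetes.operators.pod import KubernetesPodOperator
from kubernetes.client import models as k8s

transform = KubernetesPodOperator(
    task_id="transform",
    image="python:3.11-slim",
    cmds=["python", "-c", "print('transforming')"],
    # Explicit requests/limits override the Deployment defaults for this task only.
    container_resources=k8s.V1ResourceRequirements(
        requests={"cpu": "250m", "memory": "512Mi"},
        limits={"cpu": "500m", "memory": "1Gi"},
    ),
)
```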
DAG observability
In local Airflow environments, you can use the Airflow UI to check your DAG runs, task logs, and component logs just as you would in any other Airflow environment.
In the Astro UI, you have access to the DAGs page in addition to the Airflow UI. From here, you can manage DAG and task runs for any Deployment in your Workspace.
Astronomer recommends using the UI that best fits the needs of your team. If you prefer managing your DAGs from a single place and find the design of the Astro DAGs page helpful, you don't need to use the Airflow UI. If you're a longtime user of Apache Airflow, you might feel more comfortable in the Airflow UI and don't need to use the Astro UI.
See View logs and Manage DAG runs for more information.
Airflow alerts
Astro supports a set of alerting features that in many cases can replace Apache Airflow SLAs and failure notifications. In some circumstances, Astronomer recommends configuring Astro alerts instead of Airflow SLAs or failure notifications because doing so can simplify your DAG code and make it easier to manage alerts across multiple DAGs. See When to use Airflow or Astro alerts for your pipelines on Astro.
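For context, the in-DAG pattern that Astro alerts can replace looks something like the following sketch (callback logic illustrative). Moving these concerns to Astro alerts removes them from the project code entirely:

```python
from datetime import timedelta
from pendulum import datetime
from airflow.decorators import dag, task

def notify_failure(context):
    # In open source Airflow, notification logic like this ships with the project.
    print(f"Task failed: {context['task_instance'].task_id}")

@dag(
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    default_args={
        "sla": timedelta(hours=1),              # Airflow SLA
        "on_failure_callback": notify_failure,  # Airflow failure notification
    },
)
def monitored_pipeline():
    @task
    def load():
        print("loading")

    load()

monitored_pipeline()
```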
Astronomer Cosmos
In addition to its commercial products, Astronomer maintains Cosmos, an open source tool for orchestrating dbt Core projects from a DAG. Cosmos gives you more visibility into every step of your dbt project and lets you use Airflow's data awareness features with your dbt models. See the Cosmos documentation for more information.
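As a rough sketch of what a Cosmos DAG looks like (project path and profile values are illustrative; check the Cosmos documentation for current signatures):

```python
from pendulum import datetime
from cosmos import DbtDag, ProjectConfig, ProfileConfig

# Renders each model in the dbt project as its own Airflow task.
dbt_dag = DbtDag(
    dag_id="jaffle_shop",
    project_config=ProjectConfig("/usr/local/airflow/dags/dbt/jaffle_shop"),
    profile_config=ProfileConfig(
        profile_name="jaffle_shop",
        target_name="dev",
        profiles_yml_filepath="/usr/local/airflow/dags/dbt/jaffle_shop/profiles.yml",
    ),
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
)
```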