
Getting started with Airflow, the de facto standard for data orchestration, has never been easier. With a combination of new Airflow features, open-source community projects, and Astro products, you can create complex data pipelines that leverage Airflow’s best-in-class orchestration while writing less code, even without extensive Airflow knowledge.
In this workshop, we’ll take you from running Airflow, to writing your first DAG, to implementing ELT use cases with simple and complex datasets. In just a couple of hours, you’ll learn:
- The easiest ways to run Airflow, either locally using the open-source Astro CLI or using GitHub Codespaces for a zero-installation option.
- How to express complex, dynamic use cases in your DAGs using new Airflow features.
- How to write an ELT DAG — with no boilerplate code — using the Astro Python SDK, a new open-source SDK designed for rapid development of ETL/ELT workflows.
- How Astronomer is making DAG writing easier, including with the Astro Cloud IDE, a notebook-inspired way to write data pipelines.