What is Airflow? Apache Airflow is a platform used to programmatically author, schedule, and monitor data pipelines.
Born at Airbnb, open-sourced, and now a Top-Level Apache Software Foundation project, Airflow is used by thousands of companies around the globe as a tool-agnostic control plane that translates business rules into the data pipelines powering their organizations.
In this webinar we’ll cover:
- The benefits of pipelines as code, and how to define your pipelines using Python.
- The core components of Airflow, and how they can be leveraged to scale Airflow to meet your workloads.
- Core concepts in Airflow, including DAGs, tasks, and operators.
- The vast network of provider packages and how they make Airflow highly extensible.
- Ways to engage with and benefit from Airflow’s large, vibrant OSS community.
Kenten Danas - Lead Developer Advocate at Astronomer
Kenten is a Lead Developer Advocate at Astronomer, with a background in field engineering, data engineering, and consulting. She has first-hand experience adopting and running Airflow as a consultant, and is passionate about helping other data engineers scale and get the most out of their Airflow experience.
Viraj Parekh - Field CTO
Viraj is Field CTO at Astronomer, where he works to help ensure that Airflow and Astronomer have a great story within the greater data ecosystem. He’s based in Brooklyn and is always eager to talk about anything related to Knicks basketball.
Astronomer Webinars are biweekly, real-time online sessions for data pipeline authors hosted by Astronomer’s Apache Airflow experts. During an hour-long meeting, participants have a chance to dive into the most important features and practices related to Apache Airflow and data orchestration — from Airflow 2+ feature highlights to DAG writing best practices. At the end of each webinar, we open the floor for a Q&A to ensure that participants leave the event confident about their newly acquired knowledge.