ETL is one of the most common data engineering use cases, and it's one where Airflow really shines.
In this webinar, we cover everything you need to get started as a new Airflow user, and dive into how to implement ETL pipelines as Airflow DAGs using Snowflake.
If you’re new to Apache Airflow, check out our articles covering all of the core concepts and components.
You can also watch our previous Intro to Airflow Webinar.
Having your data pipelines in code gives you great flexibility. By using provider packages and external Airflow services, you can implement a complex pipeline in Airflow (as shown in the example below) without writing a lot of code.
In today’s webinar, we’re focusing on the Snowflake provider.
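To give a feel for what this looks like in practice, here is a minimal sketch of a DAG that runs SQL against Snowflake using the `SnowflakeOperator` from the `apache-airflow-providers-snowflake` package. The connection ID, table name, stage, and SQL statements below are all placeholder assumptions, not part of the webinar's example:

```python
from datetime import datetime

from airflow import DAG
# Requires the apache-airflow-providers-snowflake package to be installed
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

# "snowflake_default" is an assumed Airflow connection ID; the table, stage,
# and SQL below are placeholders standing in for your own ETL logic.
with DAG(
    dag_id="snowflake_etl_example",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    create_table = SnowflakeOperator(
        task_id="create_table",
        snowflake_conn_id="snowflake_default",
        sql="CREATE TABLE IF NOT EXISTS raw_orders (id INT, amount NUMBER);",
    )

    load_data = SnowflakeOperator(
        task_id="load_data",
        snowflake_conn_id="snowflake_default",
        sql="COPY INTO raw_orders FROM @my_stage/orders.csv;",
    )

    # Dependencies are declared in code, so the load only runs
    # after the table exists.
    create_table >> load_data
```

Because the pipeline is just Python, you can parameterize the SQL, generate tasks in a loop, or swap the connection ID per environment, which is where the "pipelines as code" flexibility mentioned above pays off.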