ETL and ELT are among the most common data engineering use cases and are considered the bread and butter of Airflow. Because Airflow is 100% code, knowing the basics of Python is all it takes to get started writing pipelines that provide the data your team needs for any downstream application. However, running ETL and ELT pipelines in production often comes with challenges such as scaling, connectivity to other systems, and dynamically adapting to changing data sources. Fortunately, Airflow has all of this covered, and we’re here to help you learn how to make Airflow work best for this use case.
In this webinar, we’ll cover DAG-writing best practices for ETL and ELT pipelines, including:
- How to approach building ETL and ELT pipelines from scratch.
- Common DAG patterns for ETL and ELT use cases (see the sketch after this list).
- Key Airflow features to create reliable ETL and ELT pipelines.
- How to make your DAGs dynamic in an efficient and scalable way.
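To make the classic pattern concrete, here is a minimal sketch of an extract → transform → load DAG written with Airflow's TaskFlow API. The DAG name, sample data, and task bodies are placeholders for illustration only, not material from the webinar, and it assumes Airflow 2.4 or later.

```python
# A minimal ETL DAG sketch using Airflow's TaskFlow API (Airflow 2.4+).
# The data and task bodies below are placeholders, not production logic.
from datetime import datetime

from airflow.decorators import dag, task


@dag(
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
    tags=["etl-example"],
)
def simple_etl():
    @task
    def extract() -> list[dict]:
        # In a real pipeline, this would pull from an API, database, or file store.
        return [{"order_id": 1, "amount": 42.0}, {"order_id": 2, "amount": 13.5}]

    @task
    def transform(records: list[dict]) -> list[dict]:
        # Apply lightweight, in-memory transformations to each record.
        return [{**r, "amount_with_tax": round(r["amount"] * 1.2, 2)} for r in records]

    @task
    def load(records: list[dict]) -> None:
        # In a real pipeline, this would write to a warehouse via a provider hook.
        for r in records:
            print(f"Loading record: {r}")

    # Chain the tasks: extract feeds transform, which feeds load.
    load(transform(extract()))


simple_etl()
```

In practice, each stage would typically be swapped for a provider operator or hook (for example, to a database or cloud warehouse), while the overall extract → transform → load structure stays the same.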