WEBINAR

Best practices for writing ETL and ELT pipelines

Recorded on October 17, 2024

  • Kenten Danas
  • Tamara Fingerlin

ETL and ELT are among the most common data engineering use cases and are considered the bread and butter of Airflow. Because Airflow is 100% code, knowing the basics of Python is all it takes to get started writing pipelines that provide the data your team needs for any downstream application. However, running ETL and ELT pipelines in production often comes with challenges such as scaling, connectivity to other systems, and dynamically adapting to changing data sources. Fortunately, Airflow has all of this covered, and we're here to help you learn how to make Airflow work best for this use case.

In this webinar, we cover DAG writing best practices applicable to ETL and ELT pipelines. You can find the code shown in the demo here.
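As a rough illustration of what such a pipeline can look like, below is a minimal sketch of an ETL-style DAG written with Airflow's TaskFlow API. The DAG ID, schedule, and sample data are placeholders chosen for this example, not the code from the webinar demo.

from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 10, 1), catchup=False, tags=["example"])
def example_etl():
    @task
    def extract():
        # Stand-in for pulling data from an API, file, or source database.
        return [{"id": 1, "value": 10}, {"id": 2, "value": 20}]

    @task
    def transform(records):
        # Stand-in transformation step: double each value.
        return [{**record, "value": record["value"] * 2} for record in records]

    @task
    def load(records):
        # Stand-in for writing results to a warehouse or database table.
        print(f"Loading {len(records)} records")

    # Chain the tasks: extract -> transform -> load.
    load(transform(extract()))


example_etl()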

See More Resources

How to Orchestrate Machine Learning Workflows with Airflow

Anatomy of an Operator

2024 Apache Airflow®: Trends and Insights

4 Things To Consider When Deciding How to Run Airflow

Try Astro for Free for 14 Days

Sign up with your business email and get up to $500 in free credits.

Get Started
