Astronomer Webinars

Join us for upcoming online events!

Streamlining Data Pipelines with Sophi.io: An Airflow Journey

Sophi.io, a leading AI-powered content optimization platform, faced scaling challenges as its Apache Airflow data pipelines grew. Learn how Sophi dramatically improved pipeline scalability and reliability after migrating from Amazon Managed Workflows for Apache Airflow (MWAA) to Astronomer.

Register Now

Past Webinars

The New DAG Schedule Parameter

In this episode of Live with Astronomer, we discuss the consolidated `schedule` parameter introduced in Airflow 2.4. We provide a quick refresher on scheduling concepts and show how scheduling DAGs is easier and more powerful in newer versions of Airflow.

Continue Reading

Dynamic Tasks in Airflow

With the releases of Airflow 2.3 and 2.4, users can write DAGs that dynamically generate parallel tasks at runtime. In this webinar, we’ll cover everything you need to know to implement dynamic tasks in your DAGs.

Continue Reading

Data Driven Scheduling

In this session, Live with Astronomer explores the new datasets feature introduced in Airflow 2.4. We’ll show how DAGs that access the same data now have explicit, visible relationships, and how DAGs can be scheduled based on updates to these datasets.

Continue Reading

Data Transformations with the Astro Python SDK

On September 13, Live with Astronomer dived into implementing data transformations with the Astro Python SDK, an open source Python package that allows for clean and rapid development of ELT workflows. We show how you can use the `transform` and `dataframe` functions to easily transform your data using Python or SQL and seamlessly transition between the two.

Continue Reading

The Astro Python SDK Load File Function

This episode of Live with Astronomer dives into the Astro Python SDK `load_file` function. The Astro Python SDK is an open source Python package that allows for clean and rapid development of ELT workflows. We show how you can use `load_file` for the ‘Extract’ step of your pipeline to easily get data from your filesystems into your data warehouse, without any operator-specific knowledge.

Continue Reading