WEBINAR

How to pass data between your Airflow tasks

Recorded on April 11, 2023

  • Tamara Fingerlin
  • Kenten Danas

This webinar provides a deep dive into passing data between your Airflow tasks. We cover everything you need to know to choose and implement a method for passing data based on your use case, infrastructure, scale, and more. Questions covered in this session include:

  • What are some best practices to follow when using XComs, Airflow’s built-in cross-communication utility?
  • How can I pass data between tasks and DAGs using TaskFlow and traditional operators? (See the first sketch after this list.)
  • How does the Astro Python SDK use XComs to move data between relational stores and Python data structures, and how can it simplify my pipelines?
  • How can I set up a custom XCom backend and implement custom serialization methods? (See the second sketch after this list.)
  • How can I pass data between tasks that run in isolated environments, such as the KubernetesPodOperator? (See the third sketch after this list.)
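As a taste of the TaskFlow approach, here is a minimal sketch of two tasks passing a value through XCom simply by returning it from one task and accepting it as an argument in the next. It assumes Airflow 2.4 or later (for the schedule parameter); the DAG and task names are illustrative only.

    from pendulum import datetime

    from airflow.decorators import dag, task


    @dag(start_date=datetime(2023, 1, 1), schedule=None, catchup=False)
    def taskflow_xcom_example():
        @task
        def extract():
            # The return value is pushed to XCom automatically.
            return {"record_count": 42}

        @task
        def report(stats: dict):
            # TaskFlow pulls the upstream return value from XCom
            # and passes it in as a regular Python argument.
            print(f"Processed {stats['record_count']} records.")

        report(extract())


    taskflow_xcom_example()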
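For the custom XCom backend question, one common pattern is to subclass BaseXCom and override serialize_value and deserialize_value so that large values live in object storage and only a reference string is kept in the Airflow metadata database. The sketch below is an assumption-laden example: it uses boto3 and a hypothetical bucket named my-xcom-bucket, and the exact serialize_value signature varies slightly across Airflow 2.x versions. To enable a backend like this, point the xcom_backend setting (AIRFLOW__CORE__XCOM_BACKEND) at the class.

    import json
    import uuid

    from airflow.models.xcom import BaseXCom

    BUCKET = "my-xcom-bucket"  # assumption: replace with your own bucket


    class S3XComBackend(BaseXCom):
        """Store XCom values as JSON objects in S3 instead of the metadata DB."""

        PREFIX = "xcom_s3://"

        @staticmethod
        def serialize_value(value, **kwargs):
            import boto3

            key = f"xcoms/{uuid.uuid4()}.json"
            boto3.client("s3").put_object(
                Bucket=BUCKET, Key=key, Body=json.dumps(value)
            )
            # Only the reference string is stored in the metadata database.
            return BaseXCom.serialize_value(S3XComBackend.PREFIX + key)

        @staticmethod
        def deserialize_value(result):
            import boto3

            value = BaseXCom.deserialize_value(result)
            if isinstance(value, str) and value.startswith(S3XComBackend.PREFIX):
                key = value.replace(S3XComBackend.PREFIX, "", 1)
                obj = boto3.client("s3").get_object(Bucket=BUCKET, Key=key)
                return json.loads(obj["Body"].read())
            return value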
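For tasks running in isolated environments, the KubernetesPodOperator can still push an XCom: with do_xcom_push=True, a sidecar container collects whatever the pod writes to /airflow/xcom/return.json. A minimal sketch, assuming the CNCF Kubernetes provider is installed (the import path differs slightly between provider versions) and that the task is defined inside a DAG:

    from airflow.providers.cncf.kubernetes.operators.kubernetes_pod import (
        KubernetesPodOperator,
    )

    write_stats = KubernetesPodOperator(
        task_id="write_stats",
        name="write-stats",
        image="python:3.11-slim",
        cmds=["bash", "-cx"],
        arguments=[
            "mkdir -p /airflow/xcom && "
            "echo '{\"record_count\": 42}' > /airflow/xcom/return.json"
        ],
        # The sidecar reads /airflow/xcom/return.json and pushes its
        # contents as this task's XCom return value.
        do_xcom_push=True,
    )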

All code covered in this webinar can be found in this repo.
