WEBINAR

How to pass data between your Airflow tasks

Recorded on April 11, 2023

  • Tamara Fingerlin
  • Kenten Danas

This webinar provides a deep dive into passing data between your Airflow tasks. We cover everything you need to know to choose and implement a method for passing data based on your use case, infrastructure, scale, and more. Questions covered in this session include (minimal code sketches of several of these patterns follow the list):

  • What are some best practices to follow when using XComs, Airflow’s built-in cross-communication utility?
  • How can I pass data between tasks and DAGs using TaskFlow and traditional operators?
  • How does the Astro Python SDK use XComs to move data between relational stores and Python data structures, and how can it simplify my pipelines?
  • How can I set up a custom XCom backend and implement custom serialization methods?
  • How can I pass data between tasks that are run in isolated environments like the KubernetesPodOperator?
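
To make the TaskFlow and traditional-operator question concrete, here is a minimal sketch showing both styles in one DAG. The DAG id, task names, and values are illustrative placeholders rather than code from the webinar repo; with TaskFlow, returned values are pushed to XCom automatically, while traditional operators push and pull explicitly through the task instance.

```python
from pendulum import datetime

from airflow.decorators import dag, task
from airflow.operators.python import PythonOperator


@dag(start_date=datetime(2023, 4, 1), schedule=None, catchup=False)
def xcom_passing_example():
    # TaskFlow: the return value is pushed to XCom automatically and is
    # pulled in the downstream task by passing it as an argument.
    @task
    def extract():
        return {"record_count": 42}

    @task
    def report(payload: dict):
        print(f"Extracted {payload['record_count']} records")

    report(extract())

    # Traditional operators: push and pull explicitly via the task instance.
    def _push(ti):
        ti.xcom_push(key="record_count", value=42)

    def _pull(ti):
        count = ti.xcom_pull(task_ids="push_count", key="record_count")
        print(f"Pulled {count} records")

    push_count = PythonOperator(task_id="push_count", python_callable=_push)
    pull_count = PythonOperator(task_id="pull_count", python_callable=_pull)
    push_count >> pull_count


xcom_passing_example()
```

Either way the data travels through XComs, so the usual best practice applies: keep payloads small and serializable, and pass references (file paths, table names) rather than large objects.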
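
The Astro Python SDK question can be sketched roughly as follows: load_file, the transform decorator, and the dataframe decorator move data between files, SQL tables, and pandas DataFrames, with table references passed between tasks through XComs under the hood. The file path, connection id, and exact import paths below are assumptions and may vary by SDK version.

```python
import pandas as pd
from pendulum import datetime

from airflow.decorators import dag
from astro import sql as aql
from astro.files import File
from astro.table import Table


@aql.transform
def filter_orders(input_table: Table):
    # Runs as SQL in the target database; the result lands in a temporary
    # table whose reference is handed to downstream tasks via XCom.
    return "SELECT * FROM {{ input_table }} WHERE amount > 100"


@aql.dataframe
def summarize(df: pd.DataFrame):
    # The SDK loads the SQL result into pandas before calling this function.
    return df.describe()


@dag(start_date=datetime(2023, 4, 1), schedule=None, catchup=False)
def astro_sdk_example():
    raw_orders = aql.load_file(
        input_file=File(path="s3://my-bucket/orders.csv"),  # placeholder path
        output_table=Table(conn_id="postgres_default"),     # placeholder conn id
    )
    summarize(filter_orders(raw_orders))


astro_sdk_example()
```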
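
For the custom XCom backend question, here is a rough sketch of one common pattern, using assumed names: serialize large values to object storage and keep only a reference string in the metadata database. The bucket name, key prefix, and connection id are placeholders; the backend is activated by pointing the xcom_backend setting in the [core] section (or the AIRFLOW__CORE__XCOM_BACKEND environment variable) at the class.

```python
import json
import uuid

from airflow.models.xcom import BaseXCom
from airflow.providers.amazon.aws.hooks.s3 import S3Hook


class S3XComBackend(BaseXCom):
    """Store XCom values as JSON objects in S3 so that only a short
    reference string is kept in the Airflow metadata database."""

    PREFIX = "s3-xcom://"
    BUCKET = "my-xcom-bucket"  # placeholder bucket name

    @staticmethod
    def serialize_value(value, **kwargs):
        # Offload the payload to S3 and store a reference string instead.
        hook = S3Hook(aws_conn_id="aws_default")
        key = f"xcom/{uuid.uuid4()}.json"
        hook.load_string(json.dumps(value), key=key, bucket_name=S3XComBackend.BUCKET)
        return BaseXCom.serialize_value(f"{S3XComBackend.PREFIX}{key}")

    @staticmethod
    def deserialize_value(result):
        # Resolve the reference back into the original Python value.
        value = BaseXCom.deserialize_value(result)
        if isinstance(value, str) and value.startswith(S3XComBackend.PREFIX):
            hook = S3Hook(aws_conn_id="aws_default")
            key = value[len(S3XComBackend.PREFIX):]
            return json.loads(hook.read_key(key=key, bucket_name=S3XComBackend.BUCKET))
        return value
```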
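
Finally, for tasks that run in isolated environments, a minimal KubernetesPodOperator sketch: the pod writes a JSON file to /airflow/xcom/return.json, and do_xcom_push=True adds a sidecar container that reads the file back into XCom. The image, namespace, and import path are assumptions that depend on your cluster and provider version.

```python
from pendulum import datetime

from airflow import DAG
from airflow.providers.cncf.kubernetes.operators.kubernetes_pod import (
    KubernetesPodOperator,
)

with DAG(
    dag_id="kpo_xcom_example",
    start_date=datetime(2023, 4, 1),
    schedule=None,
    catchup=False,
):
    write_and_return = KubernetesPodOperator(
        task_id="write_and_return",
        name="write-and-return",
        namespace="default",       # placeholder namespace
        image="python:3.10-slim",  # placeholder image
        cmds=["python", "-c"],
        arguments=[
            "import json; json.dump({'rows': 42}, open('/airflow/xcom/return.json', 'w'))"
        ],
        do_xcom_push=True,  # launches the xcom sidecar that reads return.json
    )
```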

All code covered in this webinar can be found in this repo.

