WEBINAR

How to Orchestrate Databricks Jobs Using Airflow

Recorded on March 21, 2023

  • Kenten Danas
  • Daniel Imberman

This “Live with Astronomer” session covers how to use the Astro Databricks provider to orchestrate your Databricks Jobs from Airflow. This new provider allows you to monitor your Databricks Jobs from Airflow, write tasks that run Databricks Jobs using the DAG API you’re familiar with, and even send repair requests to Databricks when tasks fail.

Questions covered in this session include:

  • How is the Astro Databricks provider different from the original Airflow Databricks provider?
  • How do I install the Astro Databricks provider?
  • How can I use the DatabricksNotebookOperator to run and monitor Databricks Jobs from Airflow? (See the sketch after this list.)
  • How does the Astro Databricks provider help me recover from failures in my Databricks Jobs?
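
As a quick illustration of the pattern the session walks through, here is a minimal sketch of a DAG that runs two Databricks notebooks as a single Databricks Workflow. It is a sketch based on the provider's public examples rather than the webinar itself: the connection ID, notebook paths, and cluster spec below are placeholders, it assumes a recent Airflow 2.x, and exact import paths and parameter names can change between provider versions, so check the official repo for the current API.

    # Minimal sketch: two Databricks notebooks run as one Databricks Workflow
    # from an Airflow DAG, using the Astro Databricks provider.
    # Install the provider first, for example: pip install astro-provider-databricks
    from datetime import datetime

    from airflow import DAG
    from astro_databricks import DatabricksNotebookOperator, DatabricksWorkflowTaskGroup

    # Job cluster used by the generated Databricks Workflow (placeholder values).
    job_clusters = [
        {
            "job_cluster_key": "example_cluster",
            "new_cluster": {
                "spark_version": "11.3.x-scala2.12",
                "node_type_id": "i3.xlarge",
                "num_workers": 1,
            },
        }
    ]

    with DAG(
        dag_id="example_databricks_workflow",
        start_date=datetime(2023, 3, 1),
        schedule=None,
        catchup=False,
    ):
        # The task group is submitted to Databricks as a single Workflow (job).
        workflow = DatabricksWorkflowTaskGroup(
            group_id="databricks_workflow",
            databricks_conn_id="databricks_default",  # placeholder Airflow connection
            job_clusters=job_clusters,
        )
        with workflow:
            # Each operator runs one notebook as a task in that job and is
            # monitored from the Airflow UI like any other Airflow task.
            notebook_1 = DatabricksNotebookOperator(
                task_id="notebook_1",
                databricks_conn_id="databricks_default",
                notebook_path="/Shared/notebook_1",  # placeholder workspace path
                source="WORKSPACE",
                job_cluster_key="example_cluster",
            )
            notebook_2 = DatabricksNotebookOperator(
                task_id="notebook_2",
                databricks_conn_id="databricks_default",
                notebook_path="/Shared/notebook_2",  # placeholder workspace path
                source="WORKSPACE",
                job_cluster_key="example_cluster",
            )
            notebook_1 >> notebook_2

Because the whole task group maps to one Databricks job, a failed notebook task can be repaired from the Airflow UI without rerunning the entire workflow, which is the failure-recovery behavior the last question above refers to.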

Learn more about the Astro Databricks Provider and see example code in the official repo.

See More Resources

Implementing data quality checks with the Astro SDK

How to monitor your pipelines with Airflow and Astro alerts

Q2 2024 Astro Release: Seamlessly Integrate dbt with Astro

Airflow at Faire: Democratizing Machine Learning at Scale

Try Astro for Free for 14 Days

Sign up with your business email and get up to $500 in free credits.

Get Started

Build, run, & observe your data workflows. All in one place.

Try Astro today and get up to $500 in free credits during your 14-day trial.