How to Orchestrate Databricks Jobs Using Airflow

Watch Video On Demand

Hosted By

  • Kenten Danas
  • Daniel Imberman

This “Live with Astronomer” session covers how to use the Astro Databricks provider to orchestrate your Databricks Jobs from Airflow. This new provider allows you to monitor your Databricks Jobs from Airflow, write tasks that run Databricks Jobs using the DAG API you’re familiar with, and even send repair requests to Databricks when tasks fail.
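As a rough illustration of what this looks like in practice, here is a minimal DAG sketch using the Astro Databricks provider's `DatabricksWorkflowTaskGroup` and `DatabricksNotebookOperator`. The connection ID, notebook paths, and cluster spec below are placeholders, not values from the session — adapt them to your own Airflow deployment and Databricks workspace.

```python
from datetime import datetime

from airflow.decorators import dag
from astro_databricks.operators.notebook import DatabricksNotebookOperator
from astro_databricks.operators.workflow import DatabricksWorkflowTaskGroup

# Hypothetical job cluster spec; adjust the Spark version and node type
# to whatever your workspace supports.
job_cluster_spec = [
    {
        "job_cluster_key": "example_cluster",
        "new_cluster": {
            "spark_version": "12.2.x-scala2.12",
            "node_type_id": "i3.xlarge",
            "num_workers": 1,
        },
    }
]


@dag(start_date=datetime(2023, 1, 1), schedule=None, catchup=False)
def databricks_workflow_example():
    # The task group bundles its tasks into a single Databricks Workflow (job),
    # so the notebooks share one job cluster instead of launching separate jobs.
    with DatabricksWorkflowTaskGroup(
        group_id="databricks_workflow",
        databricks_conn_id="databricks_conn",  # assumed Airflow connection ID
        job_clusters=job_cluster_spec,
    ):
        ingest = DatabricksNotebookOperator(
            task_id="ingest",
            databricks_conn_id="databricks_conn",
            notebook_path="/Shared/ingest",  # placeholder notebook path
            source="WORKSPACE",
        )
        transform = DatabricksNotebookOperator(
            task_id="transform",
            databricks_conn_id="databricks_conn",
            notebook_path="/Shared/transform",  # placeholder notebook path
            source="WORKSPACE",
        )
        ingest >> transform


databricks_workflow_example()
```

Because the tasks map one-to-one onto Databricks Workflow tasks, they show up as individual tasks in the Airflow UI, which is what enables the monitoring and repair-request behavior described above. This is a configuration sketch that requires a running Airflow deployment and a configured Databricks connection to execute.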

The session also covers audience questions about working with the provider.

Learn more about the Astro Databricks provider and see example code in the official repo.
