Using Airflow with Databricks is a common pattern across the data ecosystem. With the recent release of Databricks Jobs, Databricks’ own workflow engine, the need to combine the two tools has only grown. New open-source frameworks have emerged to help users pair Airflow’s flexibility with Databricks’ newer, less expensive Jobs clusters.
In this “Live with Astronomer” session, we’ll dive into Cosmos, an open-source framework developed by Astronomer for dynamically generating Airflow DAGs from other tools. We’ll show how you can use the Cosmos Databricks provider to monitor your Databricks Jobs from Airflow, write tasks that run Databricks Jobs using the Airflow DAG API you already know, and even send repair requests to Databricks when tasks fail.
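To give a flavor of what this looks like in practice, here is a minimal DAG sketch using the Cosmos Databricks provider. The module paths, class names, and parameters below reflect the provider’s early releases and should be treated as assumptions (the notebook paths, cluster key, and connection ID are hypothetical) — check the current documentation before using this:

```python
# Sketch of an Airflow DAG using the Cosmos Databricks provider
# (astronomer-cosmos). Names below are assumptions; verify against
# the provider's current docs.
from pendulum import datetime

from airflow.decorators import dag
from cosmos.providers.databricks.workflow import DatabricksWorkflowTaskGroup
from cosmos.providers.databricks.notebook import DatabricksNotebookOperator


@dag(start_date=datetime(2023, 1, 1), schedule=None, catchup=False)
def databricks_workflow_example():
    # The task group collects its tasks into a single Databricks Job
    # running on a shared Jobs cluster, instead of one Job per task.
    with DatabricksWorkflowTaskGroup(
        group_id="example_workflow",
        databricks_conn_id="databricks_default",  # hypothetical Airflow connection
        job_clusters=[...],  # Jobs cluster spec, per the Databricks Jobs API
    ):
        ingest = DatabricksNotebookOperator(
            task_id="ingest",
            databricks_conn_id="databricks_default",
            notebook_path="/Shared/ingest",  # hypothetical notebook
            source="WORKSPACE",
        )
        transform = DatabricksNotebookOperator(
            task_id="transform",
            databricks_conn_id="databricks_default",
            notebook_path="/Shared/transform",  # hypothetical notebook
            source="WORKSPACE",
        )
        # Ordinary Airflow dependency syntax defines the task order
        # inside the generated Databricks Job.
        ingest >> transform


databricks_workflow_example()
```

Because the tasks are ordinary Airflow operators inside a task group, monitoring and retrying them happens through the Airflow UI you already use; the provider translates task-level retries into repair requests against the underlying Databricks Job run. This sketch is a DAG configuration fragment and requires a live Airflow deployment with a Databricks connection to run.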