Run, monitor, and manage Airflow on Azure with Astronomer



Trusted by: Sonos, Electronic Arts, Sweetgreen, Condé Nast, StockX, Credit Suisse, Rappi

Deploy Apache Airflow on Azure Kubernetes Service (AKS)

Astronomer allows you to easily run, monitor, and scale Apache Airflow deployments on Azure. Whether you’re looking to move your existing ETL and ML pipelines to Airflow or just want a more stable AKS cluster to run your data pipelines on, we’ve got you covered.

  1. Seamless developer experience: Astronomer makes it easy to spin up, configure, and deploy code to isolated Airflow environments on your AKS cluster.
  2. DevOps-friendly experience: Astronomer includes a full Prometheus and Grafana monitoring stack, user permission controls, and a flexible Elasticsearch/Fluentd/Kibana logging stack for full-text log search.
  3. Enterprise-grade support: direct access to the leading Apache Airflow experts.

Why run Apache Airflow with Azure?

We believe that anything connected to data should be developer-led. Apache Airflow is a battle-tested solution for data orchestration used by thousands of companies around the world. Thanks to the vibrant open-source community surrounding the project, you can benefit from their support and experience instead of reinventing the wheel.

If your company manages multiple pipelines across multiple teams, Apache Airflow can help.

Using ADF with Airflow?

If you’re using Azure Data Factory (ADF) for data orchestration, you can still benefit from Apache Airflow by combining the two tools, since ADF on its own comes with certain disadvantages.

With Apache Airflow you can broaden your Azure experience. ADF jobs can be run from an Airflow DAG, giving you the full capabilities of Airflow orchestration beyond what the Azure service offers alone. This lets users who are comfortable with ADF keep writing their jobs there, while Airflow acts as the control plane for orchestration.

If you want to learn more about using Airflow with ADF, check out our webinar!

Apache Airflow

Apache Airflow is the de facto standard for expressing data flows as code, with a robust and growing community of data engineers, scientists, and analysts across the world.

Learn more about Airflow