Announcing Apache Airflow on Astronomer: Cloud Edition

Today, after just over a year of development, we are excited to announce the launch of Managed Apache Airflow on Astronomer!

The Problem with ETL

The past few years have brought a dramatic shift in the data landscape—as data gets bigger and faster, many organizations find themselves spending more time centralizing and preparing data sets for analytics than generating insights. In order to combat this, companies have invested heavily in extract, transform, load (or ETL) tools, but old-school drag-and-drop tools no longer cut it for savvy data teams who need a more flexible solution.

To satisfy this need for more customizability, Airbnb data engineer Maxime Beauchemin created and then open-sourced Airflow: a workflow management system that defines tasks and their dependencies as code, then distributes and executes them on a defined schedule. Built by developers, for developers, Airflow is based on the principle that ETL is best expressed in code.
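To make "tasks and their dependencies as code" concrete, here is a minimal sketch of an Airflow DAG definition file. It assumes Apache Airflow is installed; the DAG ID, task names, and schedule are illustrative, not anything Astronomer prescribes.

```python
# Minimal DAG definition sketch (assumes Apache Airflow is installed;
# dag_id, task names, and schedule are illustrative).
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

default_args = {
    "owner": "data-team",
    "retries": 1,
    "retry_delay": timedelta(minutes=5),
}

# The DAG object ties tasks together and carries the schedule.
dag = DAG(
    dag_id="example_etl",
    default_args=default_args,
    start_date=datetime(2018, 1, 1),
    schedule_interval="@daily",
)

extract = BashOperator(task_id="extract", bash_command="echo extract", dag=dag)
transform = BashOperator(task_id="transform", bash_command="echo transform", dag=dag)
load = BashOperator(task_id="load", bash_command="echo load", dag=dag)

# Dependencies are plain code: extract runs before transform, which runs before load.
extract >> transform >> load
```

Because the pipeline is just a Python file, it can be versioned, reviewed, and tested like any other code, which is exactly what drag-and-drop tools make hard.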

Why we created Airflow on Astronomer

While Airflow is ambitious in design and vision, it can cause some headaches, particularly with respect to implementation, devops, and maintenance. Our team at Astronomer has built a Managed Airflow product to accomplish three major goals:

1. Secure deployment and scaling

Fast, secure deployment to a managed cloud environment and seamless horizontal scaling ensure that time is spent writing data pipelines instead of dealing with infrastructure issues.

2. Provide a world-class developer experience

Astronomer caters to the needs of modern open-source developers with lightweight tools, a rich CLI and API, a locally mirrored dev environment, and no requirement to use an inflexible, confusing drag-and-drop GUI.

3. Work with data anywhere

The proliferation of SaaS silos, and the need to straddle cloud and on-premises environments, mean that data teams must be prepared to work with a variety of data in the cloud, across technical and corporate boundaries.

We’ve made getting started with Airflow dead simple:

  1. Head over and sign up for our app
  2. Pick the plan that fits your needs: Low Orbit, Interplanetary, or Deep Space
  3. Provision your instance of Airflow on Astronomer
  4. Start writing fully customizable ETL pipelines in Python
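What might one of those fully customizable pipelines look like? Below is a hypothetical sketch of the extract, transform, and load steps as plain Python functions; the function names and sample data are illustrative, and in a real deployment each step would typically be wrapped in an Airflow operator rather than called directly.

```python
# Hypothetical ETL steps as plain Python functions (names and data are
# illustrative); in Airflow each would typically back its own task.
def extract():
    # A real pipeline would pull from an API or database.
    return [{"city": "Cincinnati", "temp_f": 68}, {"city": "Reno", "temp_f": 75}]

def transform(rows):
    # Convert Fahrenheit to Celsius, rounded to one decimal place.
    return [
        {"city": r["city"], "temp_c": round((r["temp_f"] - 32) * 5 / 9, 1)}
        for r in rows
    ]

def load(rows):
    # A real pipeline would write to a warehouse; here we just return the rows.
    return rows

result = load(transform(extract()))
```

Because each step is ordinary Python, you can unit-test the logic on its own before scheduling it, something that is much harder with GUI-based ETL tools.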

And if you’re new to Airflow but want to get your team ramped up quickly, check out Spacecamp: an immersive, guided development course in all things Airflow.

Also, tune into our Airflow Podcast to hear from some of the project’s top contributors and users.

If you have any questions, feel free to ping us in the webchat or at humans@astronomer.io!

