Set environment variables on Astro Private Cloud

You can use environment variables on Astronomer to set both Airflow configurations and custom values, which Astro then applies to your Airflow Deployment either locally or in your Astro environment.

Environment variables can be used to set any of the following:

  • SMTP credentials to enable email alerts
  • Airflow parallelism and dag concurrency
  • A secrets backend to manage your Airflow connections and variables
  • Airflow connections and variables stored as environment variables
  • Your default dag view in the Airflow UI (Tree, Graph, Gantt, and so on)

This guide covers the following:

  • How to set environment variables on Astronomer
  • How environment variables are stored on Astronomer
  • How to store Airflow connections and variables as environment variables

Set environment variables on Astronomer

On Astronomer, there are 3 ways to set environment variables:

  • via your .env file (Local Only)
  • via your Dockerfile
  • via the Astro Private Cloud UI

The following sections explain how to configure each.

While setting environment variables on Astronomer is equivalent to updating your airflow.cfg, you can't bring your own airflow.cfg file to Astronomer and configure it directly.

Using .env (Local Only)

You can use the Astro CLI to import environment variables from the .env file that astro dev init automatically generates when you initialize an Astro project.

To add environment variables locally:

  1. Find your .env file in your Astro project directory
  2. Add your environment variables of choice to that .env file
  3. Rebuild your image to apply those changes by running astro dev start --env .env

In your .env file, insert the environment variable key and value, with the key in all caps. For example:

AIRFLOW__CORE__DAG_CONCURRENCY=5

If your environment variables contain secrets you don't want to expose in plain text, add your .env file to .gitignore before you push your changes to your version control tool.
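
For example, you can append it from your project root:

echo ".env" >> .gitignore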

Confirm your environment variables were applied

By default, Airflow environment variables are hidden in the Airflow UI for local environments. To expose them and confirm your configuration in the Airflow UI, set AIRFLOW__WEBSERVER__EXPOSE_CONFIG=True in either your Dockerfile or your .env file (local only).
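
For example, in your .env file (local only):

AIRFLOW__WEBSERVER__EXPOSE_CONFIG=True

Or in your Dockerfile:

ENV AIRFLOW__WEBSERVER__EXPOSE_CONFIG=True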

Alternatively, you can run:

docker ps

This lists the three Docker containers that run Airflow's primary components on your machine: the Airflow scheduler, the API server, and the Postgres metadata database.
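
To find the scheduler container's name for the next step, you can filter the output. For example (the exact name depends on your project directory):

docker ps --filter "name=scheduler" --format "{{.Names}}"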

Now, create a Bash session in your scheduler container by running:

docker exec -it <scheduler-container-name> /bin/bash

If you then run ls -1, it returns a list of the files in your Astro project directory:

bash-5.0$ ls -1
Dockerfile
airflow.cfg
airflow_settings.yaml
dags
include
logs
packages.txt
plugins
requirements.txt
unittests.cfg

Now, run:

env

This prints all environment variables set in your local environment, some of which are set by you and some of which are set by Astronomer by default.
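
To check a single variable instead of scanning the full list, you can filter the output. For example:

env | grep AIRFLOW__CORE__DAG_CONCURRENCY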

You can also run cat airflow.cfg to output the contents of that file.

Multiple .env files

The Astro CLI looks for a file named .env by default. To use multiple environment files, make .env a top-level directory in your project and create sub-files within it.

Your project might look like the following:

my_project
├── Dockerfile
├── dags
│   └── my_dag
├── plugins
│   └── my_plugin
├── airflow_settings.yaml
└── .env
    ├── dev.env
    └── prod.env
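
You can then point the Astro CLI at a specific file when you start Airflow locally. For example, assuming the layout above and a CLI version that accepts a file path for the --env flag:

astro dev start --env .env/dev.env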

Using your Dockerfile

If you work on an Astro project locally, but intend to deploy to Astronomer and want to commit your environment variables to your source control tool, you can set environment variables in your Dockerfile. This file was automatically created when you first initialized your Astro project with astro dev init.

Because you commit the Dockerfile upstream, Astronomer strongly recommends keeping environment variables that contain sensitive credentials out of it. Instead, insert them via your .env file locally and add that .env file to your .gitignore, or set them as secret via the Astro Private Cloud UI, as described in Using the Astro Private Cloud UI.

To add environment variables, insert the key and value in your Dockerfile as an ENV statement, with the key in all caps. With your Airflow image referenced in a FROM statement at the top, your Dockerfile might look like this:

FROM quay.io/astronomer/astro-runtime:7.1.0
ENV AIRFLOW__CORE__MAX_ACTIVE_RUNS_PER_DAG=1
ENV AIRFLOW__CORE__DAG_CONCURRENCY=5
ENV AIRFLOW__CORE__PARALLELISM=25

After you add your environment variables:

  • Run astro dev stop and astro dev start to rebuild your image and apply your changes locally, or
  • Run astro deploy to apply your changes to your running Airflow Deployment on Astronomer

Environment variables set in your Dockerfile are injected at build time and can be referenced by any other process that runs during the docker build step triggered by astro deploy or astro dev start.
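
For example, here is a minimal sketch with a hypothetical variable name, showing that a later build step can read the value:

FROM quay.io/astronomer/astro-runtime:7.1.0
ENV MY_BUILD_SETTING=enabled
# Any subsequent build instruction can read the value
RUN echo "MY_BUILD_SETTING is $MY_BUILD_SETTING"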

Environment variables applied via the Astro Private Cloud UI only become available after the docker build process has been completed.

Using the Astro Private Cloud UI

You can also add environment variables via the Astro Private Cloud UI. For environment variables that you only need on Astronomer and not locally, Astronomer recommends using this method.

  1. Navigate to the Astro Private Cloud UI
  2. Go to Deployment > Variables
  3. Add your environment variables.

Inputs for all configurations officially supported by Airflow are pre-templated, but you can also specify your own values.

Mark environment variables as secret

On Astronomer, you can mark any environment variable as Secret in the UI. If your environment variables contain potentially sensitive information, such as an SMTP password or S3 bucket credentials, Astronomer recommends using this feature.

In the Variables Tab:

  1. Enter a Key
  2. Enter a Value
  3. Check Secret?
  4. Click Add
  5. Select Deploy Changes

After Astro deploys the changes, environment variables marked as Secret are not available in plain text to any user in the Workspace.

A few additional notes:

  • Workspace Editors and Admins can mark an existing non-secret environment variable as secret at any time
  • To convert a secret environment variable to a non-secret one, Astro prompts you to enter a new value
  • If you export environment variables as JSON, secret values are not rendered in plain text
  • You cannot add a new variable that has the same key as an existing variable

For more detail about how environment variables are stored and encrypted on Astronomer, see How Astronomer stores environment variables below.

Workspace roles and permissions apply to actions in the Variables tab. For a full breakdown of permissions for each role, reference Astronomer’s Roles and Permissions.

Precedence between methods

Because you can set environment variables using multiple methods, potentially simultaneously, Astro applies an order of precedence.

Astronomer applies and overrides environment variables in the following order:

  1. Astro Private Cloud UI
  2. .env (Local Only)
  3. Dockerfile
  4. Default Airflow Values (airflow.cfg)

If you set AIRFLOW__CORE__PARALLELISM with one value via the Astro Private Cloud UI and set the same environment variable with another value in your Dockerfile, the value set in the Astro Private Cloud UI takes precedence.
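
To illustrate, suppose your Dockerfile contains:

ENV AIRFLOW__CORE__PARALLELISM=25

and the Astro Private Cloud UI (Deployment > Variables) sets AIRFLOW__CORE__PARALLELISM=50 for the same Deployment (both values here are hypothetical). The Deployment then runs with a parallelism of 50, because the UI value overrides the Dockerfile value.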

How Astronomer stores environment variables

Astro Private Cloud stores all values for environment variables that you add via the UI in a Kubernetes Secret, which is encrypted at rest and mounted to your Deployment's Airflow pods (scheduler, API server or webserver, and workers) as soon as they're set or changed.

Environment variables are stored neither in Airflow's metadata database nor in Astronomer's platform database. To render them in the Astro Private Cloud UI, the Astronomer Houston API fetches environment variables from the Kubernetes Secret rather than from the platform database.
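
If you have kubectl access to the cluster, a hypothetical way to confirm the Secret exists without exposing its values (the namespace and Secret name depend on your Deployment's release name):

kubectl get secrets --namespace <your-deployment-namespace>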

Adding Airflow connections and variables as environment variables

If you regularly leverage Airflow connections and variables, Astronomer recommends storing and fetching them using environment variables.

Airflow connections and variables are stored in Airflow's metadata database. Fetching them outside of task definitions and operators opens an additional connection to Airflow's Postgres database every time the scheduler parses a dag (as defined by processor_poll_interval, which is set to 1 second by default). By adding connections and variables as environment variables, you can reference them more easily in your code and reduce the number of open database connections, preventing strain on your database and resources.

Airflow connections

The Environment Variable naming convention for Airflow connections is:

ENV AIRFLOW_CONN_<CONN_ID>=<connection-uri>

Take the following Airflow connection as an example:

  • Connection ID: MY_PROD_DB
  • Connection URI: my-conn-type://login:password@host:5432/schema

The full environment variable reads:

ENV AIRFLOW_CONN_MY_PROD_DB=my-conn-type://login:password@host:5432/schema

You can set this environment variable via an .env file locally, via your Dockerfile, or via the Astro Private Cloud UI. For more information on how to generate your Connection URI, refer to Airflow’s documentation.
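
Because connections set via environment variables aren't stored in the metadata database, they don't appear in the Airflow UI's connection list. As a hypothetical check from inside your local scheduler container (assuming the Airflow CLI is available in your image, as it is in Astro Runtime, and your Airflow version includes the airflow connections get command):

env | grep AIRFLOW_CONN_MY_PROD_DB
airflow connections get MY_PROD_DB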

Airflow variables

The environment variable naming convention for Airflow variables is:

ENV AIRFLOW_VAR_<VAR_NAME>=Value

For example, given the following Airflow variable:

  • Variable Name: My_Var
  • Value: 2

This environment variable reads:

ENV AIRFLOW_VAR_MY_VAR=2
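
As with connections, Airflow variables set via environment variables aren't listed in the Airflow UI. A hypothetical check from inside your local scheduler container (again assuming the Airflow CLI is available in your image):

env | grep AIRFLOW_VAR_MY_VAR
airflow variables get MY_VAR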