Set up Hashicorp Vault as your secrets backend
This topic provides steps for using Hashicorp Vault as a secrets backend, both for local development and on Astro.
To do this, you will:
- Create an AppRole in Vault that grants Astro the minimum required permissions.
- Write a test Airflow variable or connection as a secret to your Vault server.
- Configure your Astro project to pull the secret from Vault.
- Test the backend in a local environment.
- Deploy your changes to Astro.
If you use a different secrets backend tool or want to learn the general approach for integrating one, see Configure a Secrets Backend.
Prerequisites
- A Deployment on Astro.
- The Astro CLI.
- A local or hosted Vault server. See Starting the Server or Create a Vault Cluster on HCP.
- An Astro project.
- The Vault CLI.
- Your Vault server's URL. If you're using a local server, this should be `http://127.0.0.1:8200/`.
- (Remote Execution only) Helm installed.
- (Remote Execution only) The `values.yaml` file from the Register Agents modal on your Deployment's Agents page.
If you do not already have a Vault server deployed but would like to test this feature, Astronomer recommends that you either:
- Sign up for a Vault trial on Hashicorp Cloud Platform (HCP) or
- Deploy a local Vault server. See Starting the server in Hashicorp documentation.
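If you just need a throwaway server for testing, one option is a dev-mode server started with the Vault CLI (in-memory and automatically unsealed, so never use it in production):

```bash
# Start a local dev-mode Vault server listening on http://127.0.0.1:8200
vault server -dev
```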
Step 1: Create a Policy and AppRole in Vault
To use Vault as a secrets backend, Astronomer recommends configuring a Vault AppRole with a policy that grants only the minimum necessary permissions for Astro. For Remote Execution Deployments, you can use any Vault authentication method you prefer, such as Kubernetes auth if your Agents and Vault run on Kubernetes.
To do this, complete the following steps. A sketch of the corresponding Vault CLI commands follows the list.

1. Create a Vault policy that Astro can use to access the Vault server.
2. Create a Vault AppRole that uses the policy.
3. Retrieve the `secret-id` for your AppRole. Save this value. You'll use it later to complete the setup.
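The exact commands depend on your Vault configuration. The following is a minimal sketch of the three steps, assuming the placeholder names `astro_policy` and `astro_role` and the `airflow` mount point that you create in Step 2:

```bash
# 1. Create a policy that grants read access to the Airflow secrets path.
#    "astro_policy" and the "airflow/*" path are placeholders; a KV v2 engine
#    may require the data/ prefix in the path (for example, "airflow/data/*").
vault policy write astro_policy - <<EOF
path "airflow/*" {
  capabilities = ["read", "list"]
}
EOF

# 2. Enable the AppRole auth method and create an AppRole bound to the policy.
vault auth enable approle
vault write auth/approle/role/astro_role \
  token_policies="astro_policy" \
  token_ttl=24h \
  token_max_ttl=24h

# 3. Retrieve the role-id and generate a secret-id for the AppRole.
vault read auth/approle/role/astro_role/role-id
vault write -f auth/approle/role/astro_role/secret-id
```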
Step 2: Create an Airflow variable or connection in Vault
To start, create an Airflow variable or connection in Vault that you want to store as a secret. It can be either a real or test value. You will use this secret to test your backend’s functionality.
You can use an existing mount point or create a new one to store your Airflow connections and variables. For example, to create a new mount point called `airflow`, run the following Vault CLI command:
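A minimal sketch, assuming you want a KV version 2 secrets engine at the `airflow` mount point:

```bash
# Enable a KV v2 secrets engine at the "airflow" mount point
vault secrets enable -path=airflow -version=2 kv
```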
To store an Airflow variable in Vault as a secret at the path `variables`, run the following Vault CLI command with your own values:
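For example, assuming the `airflow` mount point from above and a hypothetical variable key `my-var`:

```bash
# The variable key becomes the secret name; the value must use the key "value"
vault kv put airflow/variables/my-var value="my-value"
```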
To store an Airflow connection in Vault as a secret at the path `connections`, first format the connection as a URI. Then, run the following Vault CLI command with your own values:
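For example, assuming the same mount point and a hypothetical SMTP connection with the ID `smtp_default`:

```bash
# The connection ID becomes the secret name; the URI must use the key "conn_uri"
vault kv put airflow/connections/smtp_default conn_uri="smtps://user:MY_PASSWORD@relay.example.com:465"
```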
To format existing connections in URI format, see Import and export connections.
You must use the key name `value` for all Airflow variables and the key name `conn_uri` for all Airflow connections, as shown in the previous commands.

To confirm that your secret was written to Vault successfully, run:
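For example, using the hypothetical variable key from the earlier sketch:

```bash
vault kv get airflow/variables/my-var
```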
Step 3: Set up Vault locally
Astro
Remote Execution
In your Astro project, add the Hashicorp Airflow provider by adding the following line to your `requirements.txt` file:
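For example, unpinned (pin a version if your project requires it):

```text
apache-airflow-providers-hashicorp
```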
Then, add the following environment variables to your `.env` file:
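A sketch of the two variables, assuming the `airflow` mount point and the `variables` and `connections` paths from Step 2; replace the `role_id` and `secret_id` placeholders with the values you retrieved in Step 1:

```text
AIRFLOW__SECRETS__BACKEND=airflow.providers.hashicorp.secrets.vault.VaultBackend
AIRFLOW__SECRETS__BACKEND_KWARGS={"connections_path": "connections", "variables_path": "variables", "mount_point": "airflow", "url": "http://host.docker.internal:8200", "auth_type": "approle", "role_id": "<your-role-id>", "secret_id": "<your-secret-id>"}
```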
If you run Vault on Hashicorp Cloud Platform (HCP):

- Replace `http://host.docker.internal:8200` with `https://<your-cluster>.hashicorp.cloud:8200`.
- Add `"namespace": "admin"` as an argument after `url`.
This tells Airflow to look for variable and connection information at the `airflow/variables/*` and `airflow/connections/*` paths in your Vault server. You can now run a dag locally to check that your variables are accessible using `Variable.get("<your-variable-key>")`.
For more information on the Airflow provider for Hashicorp Vault and how to further customize your integration, see the Apache Airflow documentation.
Step 4: Deploy configuration
Astro
Remote Execution
1. Export your environment variables to Astro as secrets (see the sketch after this list).
2. Push your updated `requirements.txt` file to Astro.
3. (Optional) Remove the environment variables from your `.env` file, or store your `.env` file in a safe location to protect the credentials in `AIRFLOW__SECRETS__BACKEND_KWARGS`.
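The Astro CLI commands for the first two steps might look like the following sketch, assuming a placeholder Deployment ID and the variable values from your `.env` file; the `--secret` flag marks the values as secret in the Astro UI:

```bash
# Export the secrets backend environment variables to the Deployment as secrets.
# Replace <deployment-id> and the KWARGS value with your own values.
astro deployment variable create --deployment-id <deployment-id> --secret \
  AIRFLOW__SECRETS__BACKEND=airflow.providers.hashicorp.secrets.vault.VaultBackend
astro deployment variable create --deployment-id <deployment-id> --secret \
  AIRFLOW__SECRETS__BACKEND_KWARGS='<your-backend-kwargs>'

# Deploy your project, including the updated requirements.txt, to Astro.
astro deploy
```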
Now, any Airflow variable or connection that you write to your Vault server can be successfully accessed and pulled by any dag in your Deployment on Astro.