Configure AWS Parameter Store as a secrets backend on Astro Private Cloud
In this section, you’ll learn how to use AWS Systems Manager (SSM) Parameter Store as a secrets backend on Astro Private Cloud.
Prerequisites
- A Deployment.
- The Astro CLI.
- An Astro project initialized with `astro dev init`.
- Access to AWS SSM Parameter Store.
- A valid AWS Access Key ID and Secret Access Key.
Step 1: Write an Airflow variable or connection to AWS Parameter Store
To start, add an Airflow variable or connection as a secret to Parameter Store for testing. For instructions, see the AWS documentation on how to do so using the AWS Systems Manager Console, the AWS CLI, or Tools for Windows PowerShell.
Variables and connections should live at `/airflow/variables` and `/airflow/connections`, respectively. For example, if you’re setting a secret variable with the key `my_secret`, it should exist at `/airflow/variables/my_secret`.
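For example, with the AWS CLI you could create a test variable with the key `my_secret` like this (a sketch; the value is a placeholder):

```bash
# Store an Airflow variable as a SecureString under the /airflow/variables prefix
aws ssm put-parameter \
  --name "/airflow/variables/my_secret" \
  --value "my-secret-value" \
  --type "SecureString"
```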
Step 2: Set up AWS Parameter Store locally
To test AWS Parameter Store locally, configure it as a secrets backend in your Astro project.
First, install the Airflow provider for Amazon by adding the following to your project’s `requirements.txt` file:
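A minimal entry is just the provider package (pin a version if your project requires one):

```text
apache-airflow-providers-amazon
```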
Then, add the following environment variables to your project’s `Dockerfile`:
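The following is a sketch of those variables, assuming the provider’s `SystemsManagerParameterStoreBackend` class and the `/airflow` prefixes from Step 1; replace the placeholder credentials with your own values:

```dockerfile
ENV AIRFLOW__SECRETS__BACKEND=airflow.providers.amazon.aws.secrets.systems_manager.SystemsManagerParameterStoreBackend
ENV AIRFLOW__SECRETS__BACKEND_KWARGS='{"connections_prefix": "/airflow/connections", "variables_prefix": "/airflow/variables"}'
ENV AWS_ACCESS_KEY_ID="<your-aws-key>"
ENV AWS_SECRET_ACCESS_KEY="<your-aws-secret-key>"
```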
In the next step, you’ll test that this configuration is valid locally.
Store `<your-aws-key>` and `<your-aws-secret-key>` in a secure manner. When you deploy to Astro Private Cloud, use the UI to set these values as secrets. If you’d like to reference an AWS profile, you can also add the `profile` param to `ENV AIRFLOW__SECRETS__BACKEND_KWARGS`.
To further customize the integration between Airflow and AWS SSM Parameter Store, see the Airflow documentation for the full list of available kwargs.
Step 3: Run an example dag to test AWS Parameter Store locally
To test Parameter Store, write a simple dag which calls your secret and add this dag to your Astro project’s `dags` directory.
For example, you can use the following dag to print the value of an Airflow variable to your task logs:
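Here is a minimal sketch of such a dag, assuming the `my_secret` variable from Step 1 and a task named `test-task`; adjust the names for your project:

```python
from datetime import datetime

from airflow.decorators import dag, task
from airflow.models import Variable


@dag(start_date=datetime(2024, 1, 1), schedule=None, catchup=False)
def parameter_store_test():
    @task(task_id="test-task")
    def print_secret_variable():
        # The secrets backend resolves this lookup against
        # /airflow/variables/my_secret in Parameter Store.
        my_var = Variable.get("my_secret")
        print(f"My secret variable is: {my_var}")

    print_secret_variable()


parameter_store_test()
```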
You can do the same for any Airflow connection.
To test your changes:

- Run `astro dev restart` to push your changes to your local Airflow environment.
- In the Airflow UI (http://localhost:8080/admin/), trigger your new dag.
- Click on `test-task` > View Logs. If you ran the example dag above, you should see the contents of your secret in the task logs.
Step 4: Deploy to Astro Private Cloud
Once you’ve confirmed that the integration with AWS SSM Parameter Store works locally, you can complete a similar setup for a Deployment on Astro Private Cloud.
- In the Astro Private Cloud UI, add the same environment variables found in your `Dockerfile` to your Deployment environment variables. Specify both `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` as secret to ensure that your credentials are stored securely.
- In your Astro project, delete the environment variables from your `Dockerfile`.
- Deploy your changes to Astro Private Cloud.
Now, any Airflow variable or connection that you write to AWS SSM Parameter Store can be automatically pulled by any dag in your Deployment on Astro Private Cloud.