Authenticate Astro to AWS
Prerequisites
- A user account on AWS with access to AWS cloud resources.
- The AWS CLI.
- The Astro CLI.
- An Astro project.
Retrieve AWS user credentials locally
Run the following command to obtain your user credentials locally:
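```sh
aws configure
```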
This command prompts you for your Access Key ID, Secret Access Key, Region, and output format. If you log in to AWS using single sign-on (SSO), run `aws configure sso` instead.
The AWS CLI then stores your credentials in two separate files:
- `.aws/config`
- `.aws/credentials`
The location of these files depends on your operating system:
- Linux: `/home/<username>/.aws`
- Mac: `/Users/<username>/.aws`
- Windows: `%UserProfile%\.aws`
Configure your Astro project
The Astro CLI runs Airflow in a Docker-based environment. To give Airflow access to your credential files, you’ll mount the `.aws` folder as a volume in Docker.
- In your Astro project, create a file named `docker-compose.override.yml` with the following configuration:
The configuration is the same on Mac, Linux, and Windows except for the host path to your `.aws` folder, which matches the location listed above for your operating system.
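A minimal sketch of this file, assuming the default Astro service names and the Astro Runtime home directory `/usr/local/airflow`; replace the host path with the `.aws` location for your operating system and username:

```yaml
version: "3.1"
services:
  scheduler:
    volumes:
      # Host path shown for Mac; use /home/<username>/.aws on Linux
      # or your %UserProfile%\.aws path on Windows.
      - /Users/<username>/.aws:/usr/local/airflow/.aws
  webserver:
    volumes:
      - /Users/<username>/.aws:/usr/local/airflow/.aws
  triggerer:
    volumes:
      - /Users/<username>/.aws:/usr/local/airflow/.aws
```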
Depending on your Docker configuration, you might have to make your `.aws` folder accessible to Docker. To do this, open Preferences in Docker Desktop and go to Resources → File Sharing. Add the full path of your `.aws` folder to the list of shared folders.
- In your Astro project’s `.env` file, add the following environment variables. Make sure that the volume path is the same as the one you configured in the `docker-compose.override.yml`.
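For example, assuming the container path `/usr/local/airflow/.aws` used in the sketch above, the standard AWS SDK variables would be:

```text
AWS_CONFIG_FILE=/usr/local/airflow/.aws/config
AWS_SHARED_CREDENTIALS_FILE=/usr/local/airflow/.aws/credentials
```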
When you run Airflow locally, all AWS connections without defined credentials automatically fall back to your user credentials when connecting to AWS. Airflow applies and overrides user credentials for AWS connections in the following order:
- Mounted user credentials in the `~/.aws/config` file.
- Configurations in `aws_access_key_id`, `aws_secret_access_key`, and `aws_session_token`.
- An explicit username & password provided in the connection.
For example, if you completed the configuration in this document and then created a new AWS connection with its own username and password, Airflow would use those credentials instead of the credentials in `~/.aws/config`.
Test your credentials with a secrets backend
Now that Airflow has access to your user credentials, you can use them to connect to your cloud services. Use the following example setup to test your credentials by pulling values from AWS Secrets Manager configured as a secrets backend.
- Create a secret for an Airflow variable or connection in AWS Secrets Manager. All Airflow variable and connection keys must be prefixed with the following strings respectively:
  - `airflow/variables/<my_variable_name>`
  - `airflow/connections/<my_connection_id>`
  For example, when adding the secret variable `my_secret_var`, you need to give the secret the name `airflow/variables/my_secret_var`.
  When setting the secret type, choose `Other type of secret` and select the `Plaintext` option. If you’re creating a connection URI or a non-dict variable as a secret, remove the brackets and quotations that are pre-populated in the plaintext field.
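As an alternative to the AWS console, the same secrets can be created with the AWS CLI. A sketch with hypothetical names and values:

```sh
# Create an Airflow variable; the airflow/variables/ prefix is what the
# secrets backend looks for.
aws secretsmanager create-secret \
  --name airflow/variables/my_secret_var \
  --secret-string "my-secret-value"

# Connections use the airflow/connections/ prefix and a connection URI value.
aws secretsmanager create-secret \
  --name airflow/connections/my_connection_id \
  --secret-string "aws://"
```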
- Add the following environment variables to your Astro project’s `.env` file. For additional configuration options, see the Apache Airflow documentation. Make sure to specify your `region_name`.
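A minimal sketch of these variables, assuming AWS Secrets Manager as the backend, the `airflow/` prefixes from the first step, and `us-east-1` as a placeholder region:

```text
AIRFLOW__SECRETS__BACKEND=airflow.providers.amazon.aws.secrets.secrets_manager.SecretsManagerBackend
AIRFLOW__SECRETS__BACKEND_KWARGS={"connections_prefix": "airflow/connections", "variables_prefix": "airflow/variables", "region_name": "us-east-1"}
```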
- Run the following command to start Airflow locally:
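```sh
astro dev start
```

If your local environment is already running, restart it with `astro dev restart` so the new `.env` values are applied.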
- Access the Airflow UI at `localhost:8080` and create an Airflow AWS connection named `aws_standard` with no credentials. See Connections. When you use this connection in your dag, it will fall back to using your configured user credentials.
- Add a dag that uses the secrets backend to your Astro project’s `dags` directory. You can use the following example dag to retrieve `<my_variable_name>` and `<my_connection_id>` from the secrets backend and print them to the terminal:
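A minimal sketch of such a dag, assuming Airflow 2.x with the TaskFlow API; replace the placeholder names with the variable and connection you created in AWS Secrets Manager:

```python
from airflow.decorators import dag, task
from airflow.hooks.base import BaseHook
from airflow.models import Variable
from pendulum import datetime


@dag(start_date=datetime(2024, 1, 1), schedule=None, catchup=False)
def example_secrets_dag():
    @task
    def print_var():
        # Resolved from airflow/variables/<my_variable_name> in Secrets Manager
        my_var = Variable.get("<my_variable_name>")
        print(f"My secret variable is: {my_var}")

    @task
    def print_conn():
        # Resolved from airflow/connections/<my_connection_id> in Secrets Manager
        conn = BaseHook.get_connection("<my_connection_id>")
        print(f"My secret connection is: {conn.get_uri()}")

    print_var()
    print_conn()


example_secrets_dag()
```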
- In the Airflow UI, unpause your dag and click Play to trigger a dag run.
- View logs for your dag run. If the connection was successful, your masked secrets appear in your logs. See Airflow logging.