
Export logs to a Secondary S3 Bucket


By forwarding Airflow task logs from your Astro Deployment to an additional, customer‑managed S3 bucket, you can keep redundant copies of your execution history, integrate with existing log‑processing pipelines, and satisfy compliance or retention requirements that extend beyond the built‑in Astro logs.

Prerequisites

  • Your Deployment must run Astro Runtime 11.7.0 or later. See Upgrade Astro Runtime.
  • Your image must include astronomer-providers-logging==1.6.2 or later. The secondary S3 logging feature was introduced in this version.
  • You need an AWS account where you can create an S3 bucket and IAM resources.

Overview of configuration options

Astro supports two ways to grant your Deployment permissions to write to the secondary bucket:

  1. Customer‑provided Workload Identity – recommended if you already use this mechanism for other AWS services.
  2. Assume Role – allows your Deployment to assume a dedicated IAM role that has access to the bucket.

Both options require the same base environment variables:

  • AIRFLOW__ASTRO_SECONDARY_LOGS__S3_BUCKET_ENABLED – Required. Example value: true
  • AIRFLOW__ASTRO_SECONDARY_LOGS__S3_BASE_LOG_FOLDER – Required. Example value: s3://my-bucket/logs
  • AIRFLOW__ASTRO_SECONDARY_LOGS__REGION – Optional; omit if the bucket is in the same region as the Deployment. Example value: us-east-1
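Before setting these on a Deployment, the values can be sanity-checked locally. The sketch below assumes only the formats documented above; the helper name `validate_secondary_log_config` is illustrative, not part of any Astro tooling:

```python
from urllib.parse import urlparse

def validate_secondary_log_config(env):
    """Return a list of problems with the secondary-logging variables."""
    problems = []
    if env.get("AIRFLOW__ASTRO_SECONDARY_LOGS__S3_BUCKET_ENABLED") != "true":
        problems.append("S3_BUCKET_ENABLED must be the string 'true'")
    folder = env.get("AIRFLOW__ASTRO_SECONDARY_LOGS__S3_BASE_LOG_FOLDER", "")
    parsed = urlparse(folder)
    if parsed.scheme != "s3" or not parsed.netloc:
        problems.append("S3_BASE_LOG_FOLDER must look like s3://<bucket>/<prefix>")
    return problems

config = {
    "AIRFLOW__ASTRO_SECONDARY_LOGS__S3_BUCKET_ENABLED": "true",
    "AIRFLOW__ASTRO_SECONDARY_LOGS__S3_BASE_LOG_FOLDER": "s3://my-bucket/logs",
}
print(validate_secondary_log_config(config))  # [] means the basics look right
```

An empty list only confirms the values are well-formed; it does not verify that the bucket exists or that the Deployment can write to it.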

Option 1: Customer-provided Workload Identity

If your Deployment already uses a customer‑managed AWS Workload Identity, you can attach S3 write permissions directly to that role.

  1. Configure a Workload Identity for the Deployment if you don’t have one yet.
  2. Grant that identity permissions to write to the bucket. Example policy:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ExternalS3Policy",
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:GetObject",
        "s3:ListBucket",
        "s3:DeleteObject"
      ],
      "Resource": [
        "arn:aws:s3:::my-bucket",
        "arn:aws:s3:::my-bucket/*"
      ]
    }
  ]
}
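A missing action or bucket ARN in this policy is the most common cause of silent write failures, so it can be worth linting the policy document before attaching it. The following stdlib-only sketch embeds the example policy above verbatim; the helper name `missing_actions` is illustrative, not part of any AWS or Astro tooling:

```python
import json

# The example policy from the step above, verbatim.
policy = json.loads("""
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ExternalS3Policy",
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:GetObject", "s3:ListBucket", "s3:DeleteObject"],
      "Resource": ["arn:aws:s3:::my-bucket", "arn:aws:s3:::my-bucket/*"]
    }
  ]
}
""")

def missing_actions(policy, bucket):
    """Return the S3 actions the policy does NOT allow on the given bucket."""
    needed = {"s3:PutObject", "s3:GetObject", "s3:ListBucket", "s3:DeleteObject"}
    granted = set()
    for stmt in policy["Statement"]:
        if stmt.get("Effect") != "Allow":
            continue
        resources = stmt.get("Resource", [])
        if f"arn:aws:s3:::{bucket}" in resources or f"arn:aws:s3:::{bucket}/*" in resources:
            granted |= set(stmt.get("Action", []))
    return needed - granted

print(missing_actions(policy, "my-bucket"))  # set() -> nothing missing
```

Running the check against a different bucket name immediately surfaces a policy that was written for the wrong resource ARNs.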
  3. Set the following environment variables in the Deployment:
AIRFLOW__ASTRO_SECONDARY_LOGS__S3_BUCKET_ENABLED=true
AIRFLOW__ASTRO_SECONDARY_LOGS__S3_BASE_LOG_FOLDER="s3://my-bucket/logs"
# Optional if bucket is in the same region as the deployment
AIRFLOW__ASTRO_SECONDARY_LOGS__REGION="us-east-1"

Option 2: Assume Role

With this approach, your Deployment’s existing identity assumes a separate IAM role that has write access to the bucket.

Additional variable

  • AIRFLOW__ASTRO_SECONDARY_LOGS__ROLE_ARN – Required. Example value: arn:aws:iam::123456789012:role/SecondaryS3LoggerRole
  1. Create an IAM role with S3 write permissions (same JSON policy as above).
  2. Configure the role’s trust relationship to allow your Deployment to assume it:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::111122223333:role/astro-deployment-role"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
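A trust policy that names the wrong principal or omits sts:AssumeRole fails only at runtime, so a local sanity check can save a deploy cycle. This stdlib-only sketch embeds the example trust policy above verbatim; the helper name `can_assume` is illustrative:

```python
import json

# The example trust policy from the step above, verbatim.
trust = json.loads("""
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {"AWS": "arn:aws:iam::111122223333:role/astro-deployment-role"},
      "Action": "sts:AssumeRole"
    }
  ]
}
""")

def can_assume(trust, principal_arn):
    """True if the trust policy lets principal_arn call sts:AssumeRole."""
    for stmt in trust["Statement"]:
        actions = stmt["Action"]
        if isinstance(actions, str):        # "Action" may be a string or a list
            actions = [actions]
        principals = stmt.get("Principal", {}).get("AWS", [])
        if isinstance(principals, str):     # "Principal.AWS" may be a string or a list
            principals = [principals]
        if stmt["Effect"] == "Allow" and "sts:AssumeRole" in actions and principal_arn in principals:
            return True
    return False

print(can_assume(trust, "arn:aws:iam::111122223333:role/astro-deployment-role"))  # True
```

Substitute your Deployment's actual role ARN for the example principal when running the check.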
  3. Set the environment variables:
AIRFLOW__ASTRO_SECONDARY_LOGS__S3_BUCKET_ENABLED=true
AIRFLOW__ASTRO_SECONDARY_LOGS__S3_BASE_LOG_FOLDER="s3://my-bucket/logs"
AIRFLOW__ASTRO_SECONDARY_LOGS__ROLE_ARN="arn:aws:iam::123456789012:role/SecondaryS3LoggerRole"
# Optional if bucket is in the same region
AIRFLOW__ASTRO_SECONDARY_LOGS__REGION="us-east-1"

Troubleshooting

If task logs are not appearing in the bucket:

  1. Confirm that all required environment variables are set exactly as documented.
  2. Verify that the IAM role attached to the Deployment, or the assumed role, includes s3:PutObject for the bucket path.
  3. Check the role’s trust relationship (Assume Role options only).
  4. Inspect the Deployment's worker logs for errors related to external S3 logging. Make sure Worker is selected in the component dropdown; the default selection is Scheduler.
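When browsing the bucket during troubleshooting, it helps to know the key a given task attempt should map to. The helper below is a sketch assuming Airflow's default log filename template (`dag_id=.../run_id=.../task_id=.../attempt=N.log`, the default since Airflow 2.3); if your Deployment overrides `log_filename_template`, the layout will differ:

```python
def expected_log_key(base_folder, dag_id, run_id, task_id, try_number=1):
    """Build the S3 key a task attempt's log should land at, assuming
    Airflow's default log_filename_template (an assumption -- verify
    against your own configuration)."""
    return (f"{base_folder.rstrip('/')}/dag_id={dag_id}/run_id={run_id}"
            f"/task_id={task_id}/attempt={try_number}.log")

print(expected_log_key("s3://my-bucket/logs", "example_dag",
                       "manual__2024-01-01T00:00:00+00:00", "extract"))
```

If the computed key is absent from the bucket but the same task's log appears in the Astro UI, the problem is almost certainly permissions or configuration on the secondary bucket rather than the task itself.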

If issues persist, contact Astronomer Support with the Deployment ID and any relevant error output.
