Configure logging

Remote Execution Agents generate task logs in your Kubernetes cluster. By default, these logs remain in agent Pods and are lost when Pods terminate. Configure logging to preserve and access these logs.

Airflow 3
This feature is only available for Airflow 3.x Deployments.

Logging approaches

External logging (export to your platform)

Export task logs to your logging platform (Splunk, Elasticsearch, CloudWatch, and so on) using a logging sidecar. You can also configure the Airflow UI to display links to the external logs.
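As a sketch of the sidecar pattern, the snippet below shows a log-forwarding container (Vector is used here as an example forwarder) sharing a log volume with the agent container. All container, volume, and path names are illustrative placeholders; adjust them to wherever your agent writes task logs.

```yaml
# Illustrative agent Pod spec fragment: a log-forwarding sidecar reads
# task logs from a shared volume and ships them to your platform.
containers:
  - name: agent
    # ... your Remote Execution Agent container ...
    volumeMounts:
      - name: task-logs
        mountPath: /var/log/airflow       # placeholder log path
  - name: log-forwarder
    image: timberio/vector:latest-alpine  # any log shipper works here
    args: ["--config", "/etc/vector/vector.yaml"]
    volumeMounts:
      - name: task-logs
        mountPath: /var/log/airflow
        readOnly: true
      - name: forwarder-config
        mountPath: /etc/vector
volumes:
  - name: task-logs
    emptyDir: {}
  - name: forwarder-config
    configMap:
      name: forwarder-config              # holds the sink configuration
```

The sidecar's own configuration (the `forwarder-config` ConfigMap) is where you point at your platform's endpoint.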

Object storage (Airflow UI display)

Store logs in object storage (S3, GCS, Azure Blob) and configure the Astro API server to fetch and display them in the Airflow UI. Logs appear after task completion.
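For reference, standard Airflow remote logging is driven by settings like the ones below. These are regular Airflow configuration options expressed as environment variables; the connection ID and bucket path are placeholders, and the exact settings Astro expects for Remote Execution may differ, so treat this as an illustrative sketch.

```yaml
# Illustrative Airflow remote-logging settings (env-var form).
# Values are placeholders.
AIRFLOW__LOGGING__REMOTE_LOGGING: "True"
AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER: "s3://my-airflow-logs/remote-execution"
AIRFLOW__LOGGING__REMOTE_LOG_CONN_ID: "aws_logs_conn"
```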

Real-time streaming to object storage

Extend object storage logging with a Vector sidecar that streams partial logs while tasks run. This approach provides real-time log visibility in the Airflow UI.
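A minimal Vector configuration for this pattern might look like the following: a file source tails the agent's task logs and an S3 sink flushes small batches frequently so partial logs appear quickly. The log path, bucket, region, and key prefix are all placeholder assumptions.

```yaml
# Sketch of a Vector sidecar config for near-real-time streaming to S3.
sources:
  task_logs:
    type: file
    include:
      - /var/log/airflow/**/*.log   # placeholder: path where the agent writes logs
sinks:
  object_storage:
    type: aws_s3
    inputs: [task_logs]
    bucket: my-airflow-logs         # placeholder bucket
    region: us-east-1
    key_prefix: "streaming/"
    encoding:
      codec: text
    batch:
      timeout_secs: 5               # flush often; the source of the "many small files" cost
```

The short batch timeout is the trade-off called out in the comparison table: faster UI visibility in exchange for many small objects in storage.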

Comparison

| Feature          | External logging          | Object storage          | Real-time streaming       |
| ---------------- | ------------------------- | ----------------------- | ------------------------- |
| Data location    | Your logging platform     | Your object storage     | Your object storage       |
| UI experience    | Link to external platform | View in Airflow UI      | View in Airflow UI (live) |
| Log availability | Near real-time            | After task completion   | During task execution     |
| Setup complexity | Medium                    | Medium                  | High                      |
| Storage costs    | Platform-dependent        | Standard object storage | Higher (many small files) |

Prerequisites

  • Remote Execution Agent installed and registered
  • Workload identity configured for your Kubernetes cluster
  • One of the following:
    • External logging: Logging platform endpoint
    • Object storage: S3, GCS, or Azure Blob container
    • Real-time streaming: Object storage + Vector configuration
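For the workload identity prerequisite, the agent's Kubernetes service account is typically annotated so the cloud provider grants it storage access. The example below shows the standard annotation keys for AWS (IRSA) and GCP Workload Identity; the service account name, role ARN, and GCP service account are hypothetical, and you would use only the annotation for your cloud.

```yaml
# Illustrative ServiceAccount for workload identity; values are placeholders.
apiVersion: v1
kind: ServiceAccount
metadata:
  name: remote-execution-agent
  annotations:
    # AWS: IAM Roles for Service Accounts (IRSA)
    eks.amazonaws.com/role-arn: arn:aws:iam::123456789012:role/airflow-logs
    # GCP: Workload Identity
    iam.gke.io/gcp-service-account: airflow-logs@my-project.iam.gserviceaccount.com
```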

Next steps