Configure task log collection and exporting to Elasticsearch
Airflow task logs are stored in a logging backend to ensure you can access them after your Pods terminate. By default, Astronomer uses Vector to collect task logs and export them to an Elasticsearch instance.
You can configure how Astronomer collects Deployment task logs and exports them to Elasticsearch. The following are the supported methods for exporting task logs to Elasticsearch:
- Using a DaemonSet Pod on each Kubernetes node in your cluster.
- Using container sidecars for Deployment components.
Export task logs using a Vector DaemonSet
Exporting task logs using a Vector DaemonSet is not supported for Airflow 3.
By default, Astro Private Cloud uses a Vector DaemonSet to aggregate task logs. This is the workflow for the default implementation:
- Deployments write task logs to `stdout`.
- Kubernetes takes the output from `stdout` and writes it to the Deployment's node.
- A Vector Pod reads logs from the node and forwards them to Elasticsearch.
Astronomer recommends using a Vector DaemonSet for organizations that:
- Run longer tasks using Celery executor.
- Run Astro Private Cloud in a dedicated cluster.
- Run privileged containers in a cluster with a ClusterRole.
This approach is not suited for organizations that don't allow logging containers to run in privileged mode, or that run many small tasks using the Kubernetes executor. Because task logs exist only for the lifetime of the Pod, Pods running small tasks might complete before Vector can collect their task logs.
Export logs using container sidecars
You can use a logging sidecar container to collect and export logs. In this implementation:
- Each container running an Airflow component for a Deployment receives its own Vector sidecar.
- Task logs are written to a shared directory.
- The Vector sidecar reads logs from the shared directory and writes them to Elasticsearch.
This implementation is recommended for organizations that:
- Run Astro Private Cloud in a multi-tenant cluster, where security is a concern.
- Use the KubernetesExecutor to run many short-lived tasks, which requires more reliable log collection.
Configure logging sidecars
- Retrieve your `values.yaml` file. See Apply a config change.
- Add the logging sidecar entry to your `values.yaml` file, as shown in the sketch after these steps. If you're migrating from Fluentd, you must also set the additional configuration shown there so that Astro Private Cloud can retain logs.
- Push the configuration change. See Apply a config change.
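A minimal sketch of this entry, assuming the Astro Private Cloud Helm chart's `global.loggingSidecar` and `global.fluentdEnabled` settings; confirm the key names against your chart version:

```yaml
global:
  # Enable Vector logging sidecars for Deployment components
  loggingSidecar:
    enabled: true
    name: sidecar-log-consumer
  # Only required when migrating from Fluentd: disable the Fluentd
  # DaemonSet so logs are retained through the sidecars instead
  fluentdEnabled: false
```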
Customize Vector logging sidecars
You can customize the default Astronomer Vector logging sidecar to have different transformations and sinks based on your team’s requirements. This is useful if you want to annotate, customize, or filter your logs before sending them to your logging platform.
- In the Astro UI, go to your Clusters page and select your cluster.
- In the cluster details, click Edit in the Deployment Configuration section and add the override below in the Configuration Override field. For more information about using the Configuration Override in the UI, see Override base configuration.
- Save and apply your changes in the UI. Pushing this change updates the configuration for the cluster. Individual Deployments receive the new sidecar logging configuration after they are redeployed.
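A minimal sketch of the override from step 2, assuming the chart exposes a `global.loggingSidecar.customConfig` flag for supplying your own Vector configuration:

```yaml
global:
  loggingSidecar:
    enabled: true
    name: sidecar-log-consumer
    # Use a custom Vector configuration instead of the default
    # sidecar ConfigMap (flag name is an assumption)
    customConfig: true
```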
- Create a custom Vector configuration `yaml` file to change how and where sidecars forward your logs. The following examples are template configurations for commonly used external logging services. For the complete default logging sidecar ConfigMap, see the Astronomer GitHub repository.
Examples cover Elasticsearch, Honeycomb, and Datadog.
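As an illustration, a custom sidecar configuration with an Elasticsearch sink might look like the following sketch. The source path, transform, endpoint, index, and credential variables are all assumptions; start from the default sidecar ConfigMap in the Astronomer GitHub repository for the authoritative source definitions.

```yaml
# vector-config.yaml -- illustrative custom sidecar configuration
sources:
  airflow_task_logs:
    type: file
    include:
      - /var/log/sidecar-log-consumer/*.log    # assumed shared log directory
transforms:
  annotate_logs:
    type: remap
    inputs:
      - airflow_task_logs
    source: |
      # Example transformation: tag each event before shipping it
      .component = "airflow"
sinks:
  elasticsearch_logs:
    type: elasticsearch
    inputs:
      - annotate_logs
    endpoints:
      - https://example.es.io:9243             # replace with your endpoint
    auth:
      strategy: basic
      user: "${ELASTICSEARCH_USER}"            # assumed environment variables
      password: "${ELASTICSEARCH_PASSWORD}"
    bulk:
      index: "vector.%Y.%m.%d"                 # matches the vector.* index pattern
```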
- Add the configuration file to your cluster as a Kubernetes secret (the commands for these steps are sketched after this list).
- Annotate the secret so that it's automatically applied to all new Deployments.
- Sync existing Deployments with the new configuration.
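The commands might look like the following sketch. The secret name `vector-custom-config`, the namespace placeholders, and the sync annotation are illustrative assumptions; confirm the exact names and annotation your platform release expects.

```bash
# Create a secret from the custom Vector configuration file
kubectl create secret generic vector-custom-config \
  --from-file=vector-config.yaml \
  --namespace <platform-namespace>

# Annotate the secret so new Deployments pick it up automatically
# (annotation key and value are assumptions)
kubectl annotate secret vector-custom-config \
  "astronomer.io/commander-sync=platform=astronomer" \
  --namespace <platform-namespace>

# Copy the secret into an existing Deployment's namespace to sync it
kubectl get secret vector-custom-config \
  --namespace <platform-namespace> -o yaml \
  | kubectl apply --namespace <deployment-namespace> -f -
```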
Use an external Elasticsearch instance for Airflow task log management
Add Airflow task logs from your Astronomer Deployment to an existing Elasticsearch instance on Elastic Cloud to centralize log management and analysis. Centralized log management allows you to quickly identify, troubleshoot, and resolve task failures. Although these examples use Elastic Cloud, you can also use Amazon OpenSearch Service or any other managed or self-hosted Elasticsearch service. With an external Elasticsearch instance configured for Astro Private Cloud, you can see the logs in your Elasticsearch instance and browse them from the Astro Private Cloud UI.
Create an Elastic Deployment and endpoint
- In your browser, go to https://cloud.elastic.co/ and create a new Elastic Cloud deployment. See Create a deployment.
- Copy and save your Elastic Cloud deployment credentials when the Save the deployment credentials screen appears.
- On the Elastic dashboard, click the Gear icon for your Deployment.
- Click Copy endpoint next to Elasticsearch.
- (Optional) Test the Elastic Cloud deployment endpoint:
  - Open a new browser window, paste the endpoint you copied in step 4 in the Address bar, and then press Enter.
  - Enter the username and password you copied in step 2 and click Sign in. Output similar to the following appears:
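A successful sign-in returns the standard Elasticsearch root endpoint response. Values vary by deployment, and the `version` object is abridged here:

```json
{
  "name": "instance-0000000000",
  "cluster_name": "<cluster-id>",
  "cluster_uuid": "<uuid>",
  "version": {
    "number": "8.11.1"
  },
  "tagline": "You Know, for Search"
}
```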
Save your Elastic Cloud deployment credentials
After you've created an Elastic deployment and endpoint, you have two options for storing your Elastic deployment credentials. You can store the credentials in your Astro Private Cloud Helm values, or for greater security, as a secret in your Astro Private Cloud Kubernetes cluster. For more information about applying an Astro Private Cloud configuration change, see Apply a config change.
The following steps show the `values.yaml` option. A sketch of each command and entry follows the list.
- Run a command to base64-encode your Elastic Cloud deployment credentials.
- Add an entry with your external Elasticsearch connection details to your `values.yaml` file.
- Add an entry to your `values.yaml` file to disable internal logging.
- Upgrade the Astro Private Cloud release so that the updated `values.yaml` file takes effect.
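A sketch of these steps, assuming the Helm chart's `global.customLogging` settings and a `tags.logging` switch for the internal logging stack; the key, release, and namespace names are assumptions to verify against your chart version.

```bash
# Step 1: base64-encode the credentials as <username>:<password>
echo -n "<elastic-username>:<elastic-password>" | base64
```

```yaml
# Steps 2 and 3: values.yaml entries (assumed key names)
global:
  customLogging:
    enabled: true
    scheme: https
    host: <your-elasticsearch-host>        # endpoint host without scheme
    port: "9243"
    secret: <base64-encoded-credentials>

# Disable the platform's internal logging stack
tags:
  logging: false
```

```bash
# Step 4: push the change (release and chart names are illustrative)
helm upgrade <release-name> astronomer/astronomer \
  --namespace <platform-namespace> \
  -f values.yaml
```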
View Airflow task logs in Elastic
- On the Elastic dashboard in the Elasticsearch Service area, click the Deployment name.
- Click Menu > Discover. The Create index pattern screen appears.
- In the Name field, enter `vector.*` if you use Vector sidecar logging, enter `@timestamp` in the Timestamp field, and then click Create index pattern.
- Click Menu > Dashboard to view all of the Airflow task logs for your Deployment on Astronomer.