Intro to Kubectl


Installing Kubectl is pretty simple.


If you're running Ubuntu or Debian, install with the native package manager:

apt-get update && apt-get install -y apt-transport-https
curl -s https://packages.cloud.google.com/apt/doc/apt-key.gpg | apt-key add -
cat <<EOF >/etc/apt/sources.list.d/kubernetes.list
deb https://apt.kubernetes.io/ kubernetes-xenial main
EOF
apt-get update
apt-get install -y kubectl


Install kubectl on macOS using Homebrew:

brew install kubectl

Verify kubectl is installed and up to date:

kubectl version
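If you want just the version string for use in a script, the client version can be pulled out of the command's output. The helper below is a hypothetical sketch that assumes the `--short` output format ("Client Version: v1.18.0"):

```shell
# Hypothetical helper: extract just the version string from
# `kubectl version --client --short`, which prints a line like
# "Client Version: v1.18.0".
client_version() {
  # -n suppresses output; the p flag prints only lines where the
  # substitution (stripping the "Client Version: " prefix) succeeded.
  sed -n 's/^Client Version: //p'
}

# Usage (as a pipe):
#   kubectl version --client --short | client_version
```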


Install kubectl on Windows using curl (replace VERSION with the release you want, e.g. v1.18.0):

curl -LO https://storage.googleapis.com/kubernetes-release/release/VERSION/bin/windows/amd64/kubectl.exe

Add the binary to your environment PATH and you're all set.

For more installation options, see the official kubectl install guide on kubernetes.io.

Debugging Astronomer Airflow with Kubectl

Note: Kubectl access is only available to Astronomer Enterprise users. Check with your sysadmin to be sure your gcloud account has the right permissions to use this.

To authenticate, run:

gcloud container clusters get-credentials CLUSTER_NAME --zone ZONE --project PROJECT_NAME


Kubectx

Download kubectx for an easy way to switch between clusters; its companion kubens does the same for namespaces. This saves you from specifying a namespace with each command.

Basic Commands

kubectl get pods

This will return a list of pods and their current status.
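When you have many pods, it can help to summarize that tabular output rather than eyeball it. Here is a small, hypothetical helper (the function name and column layout are assumptions based on the default `kubectl get pods` table) that counts pods not in the Running state:

```shell
# Hypothetical helper: count pods that are not in the Running state,
# reading the tabular output of `kubectl get pods` on stdin.
count_not_running() {
  # Skip the header row; STATUS is the third column.
  tail -n +2 | awk '$3 != "Running" { n++ } END { print n+0 }'
}

# Usage:
#   kubectl get pods | count_not_running
```

The `n+0` forces awk to print 0 rather than an empty string when every pod is healthy.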

To delete a pod, run:

kubectl delete po/POD_NAME

To restart any particular component of your Airflow setup, you can simply delete its pod and Kubernetes will spin a new one back up. Note: Do not delete the database pod.
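If you want to restart several components at once, you can filter the pod list before deleting. The sketch below is a hypothetical guard, and the `postgres`/`database` name patterns are assumptions; check your own pod names before using anything like this:

```shell
# Hypothetical helper: filter a list of pod names (from
# `kubectl get pods -o name`) down to ones that are safe to
# delete-restart, excluding anything that looks like the database pod.
# The name patterns here are assumptions; verify against your deployment.
restartable_pods() {
  grep -Ev 'postgres|database'
}

# Usage:
#   kubectl get pods -o name | restartable_pods
```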

If you are seeing unexpected behavior in your Airflow deployment, the answer might lie in the scheduler or webserver logs:

kubectl logs po/POD_NAME -f

This will follow the logs in your terminal. The scheduler and webserver logs tend to pile up quickly, so it might be best to run this right after restarting either of those pods.
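To cut through that noise while following logs, you can pipe through a simple filter. This is a hypothetical sketch; the log-level strings are assumptions based on Airflow's default log format:

```shell
# Hypothetical helper: keep only warning-and-above lines when tailing
# noisy scheduler or webserver logs. The level names (WARNING, ERROR,
# CRITICAL) are assumptions based on Airflow's default log format.
important_lines() {
  grep -E 'WARNING|ERROR|CRITICAL'
}

# Usage:
#   kubectl logs po/POD_NAME -f | important_lines
```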

To exec into a pod, you can run:

kubectl exec -it POD_NAME -- /bin/bash

This drops you into a shell inside the container, where you can inspect the code and files it is running.
