Orchestrate Ray jobs with Apache Airflow®

Ray is an open-source framework for scaling Python applications, particularly machine learning and AI workloads, where it provides the layer for parallel processing and distributed computing. Many large language models (LLMs) are trained using Ray, including OpenAI's GPT models.

The Ray provider package for Apache Airflow® allows you to interact with Ray from your Airflow DAGs. This tutorial demonstrates how to use the Ray provider package to orchestrate a simple Ray job on an existing Ray cluster with Airflow. For more in-depth information, see the Ray provider documentation.

For instructions on how to run Ray jobs on the Anyscale platform with Airflow, see the Orchestrate Ray jobs on Anyscale with Apache Airflow® tutorial.

This tutorial shows a simple implementation of the Ray provider package. For a more complex example, see Processing User Feedback: an LLM-fine-tuning reference architecture with Ray on Anyscale.

Time to complete

This tutorial takes approximately 30 minutes to complete.

Assumed knowledge

To get the most out of this tutorial, make sure you have an understanding of:

  • The basics of Ray.
  • Airflow fundamentals, such as writing DAGs and defining tasks.

Prerequisites

  • An existing Ray cluster. The connection example in this tutorial assumes a local cluster created with kind, but any Ray cluster works.
  • The Astro CLI.

The Ray provider package can also create a Ray cluster for you in an existing Kubernetes cluster. For more information, see the Ray provider package documentation. Note that you need a Kubernetes cluster with a pre-configured LoadBalancer service to use the Ray provider package this way.
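If you want the provider to manage the cluster lifecycle for you, the Ray provider documentation describes SetupRayCluster and DeleteRayCluster operators for this purpose. The following is a minimal, hedged sketch of that pattern; the ray_cluster.yaml spec file is a placeholder and the exact operator parameters can differ between provider versions, so check the provider documentation before using it:

    from airflow.decorators import dag
    from ray_provider.operators.ray import DeleteRayCluster, SetupRayCluster

    @dag(start_date=None, schedule=None, catchup=False)
    def manage_ray_cluster():
        # Create a Ray cluster in the Kubernetes cluster from the connection
        setup = SetupRayCluster(
            task_id="setup_ray_cluster",
            conn_id="ray_conn",
            ray_cluster_yaml="ray_cluster.yaml",  # placeholder RayCluster spec file
        )

        # Tear the cluster down again; your Ray jobs would run between these tasks
        delete = DeleteRayCluster(
            task_id="delete_ray_cluster",
            conn_id="ray_conn",
            ray_cluster_yaml="ray_cluster.yaml",
        )

        setup >> delete

    manage_ray_cluster()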

Step 1: Configure your Astro project

Use the Astro CLI to create and run an Airflow project on your local machine.

  1. Create a new Astro project:

    $ mkdir astro-ray-tutorial && cd astro-ray-tutorial
    $ astro dev init

  2. In the requirements.txt file, add the Ray provider:

    astro-provider-ray==0.2.1

  3. Run the following command to start your Airflow project:

    astro dev start

Step 2: Configure a Ray connection

If you are an Astro customer, Astronomer recommends using the Astro Environment Manager to store connections in an Astro-managed secrets backend. These connections can be shared across multiple deployed and local Airflow environments. See Manage Astro connections in branch-based deploy workflows.

  1. In the Airflow UI, go to Admin -> Connections and click +.

  2. Create a new connection and choose the Ray connection type. Enter the following information:

    • Connection ID: ray_conn
    • Ray dashboard url: Your Ray dashboard URL, for example http://kind-control-plane:8265.
    • For this local tutorial, select the Disable SSL checkbox.

  3. Click Save.

If you are connecting to a Ray cluster running on a cloud provider, you need to provide the kubeconfig file of the Kubernetes cluster where the Ray cluster is running in the Kube config (JSON format) field, as well as valid cloud credentials as environment variables.
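To verify the dashboard URL before saving the connection, you can query the Ray Job Submission API that is served behind the dashboard endpoint. A minimal sketch, assuming Ray is installed locally and using the example URL from above as a placeholder:

    # Optional sanity check for the dashboard URL
    from ray.job_submission import JobSubmissionClient

    client = JobSubmissionClient("http://kind-control-plane:8265")  # your dashboard URL
    print(client.list_jobs())  # an empty list is fine; a connection error means the URL is wrong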

Step 3: Write a DAG to orchestrate Ray jobs

  1. Create a new file in your dags directory called ray_tutorial.py.

  2. Copy and paste the code below into the file:

    1"""
    2## Ray Tutorial
    3
    4This tutorial demonstrates how to use the Ray provider in Airflow to parallelize
    5a task using Ray.
    6"""
    7
    8from airflow.decorators import dag, task
    9from ray_provider.decorators.ray import ray
    10
    11CONN_ID = "ray_conn"
    12RAY_TASK_CONFIG = {
    13 "conn_id": CONN_ID,
    14 "num_cpus": 1,
    15 "num_gpus": 0,
    16 "memory": 0,
    17 "poll_interval": 5,
    18}
    19
    20
    21@dag(
    22 start_date=None,
    23 schedule=None,
    24 catchup=False,
    25 tags=["ray", "example"],
    26 doc_md=__doc__,
    27)
    28def test_taskflow():
    29
    30 @task
    31 def generate_data() -> list:
    32 """
    33 Generate sample data
    34 Returns:
    35 list: List of integers
    36 """
    37 import random
    38
    39 return [random.randint(1, 100) for _ in range(10)]
    40
    41 # use the @ray.task decorator to parallelize the task
    42 @ray.task(config=RAY_TASK_CONFIG)
    43 def get_mean_squared_value(data: list) -> float:
    44 """
    45 Get the mean squared value from a list of integers
    46 Args:
    47 data (list): List of integers
    48 Returns:
    49 float: Mean value of the list
    50 """
    51 import numpy as np
    52 import ray
    53
    54 @ray.remote
    55 def square(x: int) -> int:
    56 """
    57 Square a number
    58 Args:
    59 x (int): Number to square
    60 Returns:
    61 int: Squared number
    62 """
    63 return x**2
    64
    65 ray.init()
    66 data = np.array(data)
    67 futures = [square.remote(x) for x in data]
    68 results = ray.get(futures)
    69 mean = np.mean(results)
    70 print(f"Mean squared value: {mean}")
    71
    72 data = generate_data()
    73 get_mean_squared_value(data)
    74
    75
    76test_taskflow()
Alternatively, you can use the traditional syntax with the SubmitRayJob operator:
1"""
2## Ray Tutorial
3
4This tutorial demonstrates how to use the Ray provider in Airflow to
5parallelize a task using Ray.
6"""
7
8from airflow.decorators import dag
9from airflow.operators.python import PythonOperator
10from ray_provider.operators.ray import SubmitRayJob
11from airflow.models.baseoperator import chain
12from pathlib import Path
13
14CONN_ID = "ray_conn"
15FOLDER_PATH = Path(__file__).parent
16RAY_RUNTIME_ENV = {"working_dir": str(FOLDER_PATH)}
17
18
19def _generate_data() -> list:
20 """
21 Generate sample data
22 Returns:
23 list: List of integers
24 """
25 import random
26
27 return [random.randint(1, 100) for _ in range(10)]
28
29
30@dag(
31 start_date=None,
32 schedule=None,
33 catchup=False,
34 tags=["ray", "example"],
35 doc_md=__doc__,
36)
37def ray_tutorial():
38
39 data = PythonOperator(
40 task_id="generate_data",
41 python_callable=_generate_data,
42 )
43
44 get_mean_squared_value = SubmitRayJob(
45 task_id="SubmitRayJob",
46 conn_id=CONN_ID,
47 entrypoint="python ray_script.py {{ ti.xcom_pull(task_ids='generate_data') | join(' ') }}",
48 runtime_env=RAY_RUNTIME_ENV,
49 num_cpus=1,
50 num_gpus=0,
51 memory=0,
52 resources={},
53 xcom_task_key="SubmitRayJob.dashboard",
54 fetch_logs=True,
55 wait_for_completion=True,
56 job_timeout_seconds=600,
57 poll_interval=5,
58 )
59
60 chain(data, get_mean_squared_value)
61
62
63ray_tutorial()

This is a simple DAG consisting of two tasks:

  • The generate_data task randomly generates a list of 10 integers.
  • The get_mean_squared_value task submits a Ray job to your Ray cluster to calculate the mean squared value of the list of integers. In the traditional version, the generated list is passed to the script as command-line arguments through the Jinja-templated entrypoint; for example, a list starting with 3 and 7 renders the entrypoint as python ray_script.py 3 7 and so on.

  3. (Optional) If you are using the traditional syntax with the SubmitRayJob operator, you need to provide the Python code to run in the Ray job as a script. Create a new file in your dags directory called ray_script.py and add the following code:

    # ray_script.py
    import argparse

    import numpy as np
    import ray


    @ray.remote
    def square(x):
        return x**2


    def main(data):
        ray.init()
        data = np.array(data)
        futures = [square.remote(x) for x in data]
        results = ray.get(futures)
        mean = np.mean(results)
        print(f"Mean of this population is {mean}")
        return mean


    if __name__ == "__main__":
        parser = argparse.ArgumentParser(description="Process some integers.")
        parser.add_argument("data", nargs="+", type=float, help="List of numbers to process")
        args = parser.parse_args()

        data = args.data
        main(data)
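Both versions of the DAG rely on the same core Ray pattern: @ray.remote turns square into a distributed task, calling .remote(x) schedules it and immediately returns a future (an ObjectRef), and ray.get blocks until all results are available. The following standalone sketch, which assumes only local ray and numpy installations, demonstrates the pattern outside Airflow:

    import numpy as np
    import ray

    ray.init()  # with no address given, this starts a throwaway local Ray instance

    @ray.remote
    def square(x: int) -> int:
        return x**2

    # .remote() returns ObjectRefs immediately; the work runs in parallel workers
    futures = [square.remote(x) for x in range(10)]
    results = ray.get(futures)  # blocks until all tasks have finished
    print(f"Mean squared value: {np.mean(results)}")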

Step 4: Run the DAG

  1. In the Airflow UI, click the play button to manually run your DAG.

  2. After the DAG runs successfully, go to your Ray dashboard to see the job submitted by Airflow.

    Ray dashboard showing a Job completed successfully.
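You can also inspect submitted jobs programmatically through the same job submission API the provider uses. A short sketch, with the dashboard URL again as a placeholder:

    from ray.job_submission import JobSubmissionClient

    client = JobSubmissionClient("http://kind-control-plane:8265")  # your dashboard URL
    for job in client.list_jobs():
        print(job.submission_id, job.status)  # e.g. raysubmit_..., SUCCEEDED
        print(client.get_job_logs(job.submission_id))  # stdout of the Ray job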

Conclusion

Congratulations! You've run a Ray job using Apache Airflow. You can now use the Ray provider package to orchestrate more complex Ray jobs. For an example, see Processing User Feedback: an LLM-fine-tuning reference architecture with Ray on Anyscale.