Be Our Guest
Interested in being a guest on The Data Flowcast? Fill out the form and we will be in touch.
Using Airflow to orchestrate geospatial data pipelines unlocks powerful efficiencies for data teams. The combination of scalable processing and visual observability streamlines workflows, reduces costs and improves iteration speed.
In this episode, Alex Iannicelli, Staff Software Engineer at Overture Maps Foundation, and Daniel Smith, Senior Solutions Architect at Wherobots, join us to discuss leveraging Apache Airflow and Apache Sedona to process massive geospatial datasets, build reproducible pipelines and orchestrate complex workflows across platforms.
Key Takeaways:
00:00 Introduction.
03:22 How merging multiple data sources supports comprehensive datasets.
04:20 The value of flexible configurations for running pipelines on different platforms.
06:35 Why orchestration tools are essential for handling continuous data streams.
09:45 The importance of observability for monitoring progress and troubleshooting issues.
11:30 Strategies for processing large, complex datasets efficiently.
13:27 Expanding orchestration beyond core pipelines to automate frequent tasks.
17:02 Advantages of using open-source operators to simplify integration and deployment.
20:32 Desired improvements in orchestration tools for usability and workflow management.
Resources Mentioned:
Overture Maps Foundation website
Thanks for listening to “The Data Flowcast: Mastering Apache Airflow® for Data Engineering and AI.” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.
Try Astro today and get up to $20 in free credits during your 14-day trial.