Summary: Join Astronomer and Wherobots for an interactive, hands-on workshop designed to take you from raw data to a dynamic geospatial visualization. In this 90-minute session, you will learn how to orchestrate scalable spatial data pipelines using Apache Airflow and WherobotsDB.
What You Will Build: Each participant builds the same end-to-end pipeline: ingesting data from multiple sources, performing spatial transformations (including spatial joins on buildings and neighborhoods) with Wherobots, and publishing the results as a visualization.
Key Takeaways:
- Modern Orchestration: Learn to build DAGs with the @dag and @task decorators and the Astro IDE.
- Geospatial Power: Discover how to connect Airflow to Wherobots using the Wherobots Airflow package.
- Advanced Airflow Features: Implement asset-based scheduling, dynamic task mapping, and human-in-the-loop tasks for data validation, plus retries, callbacks, and more.
- Geospatial Processing: Use cloud-hosted spatial data from NOAA and Overture Maps to produce recurring insights through large-scale joins across weather, buildings, and areas of interest.
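To give a feel for the spatial-join step above, here is a toy, stdlib-only sketch of assigning buildings (as points) to the neighborhood polygons that contain them. In the workshop this runs at scale in WherobotsDB with SQL; the function and sample data names below are made up purely for illustration.

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting test: count how many polygon edges a rightward ray from (x, y) crosses."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # The edge straddles the horizontal line through y; find where the ray crosses it.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def spatial_join(buildings, neighborhoods):
    """Pair each building with the first neighborhood polygon that contains it."""
    joined = []
    for name, (bx, by) in buildings.items():
        for hood, poly in neighborhoods.items():
            if point_in_polygon(bx, by, poly):
                joined.append((name, hood))
                break
    return joined

# Hypothetical sample data: two adjacent square neighborhoods and two buildings.
neighborhoods = {
    "riverside": [(0, 0), (4, 0), (4, 4), (0, 4)],
    "hilltop":   [(4, 0), (8, 0), (8, 4), (4, 4)],
}
buildings = {"library": (1, 1), "depot": (5, 2)}

print(spatial_join(buildings, neighborhoods))
# → [('library', 'riverside'), ('depot', 'hilltop')]
```

A distributed engine like WherobotsDB replaces this nested loop with spatially partitioned, indexed joins, which is what makes the same operation feasible on continent-scale building footprints.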
Format:
- Workshop (Tuesday): 90-minute demo and hands-on workshop.
- Office Hours (Thursday): Drop-in session for Q&A and troubleshooting.
Requirements:
- Wherobots account (trial key will be provided during the workshop)
- Astronomer account
- The workshop is a follow-along format suitable for learners of all skill levels.
Certification: Participants who submit their completed pipeline by the following Monday will receive an official certificate of completion.