Introducing Cohort 6 of the Astronomer Champions Program for Apache Airflow®
We’re thrilled to announce the launch of Cohort 6 of the Astronomer Champions Program for Apache Airflow®. This new cohort brings together an outstanding group of data engineers, architects, and platform leaders from around the world who share a deep commitment to Apache Airflow® and to advancing the practice of data orchestration.
Cohort 6 reflects the global reach and growing maturity of the Airflow ecosystem. These champions represent a wide range of industries, including healthcare, fintech, retail, cybersecurity, marketing analytics, and cloud platforms. Each brings real-world experience running Airflow at scale in demanding production environments.
From modernizing legacy schedulers and orchestrating machine learning workflows to building observability-first platforms and leading local Airflow communities, Cohort 6 champions are shaping how Airflow is adopted, taught, and evolved across the data engineering landscape.
Let’s meet Cohort 6 of the Astronomer Champions Program for Apache Airflow®.
Sushmasree
Airflow Journey:
Sushmasree is a Data Engineer at Optum with over four years of hands-on experience using Apache Airflow in large-scale, production environments. She leverages Airflow to orchestrate critical analytics and data platform workloads, with a strong focus on DAG reliability, scalability, and maintainability. Sushmasree regularly applies modern Airflow best practices, including the TaskFlow API, dynamic DAG generation, and XComs. Outside of her day-to-day work, she contributes actively to the community by publishing practical Airflow blogs on Medium that help engineers apply best practices and prepare for real-world use cases.
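For readers newer to the features Sushmasree works with, here is a minimal sketch of a TaskFlow DAG, where return values move between tasks through XComs automatically. The DAG id and task logic are invented for the example, not taken from her pipelines:

```python
from airflow.decorators import dag, task
from pendulum import datetime


@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False)
def taskflow_example():
    @task
    def extract() -> list[int]:
        # The return value is pushed to XCom automatically.
        return [1, 2, 3]

    @task
    def load(records: list[int]) -> None:
        # The argument is pulled from XCom automatically.
        print(f"Loaded {len(records)} records")

    # Passing one task's output to another wires the dependency.
    load(extract())


taskflow_example()
```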
Aldo
Airflow Journey:
Aldo is a Data Engineer at Mutt Data with extensive experience using Apache Airflow across a wide range of environments, from self-managed deployments to managed platforms such as Astronomer, AWS MWAA, and Google Cloud Composer. His work centers on refactoring and modernizing data pipelines, orchestrating machine learning workflows as ELT processes, and integrating analytics engineering tools like dbt through Cosmos. He has led and contributed to large migrations involving thousands of workflows from legacy schedulers to code-based Airflow implementations, often using Orbiter as a primary migration tool. He actively contributes to the Apache Airflow ecosystem through community projects and technical discussions.
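As context for the Cosmos integration Aldo mentions, here is a minimal sketch of rendering a dbt project as an Airflow DAG with the astronomer-cosmos package. The project path, profile details, and connection id are placeholders, not details from his deployments:

```python
from pendulum import datetime

from cosmos import DbtDag, ProfileConfig, ProjectConfig
from cosmos.profiles import PostgresUserPasswordProfileMapping

# Map an Airflow connection to a dbt profile (connection id and
# schema are placeholders).
profile_config = ProfileConfig(
    profile_name="analytics",
    target_name="dev",
    profile_mapping=PostgresUserPasswordProfileMapping(
        conn_id="warehouse_default",
        profile_args={"schema": "public"},
    ),
)

# Cosmos renders each dbt model in the project as its own Airflow task.
dbt_dag = DbtDag(
    dag_id="dbt_analytics",
    project_config=ProjectConfig("/usr/local/airflow/dbt/analytics"),
    profile_config=profile_config,
    schedule="@daily",
    start_date=datetime(2025, 1, 1),
    catchup=False,
)
```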
Shriya
Airflow Journey:
Shriya is a Senior Data Engineer at Walmart International, where she specializes in sourcing, supply chain, and finance data for global operations. With over eight years of experience, she focuses on optimizing data pipelines, ensuring data quality, and enabling business-critical insights at scale. Passionate about AI and data-driven transformation, Shriya is also pursuing an MBA in Strategic Management, deepening her ability to connect technical execution with business strategy. Her blend of engineering expertise and strategic thinking allows her to approach data challenges holistically and drive measurable impact across organizations.
Krishnaprasad
Airflow Journey:
Krishnaprasad is a Senior Data Engineer at Prevalent AI, a UK-based cybersecurity company, with over five years of experience using Apache Airflow. At Prevalent AI, Airflow orchestrates the majority of production workflows, and Krishnaprasad plays a key role in designing and managing scalable ETL pipelines using TaskFlow, XComs, and dynamic DAG generation. Outside of his professional work, he actively contributes to the Airflow community by writing educational articles and helping engineers deepen their understanding of effective orchestration practices.
Sahas
Airflow Journey:
Sahas is a Senior Software Engineer at Zelis with deep expertise in architecting large-scale workflow orchestration systems using Apache Airflow. He has designed and optimized high-volume data pipelines and built custom Airflow operators to support complex business logic. Throughout his career, Sahas has championed Airflow adoption across organizations by mentoring engineers, guiding teams, and establishing best practices around DAG design, observability, and reliability. He remains deeply engaged with the Airflow ecosystem and continues to support community-driven innovation.
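For readers curious what custom operator work like Sahas's looks like in practice, here is a generic sketch, with invented business logic: a custom operator subclasses BaseOperator and puts its logic in execute().

```python
from airflow.models.baseoperator import BaseOperator


class ThresholdCheckOperator(BaseOperator):
    """Fail the task when a metric exceeds a threshold (invented logic)."""

    def __init__(self, metric_value: float, threshold: float, **kwargs):
        super().__init__(**kwargs)
        self.metric_value = metric_value
        self.threshold = threshold

    def execute(self, context):
        # execute() runs on the worker; raising an exception fails
        # the task instance and triggers Airflow's retry handling.
        if self.metric_value > self.threshold:
            raise ValueError(
                f"Metric {self.metric_value} exceeded threshold {self.threshold}"
            )
        self.log.info("Metric %s is within threshold", self.metric_value)
        return self.metric_value  # pushed to XCom by default
```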
Jarosław
Airflow Journey:
Jarosław is a Data Platform Engineer and Data Solution Architect who has worked with Apache Airflow for several years to orchestrate large-scale data platforms that process billions of records daily. His work focuses on designing robust data architectures, improving DAG design and scheduling reliability, and leveraging features such as dynamic DAG generation, custom operators, sensors, and XComs. He enjoys sharing practical knowledge and best practices with the data engineering community.
Yeonguk
Airflow Journey:
Yeonguk is a Software Engineer specializing in financial data with extensive experience using Apache Airflow. He organizes the Apache Airflow Korea User Group, where he supports the local community through meetups, collaboration, and practical knowledge sharing. Through his leadership, Yeonguk plays an important role in expanding the Apache Airflow ecosystem in Korea.
Nitin
Airflow Journey:
Nitin is a Data Engineer at Razorpay, one of India's leading full-stack fintech platforms. At Razorpay, Airflow is a core orchestration tool used across data engineering, analytics, and data science teams. Nitin has worked extensively on complex pipelines involving custom sensors, XComs, and external dependencies. His recent work focuses on improving observability and reliability across systems orchestrated with Airflow. He stays involved in the community through meetups, forums, and content on Medium and LinkedIn, with a focus on making Airflow easier to learn and use.
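Custom sensors like the ones Nitin describes typically subclass BaseSensorOperator and implement poke(). Here is a minimal sketch, with a hypothetical HTTP readiness check standing in for a real external dependency:

```python
import requests
from airflow.sensors.base import BaseSensorOperator


class EndpointReadySensor(BaseSensorOperator):
    """Wait until a hypothetical upstream HTTP endpoint reports ready."""

    def __init__(self, url: str, **kwargs):
        super().__init__(**kwargs)
        self.url = url

    def poke(self, context) -> bool:
        # poke() is called on each sensing interval until it returns
        # True or the sensor times out.
        response = requests.get(self.url, timeout=10)
        return response.status_code == 200
```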
Nikhil
Airflow Journey:
Nikhil is a Data Solution Architect at Cloudaeon Pvt Ltd with over five years of hands-on experience using Apache Airflow. He has implemented complex orchestration logic, improved scheduling reliability, and built scalable workflows using TaskFlow, sensors, XComs, and dynamic DAG generation. Nikhil also developed an accelerator to support migration from Airflow 2 to Airflow 3. Beyond his professional work, he actively contributes to the community by leading knowledge-sharing sessions, writing articles, and helping engineers adopt future-ready orchestration best practices.
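Dynamic DAG generation, which several champions in this cohort rely on, comes down to creating DAG objects in a loop from configuration. A minimal sketch, with made-up team names:

```python
from airflow.decorators import dag, task
from pendulum import datetime

# Hypothetical per-team configuration; in practice this is often
# loaded from YAML, a database, or an API.
TEAM_CONFIGS = {
    "payments": {"schedule": "@hourly"},
    "analytics": {"schedule": "@daily"},
}

for team, config in TEAM_CONFIGS.items():

    @dag(
        dag_id=f"ingest_{team}",
        schedule=config["schedule"],
        start_date=datetime(2025, 1, 1),
        catchup=False,
    )
    def ingest():
        @task
        def run(team_name: str):
            print(f"Ingesting data for {team_name}")

        run(team)

    # Assigning to globals() makes each generated DAG discoverable
    # by the Airflow scheduler.
    globals()[f"ingest_{team}"] = ingest()
```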
Shrividya
Airflow Journey:
Shrividya is a Senior SDET with over eight years of experience building quality-first, cloud-native data platforms. At Astronomer, she has leveraged Apache Airflow to orchestrate, validate, and scale complex data pipelines. She has embedded automated data quality checks, improved observability, and increased deployment confidence. Shrividya is a strong advocate for Airflow's Python-native DAGs, TaskFlow API, flexible scheduling, and rich metadata, and actively promotes best practices in quality engineering and production readiness.
Aman
Airflow Journey:
Aman is an Associate Lead Data Engineer at Forbes Advisor with over five years of experience using Apache Airflow in production. He has built and operated large-scale DAGs across organizations including Condé Nast and Forbes Advisor, where Airflow orchestrates critical data pipelines on GCP. Aman focuses on designing reliable, scalable workflows using sensors, dynamic DAGs, custom operators, and Airflow-driven data quality checks. He is passionate about observability, failure handling, and operational excellence, and enjoys sharing practical learnings with the community.
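Airflow-driven data quality checks like those Aman builds are often expressed with operators from the common SQL provider. A minimal sketch, with placeholder connection and table names:

```python
from airflow import DAG
from airflow.providers.common.sql.operators.sql import SQLColumnCheckOperator
from pendulum import datetime

with DAG(
    dag_id="orders_quality",
    schedule="@daily",
    start_date=datetime(2025, 1, 1),
    catchup=False,
):
    # Connection id and table name are placeholders for illustration.
    SQLColumnCheckOperator(
        task_id="check_orders",
        conn_id="warehouse_default",
        table="orders",
        column_mapping={
            "order_id": {"null_check": {"equal_to": 0}},
            "amount": {"min": {"geq_to": 0}},
        },
    )
```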
Josip
Airflow Journey:
Josip designs data systems that quietly do their job so people can do theirs better. He runs a marketing analytics and data engineering agency where Apache Airflow orchestrates pipelines that turn messy marketing data into decisions teams can trust. His work sits at the intersection of engineering, analytics, and human behavior, with a focus on building systems that scale cleanly and deliver clarity without unnecessary noise.
Alan
Airflow Journey:
Alan is an Analytics Engineer and AI Data Solution Architect at Liti, where he designs multi-cloud data platforms and builds event-driven pipelines using Python, Pub/Sub, Apache Airflow, BigQuery, and dbt. He has also worked on AI-powered workflows that use Airflow alongside Amazon Bedrock multi-agent systems to automate medical exam analysis and scale clinical indicator processing. Alan has used Apache Airflow professionally since 2021 and holds Astronomer certifications in Airflow Fundamentals and DAG Authoring. He is also a content creator at Jornada de Dados, producing educational YouTube content and hands-on workshops on data engineering and orchestration best practices.
Sebastián
Airflow Journey:
Sebastián is a Data Engineer at Wizeline, where he designs and optimizes scalable, cloud-native data pipelines using Apache Airflow, Spark, Databricks, and AWS services. His work focuses on improving DAG reliability, optimizing data partitioning strategies, and supporting both batch and near-real-time processing. An AWS Captain and active technology speaker, Sebastián contributes to the community through talks, workshops, and mentorship, helping engineers grow both technically and professionally within the data ecosystem.
Ajay
Airflow Journey:
Ajay is a Software Engineer at Walmart Global Tech, currently working in Site Reliability Engineering (SRE). Previously, he worked on Data in Motion platforms, where he helped make Kafka-based event streaming platforms PCI compliant and scalable for real-time data processing. He has been using Apache Airflow for close to a year to orchestrate complex data pipelines and workflows across diverse data sources, with a strong focus on improving scheduling reliability through effective DAG design, built-in Airflow features, and automation best practices. He is Astronomer-certified in Apache Airflow 3 and DAG Authoring for Apache Airflow 3, and his team recently adopted the Astro control plane to manage Airflow instances, using Airflow as a centralized workflow scheduler across platforms such as Spark and MariaDB. Ajay is passionate about building reliable, scalable systems, improving developer experience through automation, and applying best practices to operate data platforms at scale.
Richa
Airflow Journey:
Richa is a Software Engineer 2 working on scalable data services that support analytics and operational workflows. She applies strong engineering fundamentals to building reliable pipelines and integrations that power business systems. With experience across the data stack, Richa focuses on writing clean, maintainable code and improving workflow reliability. She enjoys learning orchestration patterns that make complex systems easier to operate.
Sachin
Airflow Journey:
Sachin designs and delivers robust data orchestration solutions for large-scale platforms. He builds dynamic, failure-resilient workflows and focuses on reliability engineering to support data systems at scale. Sachin enjoys tackling distributed systems challenges and evolving orchestration patterns that adapt to growing business demands. He brings a pragmatic approach to using Apache Airflow in complex production environments.
Andre
Airflow Journey:
Andre blends technical leadership with hands-on engineering to build and maintain scalable data pipelines. His work centers on orchestrating reliable workflows that support analytics and operational reporting. With over a decade of experience, Andre takes a practical approach to problem solving and focuses on solutions that are resilient and easy to maintain. He values developer experience and clean system design.
Vishal
Airflow Journey:
Vishal builds and operates large-scale data pipelines in production environments. He uses Apache Airflow to coordinate critical ETL workflows, with a focus on simplicity, reliability, and operational clarity. His day-to-day work involves improving scheduling logic, managing dependencies, and reducing pipeline failures. Vishal enjoys sharing practical learnings from real-world Airflow usage.
Mostafa
Airflow Journey:
Mostafa works on end-to-end data workflows that support analytics and product use cases at scale. He uses Apache Airflow to build dependable pipelines that move and transform data reliably. Mostafa cares deeply about observability, performance, and data quality, and he looks for ways to make systems easier to operate and troubleshoot. He enjoys collaborating closely with cross-functional teams.
Mahalaxmi
Airflow Journey:
Mahalaxmi works on data orchestration and platform reliability for financial services workflows. She uses Apache Airflow to design and automate pipelines that support analytics and reporting. Her focus is on improving workflow clarity, reducing operational overhead, and building systems that teams can trust. Mahalaxmi enjoys solving orchestration challenges and delivering dependable data solutions.
Dmytro
Airflow Journey:
Dmytro builds and supports large-scale ETL pipelines orchestrated with Apache Airflow. He brings hands-on experience running Airflow in production and working with modern orchestration features to manage complex workflows. Dmytro enjoys improving pipeline performance and reliability and sharing patterns that help teams operate production systems with confidence.
Thomas
Airflow Journey:
Thomas focuses on building reliable, scalable data platforms with Apache Airflow at their core. Since 2017, he has deployed Airflow across a range of environments and authored hundreds of production DAGs. He cares deeply about clean workflow design, maintainability, and long-term scalability. Thomas enjoys sharing practical patterns that help teams grow their data infrastructure responsibly.
Dmitrii
Airflow Journey:
Dmitrii builds and optimizes data workflows orchestrated with Apache Airflow. He brings experience designing dependable pipelines and improving observability across systems. His work includes integrating modern tools such as dbt and Snowflake to improve how data workflows are developed and operated. Dmitrii enjoys refining systems for reliability and sharing what he learns with the data engineering community.
Interested in Being a Champion?
The Astronomer Champions Program recognizes and supports leaders who are shaping the future of Apache Airflow through technical excellence, community engagement, and real-world impact.
Interested in becoming a Champion? Learn more here, and apply for the 2026 Cohort!