The shift to a unified data platform is reshaping how pharmaceutical companies manage and orchestrate data. Establishing standards across regions and teams ensures scalability and efficiency in handling large-scale analytics.
In this episode, Evgenii Prusov, Senior Data Platform Engineer at Daiichi Sankyo Europe GmbH, joins us to discuss building and scaling a centralized data platform with Airflow and Astronomer.
Key Takeaways:
00:00 Introduction.
02:49 Building a centralized data platform for 15 European countries.
05:19 Adopting SaaS to manage Airflow from day one.
07:01 Leveraging Airflow for data orchestration across products.
08:16 The challenges of teaching non-Python users to work with Airflow.
12:25 Creating a global data community across Europe, the US and Japan.
14:04 Monthly calls help share knowledge and align regional teams.
15:47 Contributing to the open-source Airflow project as a way to deepen expertise.
16:32 Desire for more guidelines, debugging tutorials and testing best practices in Airflow.
Resources Mentioned:
Daiichi Sankyo Europe GmbH website
Thanks for listening to “The Data Flowcast: Mastering Apache Airflow® for Data Engineering and AI.” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.
Interested in being a guest on The Data Flowcast? Fill out the form and we will be in touch.
Try Astro today and get up to $500 in free credits during your 14-day trial.