Getting data from point A to point B requires data pipelines, whether that's loading Facebook ad data into Redshift or adding a third-party data set to a data lake. Popular sources like Salesforce and GitHub house a lot of data, not all of which is needed to answer a company's business questions, so moving all of it indiscriminately creates a lot of "noise." Our answer for getting focused, valuable data: reusable "recipes."
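To make the idea concrete, here's a minimal sketch of what a reusable "recipe" step might look like: a parametrized extract that keeps only the fields a business question needs instead of copying an entire source. The function and field names below are hypothetical illustrations, not Astronomer's actual API.

```python
# Hypothetical sketch: a reusable extract "recipe" that trims noisy
# source records down to only the fields the business question needs.

def extract_fields(records, fields):
    """Keep only the requested fields from each source record."""
    return [{f: r[f] for f in fields if f in r} for r in records]

# Example: a Salesforce-like payload trimmed to two relevant columns.
raw = [
    {"id": 1, "account": "Acme", "stage": "Closed Won", "notes": "..."},
    {"id": 2, "account": "Globex", "stage": "Prospecting", "notes": "..."},
]
focused = extract_fields(raw, ["account", "stage"])
print(focused)
```

Because the step is parametrized rather than hard-coded to one source, the same recipe can be reused across pipelines: only the record source and field list change.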
In this episode, RBK calls in one of our data engineers to explain what recipes are and how they work:
Have more questions? Ask RBK at email@example.com. He'll get you an answer—and it might even end up on the next episode.
Ready to build your data workflows with Airflow?
Astronomer is the data engineering platform built by developers, for developers. Send data anywhere with automated Apache Airflow workflows, built in minutes.