Rerelease of #150 3 Years in, Data Mesh at eDreams: Small Data Products, Consumer Burden, and Iterating to Success, Oh My! - Interview w/ Carlos Saona
Data Mesh Radio - Podcast by Data as a Product Podcast Network
Due to health-related issues, we are on a temporary hiatus for new episodes. Please enjoy this rerelease of episode 150 with Carlos Saona. eDreams' approach is unique and interesting because they developed it essentially on their own, so there are a ton of useful learnings to consider if they are the right fit for your own organization.

Sign up for Data Mesh Understanding's free roundtable and introduction programs here: https://landing.datameshunderstanding.com/

Please rate and review us on your podcast app of choice!

If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here.

Episode list and links to all available episode transcripts here.

Provided as a free resource by Data Mesh Understanding / Scott Hirleman. Get in touch with Scott on LinkedIn if you want to chat data mesh.

Transcript for this episode (link) provided by Starburst. See their Data Mesh Summit recordings here and their great data mesh resource center here. You can download their Data Mesh for Dummies e-book (info gated) here.

Carlos' LinkedIn: https://www.linkedin.com/in/carlos-saona-vazquez/

In this episode, Scott interviewed Carlos Saona, Chief Architect at eDreams ODIGEO.

As a caveat before jumping in, Carlos believes it's too hard to say their experience or learnings will apply to everyone, or that he necessarily recommends anything they have done specifically, but he has learned a lot of very interesting things to date. Keep that perspective in mind when reading this summary.

Some key takeaways/thoughts from Carlos' point of view:

- eDreams' implementation is quite unique in that they worked on it without being in contact with other data mesh implementers for most of the last 3 years - until just recently. So they have learnings from non-typical approaches that are working for them.
- You should not look to create a single data model upfront. That's part of what has caused such an issue for the data warehouse - it's inflexible and doesn't really end up fitting needs. But you should look to iterate towards that standard model as you learn more and more about your use cases.
- ?Controversial?: Look to push as much of the burden as is reasonable onto the data consumers. That means the stitching between data products, the compute costs of consuming, etc. They get the benefit, so they should be taking on the burden. Things like data quality are still on the...