Understanding and Utilizing Data Aggregation

Digital Enterprise Society Podcast - A podcast by Digital Enterprise Society


Data aggregation compiles information from multiple databases into combined datasets ready for processing, and connecting humans to that data is the mission of today's guest. On this episode of The Digital Enterprise Society podcast, Craig Brown and Thom Singer welcome Rob Cooke, CTO and Founder of 3Forge and a lifelong lover of computers and computer science. Rob is constantly thinking about how computers work, and he joins the podcast for a discussion about data aggregation: what it is, what it isn't, how it can improve data exploration, and what the future of data aggregation could look like.

On today's podcast, you will learn:

Understanding data aggregation
- The goal of data aggregation is to take disparate sets of data and combine them into consumable datasets (see the first sketch below).
- The result allows humans to make usable meaning out of the data.
- The ability to aggregate data is a key part of data exploration.
- Data aggregation provides increased clarity and minimizes confusion around definitions.

The data aggregation elevator pitch
- The initial approach to data aggregation is to simply leave the data where it is.
- Transferring data from one location to another is rarely the best solution.
- A willingness to make the needed changes is essential to successful aggregation.
- Once data discovery is completed, it's time to move on to the next step.

Handling data from devices
- The massive amounts of data available today make things both easier and more complicated.
- The input of data, the ability to ask questions of your data and get answers back, and the evaluation of real-time streamed data are all essential.
- As data streams in, it is critical to process it and filter out what is important as quickly as possible (see the second sketch below).

The four V's of data
- Validity: sometimes the data is simply wrong, even if only for a short period of time.
- Variety: the variation that appears in data, often without explanation.
- Volume: the amount of available data has increased exponentially.
- Velocity: the speed at which data arrives is also increasing.
- The four V's interact, and each helps to clarify and refine data.
- Simulation can help with collecting, analyzing, and improving data.
- The next layer to focus on is being aware of data systems, connecting those systems, and running analysis on the data and feeding it back in.

Continue the conversation with us within the Digital Enterprise Society Community at DigitalEnterpriseSociety.org.

Digital Download: Virtual Round-Table Series
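As a rough illustration of the first point above (combining disparate sets of data into one consumable dataset), here is a minimal Python sketch. The record layouts, field names, and values are hypothetical and not taken from the episode; they simply show two sources being joined and summarized into something a human can read at a glance.

from collections import defaultdict

# Hypothetical records from two disparate sources: a CRM export and an
# order database. Field names and values are assumptions for illustration.
crm_contacts = [
    {"customer_id": "C001", "name": "Acme Corp"},
    {"customer_id": "C002", "name": "Globex"},
]
order_events = [
    {"customer_id": "C001", "amount": 120.0},
    {"customer_id": "C001", "amount": 75.5},
    {"customer_id": "C002", "amount": 300.0},
]

# Aggregate order amounts per customer, then join with the contact data
# to produce one consumable dataset.
totals = defaultdict(float)
for event in order_events:
    totals[event["customer_id"]] += event["amount"]

combined = [
    {"name": c["name"], "total_spend": totals[c["customer_id"]]}
    for c in crm_contacts
]
print(combined)
# [{'name': 'Acme Corp', 'total_spend': 195.5}, {'name': 'Globex', 'total_spend': 300.0}]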
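And a second minimal sketch for the streaming point under "Handling data from devices": filtering a simulated real-time feed down to the critical readings before any further aggregation. The sensor names, temperature field, and threshold are assumptions made purely for illustration.

import random
import time

def sensor_stream(n=10):
    """Simulate a stream of device readings (values are made up for illustration)."""
    for i in range(n):
        yield {
            "device": f"sensor-{i % 3}",
            "temperature": random.uniform(20.0, 110.0),
            "ts": time.time(),
        }

def critical_only(stream, threshold=90.0):
    """Pass through only the readings that cross the critical threshold,
    so downstream processing sees the important data first."""
    for reading in stream:
        if reading["temperature"] >= threshold:
            yield reading

for alert in critical_only(sensor_stream()):
    print(f"ALERT {alert['device']}: {alert['temperature']:.1f}")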
