In my previous article, I outlined the challenges faced by data leaders today. Here I would like to explore the difficulties involved in timeously getting information to the correct target.
Many 2023 business intelligence trend predictions point to IT's historic focus on providing the right information to the right user at the right time. The importance of this goal has not diminished; in fact, it is more relevant than ever. But in a fragmented data world where data is distributed across multiple platforms, time is scarce and skilled resources are scarcer still, it is tougher to achieve.
Fortunately, we do not have to get all the data to all the people all the time. Having the right slices of data at the right time is more useful. Moreover, not every insight has to be arrived at through user exploration. Many insights, delivered straight from the data, can be more prescriptive and recommendation-oriented.
Data storytelling has been touted as the way to make data make sense to users; stories can reach people emotionally and compel them to act when data alone does not. Telling those stories to different audiences does mean the foundation must be robust.
The more pertinent question is: how do we create this single point of truth? I first heard the phrase a quarter of a century ago, yet heard it again very recently. The bottom line is that this is not a new challenge. Fortunately, we can go back and learn from the past.
In the past, our challenges were different. Disk space was expensive, so we had to find a way to store data only once, from where any reporting tool could read the one source of truth: the data warehouse. Even the creation of a data warehouse had its challenges.
Today, the key concepts of a landing area, a staging area and finally the data layer (store), from which we build reports and dashboards, have stuck and become a mainstay of the data pipelines found in most organisations. That said, many businesses find their data pipelines under pressure due to the sheer number of data sources that must be fed into landing environments such as the data warehouse/cloud data warehouse or data lake.
The concept remains the same even with new storage technologies and massive data volumes. The goal is to get meaningful, timeous insights out of the landing/staging areas while moving storage and infrastructure costs to the cloud, where rapid deployment and automation can thrive and ultimately drive down total cost of ownership, all while delivering those insights in record time.
Building these environments did come at a price, as getting the data into an analytics-ready state was a labour-intensive exercise. Today, we have three elements from which to derive business value: data, technology and people.
Focusing for a moment on technology, we can see how it has advanced in processing data over time. In today's world, modernised data integration technologies allow the building of data pipelines (the right data slice to the right audience). The technology handles the automation and creates the landing and staging areas, plus the data layer (store).
These capabilities exist today because humans understood why these areas needed to be created; namely, to address the need within a process to arrive at a conformed data layer (store).
The purpose of the landing area is to form a pre-staging layer, where tables receive batch loads from multiple source systems. The staging area is the layer that holds cleansed and standardised data before it is loaded to the final target.
These areas are key in the extract, transform and load (ETL) process. With the advancement of technology and human experience, this entire process is being automated as we move further into a world of no-code or low-code capabilities, using technologies to do what they do best: automating repetitive tasks.
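To make the landing-staging-store flow concrete, here is a minimal sketch of the pattern using an in-memory SQLite database. The table and column names (landing_sales, staging_sales, sales_store) are illustrative assumptions, not a reference to any specific product; real pipelines would use the organisation's own integration tooling.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Landing area: raw batch loads from source systems, stored as received
# (untyped amounts, inconsistent casing, missing values included).
conn.execute("CREATE TABLE landing_sales (source TEXT, amount TEXT, region TEXT)")
conn.executemany(
    "INSERT INTO landing_sales VALUES (?, ?, ?)",
    [("crm", "100.50", " north "), ("erp", "200", "SOUTH"), ("crm", None, "north")],
)

# Staging area: cleansed and standardised data before loading to the
# final target (cast amounts, trim/lower-case regions, drop null amounts).
conn.execute(
    """CREATE TABLE staging_sales AS
       SELECT source,
              CAST(amount AS REAL) AS amount,
              LOWER(TRIM(region)) AS region
       FROM landing_sales
       WHERE amount IS NOT NULL"""
)

# Data layer (store): conformed, analytics-ready data for reports
# and dashboards.
conn.execute(
    """CREATE TABLE sales_store AS
       SELECT region, SUM(amount) AS total_amount
       FROM staging_sales
       GROUP BY region"""
)

for row in conn.execute(
    "SELECT region, total_amount FROM sales_store ORDER BY region"
):
    print(row)
```

Each layer only ever reads from the one before it, which is what lets integration tools automate the repetitive cleansing and loading steps while the data layer remains the single point of truth for consumers.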
As technologies become part of business ecosystems, having them maintain the standard processes will become the norm, allowing skilled humans to focus on non-standard enhancements and configurations.
Organisations need to understand that the latest next-generation technologies facilitate the fast-tracking of 'serving the right data slice to the right person at the right time' in a governed manner.
Once that fundamental is grasped, it follows that the key principles still remain: keep the concepts of a landing area, a staging area and a data layer (store) intact. It is because of this that we can serve the right data to the right consumers.