In my previous column, I expanded on the meaning of data culture. Here, I reveal the critical elements that need to be in place to build a data culture, namely:
- Data
- Technology − helps transform data into insights
- People − drive actions that transform business
But it is the orchestration of all three elements that ensures data fulfils its purpose: improved actions through data-driven decision-making.
Data quality or data consistency?
Many a data project has crashed and burned on the rocks of data quality. Once inaccurate critical corporate data has been used and has caused damage to a user, the rest of the organisation will be reluctant to expose itself to the same risk.
Companies will find such an incident can set back a data initiative for years as people revert to source systems and Excel − which gives rise to multiple versions of the truth and increases the risk of human error. So, it is important to provide a trusted data store, as well as access to data that is fit for purpose.
It may be acceptable for a trend analysis to be 3% out, but financial reporting needs to be accurate to the decimal point.
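To make "fit for purpose" concrete, here is a minimal sketch in Python of what two different tolerance checks might look like. The figures, names and thresholds are hypothetical, purely for illustration:

```python
from decimal import Decimal

def within_tolerance(measured: float, expected: float, tolerance: float) -> bool:
    """True if the relative difference is within the given tolerance."""
    if expected == 0:
        return measured == expected
    return abs(measured - expected) / abs(expected) <= tolerance

# A trend figure may be allowed to drift by up to 3%...
trend_ok = within_tolerance(measured=10_250.0, expected=10_000.0, tolerance=0.03)

# ...but a financial total must match exactly, so compare precise decimals.
ledger_total = Decimal("1045678.23")    # hypothetical ledger figure
reported_total = Decimal("1045678.23")  # hypothetical reported figure
finance_ok = ledger_total == reported_total

print(trend_ok, finance_ok)  # True True
```

The point is that "quality" is not one number: each use case carries its own definition of accurate enough, and the trusted data store must make those definitions explicit.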
It isn't just about having volume, velocity or variety − also known as the three Vs of data. It is also necessary to examine which part of the data pipeline could add the most value to the existing estate and capabilities.
Taking data from its raw state to analytics-ready is complex, and businesses often underestimate the work involved.
More than ever, it is the quality of the data companies hold, or can gain access to, that makes or breaks data initiatives. Executives of all types need to understand and embrace this as a top issue for their organisations.
Higher-quality data equates to higher-quality results. Data is the lifeblood of every successful contemporary organisation; it is in everything they do.
The supply chain, from the point of production through to the final customer, is monitored, managed and sometimes predicted on the strength of data, with the internet of things latterly serving as an enabling tool. Analytics programmes also help uncover missed opportunities.
Understanding the customer and employee experience, or journey, demands that data be captured and used, possibly augmented by artificial intelligence (AI) built on those datasets for advantage. The bottom line is the world now spins on data.
However, it isn't just about the three Vs of data, or the big data movement of recent years. Nor is it just about having data scientists or a chief data officer; it is far more complex than that.
Data fabric and the use of catalogues
One of the big challenges we face in business intelligence (BI) is the time needed to respond to fast-changing business requests. This fragments the data culture, as business units start shadow BI projects and build silos to meet the need for more agile data.
Setting up the connections, transferring and transforming the data, and getting it into a production state via manual coding takes far more work than most businesses appreciate.
And once the data is set up to answer one set of questions, new flows and transformations must be built for the next. The resulting frustration is what drives shadow BI projects in business units, creating multiple versions of the truth and silos of competing data owners.
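To give a feel for the work involved, the sketch below shows the extract-transform-load steps for a single, trivially small source. The sales_raw.csv file, the column names and the target path are all hypothetical, invented for illustration; every line stands in for effort that multiplies across real sources, which is precisely what the automation technologies discussed below aim to remove:

```python
import pandas as pd

# Extract: in practice this is a connection to a source system (database,
# API, files), each needing its own setup, credentials and scheduling.
raw = pd.read_csv("sales_raw.csv")  # hypothetical source extract

# Transform: the unglamorous work that dominates pipeline effort.
clean = (
    raw
    .dropna(subset=["order_id"])           # drop rows with no usable key
    .assign(order_date=lambda d: pd.to_datetime(d["order_date"], errors="coerce"))
    .assign(amount=lambda d: pd.to_numeric(d["amount"], errors="coerce").fillna(0))
    .drop_duplicates(subset=["order_id"])  # enforce one row per order
)

# Load: publish an analytics-ready table to the trusted store
# (hypothetical target path; the directory must already exist).
clean.to_parquet("warehouse/sales_fact.parquet", index=False)
```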
This is a real challenge. Businesses seem to swing between enterprise data warehouse (EDW) projects, which inevitably run from months to years, and federated models where everybody does whatever is needed in the name of business agility − and the pendulum keeps swinging.
One of the biggest trends promising to bring agility to the EDW/data lake space is automation technology. Research into AI and automation has highlighted that 45% of manual work and coding can be automated.
This not only holds big savings for the tech organisation, but enables it to respond to specific business needs much faster. This, in turn, drives closer collaboration and partnership between IT and business, while not compromising the ultimate need for a single version of the truth.
Real-time data: Next business imperative?
For most use cases, data can be at rest: if we batch-load every 10 minutes, every day or at the end of the month, that is perfectly fine for reporting or analytics. Not all data is equal, and not all data needs to be active (real-time), but it is important to understand the difference.
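As a minimal sketch of what "at rest" means in practice, the loop below batch-loads on a fixed cadence; the load step itself is a hypothetical placeholder. Between runs the data simply sits still, so the worst-case staleness equals the interval:

```python
import time
from datetime import datetime, timezone

BATCH_INTERVAL_SECONDS = 600  # a 10-minute cadence; staleness is bounded by this

def load_batch() -> None:
    """Hypothetical placeholder: extract, transform and load anything new."""
    print(f"{datetime.now(timezone.utc).isoformat()} − batch loaded")

while True:
    load_batch()
    time.sleep(BATCH_INTERVAL_SECONDS)  # the data is at rest between runs
```

If the business question can tolerate that staleness, batch is the cheaper and simpler answer; if it cannot, that is the signal a real-time capability is needed.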
Real-time means different things to different people. It's all relative: within the window of time that best suits your needs, what do you want to accomplish? What business problem are you trying to solve?
Examples of business-imperative use cases include customer experience, credit risk, insurance approvals and, probably the most obvious one, fraud. Identify the use cases within the business where faster data will translate into savings and/or profit.
Due to the growing number of sources and increased bandwidth, we now have the opportunity to gain insight into business moments and react while the opportunity or threat is still live. Acting in the moment gives the business the edge to outperform its competitors, and in a hyper-competitive world, this is immensely valuable.
Often, the technologies we work with were designed in the batch era, so we need to consider whether an element of real-time capability should be added to the architecture.
Change data capture (CDC) is an ideal way to augment the current architecture and, just like the early bird, early adopters get the biggest benefit if they get it right.
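For illustration, here is a minimal polling-based sketch of the CDC idea, assuming a hypothetical source.db containing an orders table with order_id, amount and updated_at columns. Production CDC tools typically read the database transaction log instead, which avoids polling and also captures deletes:

```python
import sqlite3

def fetch_changes(conn: sqlite3.Connection, high_water_mark: str) -> list[tuple]:
    """Return rows changed since the last captured timestamp."""
    cur = conn.execute(
        "SELECT order_id, amount, updated_at FROM orders WHERE updated_at > ?",
        (high_water_mark,),
    )
    return cur.fetchall()

conn = sqlite3.connect("source.db")  # hypothetical source system
high_water_mark = "1970-01-01T00:00:00+00:00"  # start of time on the first run

changes = fetch_changes(conn, high_water_mark)
for order_id, amount, updated_at in changes:
    # push each change downstream (a queue, a stream, or the target table)
    print(order_id, amount, updated_at)

if changes:
    high_water_mark = max(row[2] for row in changes)  # advance the mark
```

The appeal of this pattern is that it moves only what has changed, so an existing batch architecture can be made near-real-time without being rebuilt.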
In my next column in this series, I will discuss how technology enables us to meet today's data demands.