Businesses need to rely on their data to make decisions, execute marketing campaigns, run analyses, and manage supply chains and inventory. Poor-quality, badly populated information can end up costing millions.
But, unfortunately, few companies give thought to the quality and integrity of their data until it's called into question - when a marketing campaign falls flat or a business intelligence project fails to get off the ground - usually long after significant time, money and resources have been invested.
This is according to Paul Morgan, managing director of ASYST Intelligence, a focused provider of strategic business intelligence and data management solutions.
"Organisations often operate with blinkers on because the information in their systems is incorrect or badly populated. At the end of the day, doubtful, deficient data means that management information is unreliable and can`t be trusted. So, any business decisions made based on that information may prove to be harmful," he says.
Inaccurate or incorrect data can in fact damage every aspect of a business. On the customer side, outdated or incomplete information could mean a failed marketing campaign or a forgotten customer. For instance, if a customer has moved and the address has not been updated in the system, bills and direct communication don't reach the right people - affecting debt collection and marketing initiatives.
In the supply chain, poor product data can hamper production and the delivery of orders. Duplicate information also costs money: when three mailers are sent to the same person because the name is recorded in three different formats on a system, or when more than one product code exists for the same item and a company re-orders stock it already holds.
Inaccurate data can also land companies on the wrong side of the law. A financial services institution, for example, can find itself in contravention of certain regulatory requirements if its data is inaccurate or incomplete.
"Bad data simply costs money," quips Morgan.
Companies spend huge amounts of money on all kinds of systems and solutions, but in order to get value out of these systems, Morgan says they must be able to rely on their enterprise data. As such, data quality should be top of mind for any company that wants to see a return on its investment in enterprise resource planning systems, business intelligence solutions and the like.
"In fact, data quality should be a priority for any organisation that has information stored in multiple systems or divisions across the business, or for businesses that deal with large volumes of information that needs to be consolidated and contextualised, whether it be for the purposes of analyses or simply make contact with customers.
"Companies need to get their data quality management right before they can even begin to extract what they need out of the information.
"This means ensuring their data is rich, complete, properly populated and in the right place. At the end of the day, garbage in means they will get garbage out," adds Morgan.
Morgan says bad-quality data is the result of poor data capturing, poorly integrated and disparate technology systems, and badly designed, planned and implemented business and operational processes and standards.
"Unfortunately, it is not always easy or possible to control data entering a system. People both within an organisation and customers, as data producers, are probably the main causes of poor quality data. People can be lazy and tend to do the bare minimum when it comes to capturing data. Take someone entering personal information on an insurance Web site, for example. They only fill in the manual fields, which results in missing data that might be necessary for the insurance provider to do analyses at a later stage.
"Then there is the issue of human error and lack of user training," explains Morgan.
So what can companies do to clean and protect the integrity of their data?
Morgan says data quality tools can help ease the headache of managing information quality within a company. These tools allow companies to build rules that define data quality, and validation sequences can be created to prompt for data as it is captured, controlling the quality of the information entering a system. They also enable companies to consolidate and contextualise information stored in different systems in different parts of the business. Once the data has been through a data quality process, ad hoc checks can be run regularly, using pattern-matching techniques, to highlight where information is missing, duplicated or incomplete.
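As a rough illustration of the kind of rule-based checking Morgan describes - the article names no specific product, and the field names and rules below are hypothetical - a minimal Python sketch might validate records as they are captured and then use simple normalisation and pattern matching to flag missing or duplicate customer entries:

```python
import re

# Hypothetical customer records; the fields and values are illustrative only.
customers = [
    {"id": 1, "name": "Mr J. Smith", "email": "j.smith@example.com", "postal_code": "2196"},
    {"id": 2, "name": "John Smith", "email": "J.SMITH@EXAMPLE.COM", "postal_code": ""},
    {"id": 3, "name": "SMITH, JOHN", "email": "j.smith@example.com", "postal_code": "2196"},
]

EMAIL_PATTERN = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def validate(record):
    """Apply simple capture-time rules; return a list of problems found."""
    problems = []
    if not record.get("postal_code"):
        problems.append("missing postal code")
    if not EMAIL_PATTERN.match(record.get("email", "")):
        problems.append("invalid email address")
    return problems

def normalise_name(name):
    """Crude standardisation so 'Mr J. Smith' and 'SMITH, JOHN' can be compared."""
    name = re.sub(r"\b(mr|mrs|ms|dr)\.?\b", "", name, flags=re.IGNORECASE)
    parts = re.split(r"[,\s.]+", name.lower())
    return " ".join(sorted(p for p in parts if p))

def find_duplicates(records):
    """Flag pairs of records whose normalised name or lower-cased email match."""
    seen = {}
    duplicates = []
    for record in records:
        key = (normalise_name(record["name"]), record["email"].lower())
        for existing_key, existing_id in seen.items():
            if key[0] == existing_key[0] or key[1] == existing_key[1]:
                duplicates.append((existing_id, record["id"]))
                break
        seen[key] = record["id"]
    return duplicates

for customer in customers:
    for problem in validate(customer):
        print(f"record {customer['id']}: {problem}")

print("possible duplicates:", find_duplicates(customers))
```

Run against the three sample records, the sketch reports the missing postal code on the second record and flags the second and third entries as possible duplicates of the first - the same kind of finding a commercial data quality tool would surface at much larger scale.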
"It`s basic housekeeping. Just like you wash your clothes and dust your windowsills regularly, so companies need to wash and refresh their data on a regular basis," comments Morgan.
Companies, he says, should look for a data quality solution with advanced matching and standardisation capabilities that will allow them to analyse, clean and standardise data across all platforms.
"Ideally, they should commission data quality consultants to conduct an analysis of why and where the quality of their data is being compromised, and advise them on selecting the most appropriate technology to address the problem," he concludes.