Information empowers decision-making, and business relies more than ever on trustworthy information from its information systems.
At the same time, business models are expanding to ever more clients, suppliers, products, and transactions, leading to exponential data growth. Data quality is often poor, which can have a disastrous impact on business: incomplete or delayed process execution and economic loss.
A simple example
A bicycle dealer receives a customer order for a certain spare part, say a leaf chain. Unfortunately, the references in the product catalog are outdated, so a chain wheel is dispatched instead of a leaf chain. The consequence: a dissatisfied customer who has to go through the trouble of returning the product and may decide not to order again in the future. The dealer, for their part, faces a costly returns process and the loss of future business.
According to a study, 20% of people surveyed say they will not do business again with a company that has lost their personal data. But how does this translate into hard dollars? How much EBIT do companies really lose through poor data quality? Companies that already apply data quality solutions can gain valuable insights by identifying the economic risk associated with insufficient quality of master data and transactions. This can be achieved through the new concept of Value-Driven Data Quality Management (VD-DQM).
Quantifying the economic impact of poor data quality
VD-DQM is CAMELOT's approach to identifying and quantifying the economic impact of poor data quality in master data records.
The concept draws on elements of traditional Data Quality Management but focuses on transactions that use erroneous master data records. These records are flagged and categorized according to the associated risk, and the monetary impact is then measured.
The concept deep-dives into an organization’s active data, applying customized business data rules to filter, segment and analyze the most essential information.
VD-DQM is process-centric: it analyzes the core business processes that consume active master data and trigger the creation of new transactional documents across the organization's entire value chain. It operates by measuring KPIs linked to business rules, inspecting key data attributes to determine quality levels against business quality specifications.
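The rule-and-KPI mechanism can be sketched in a few lines of Python. This is a minimal illustration, not CAMELOT's actual implementation: the record fields, rule names, valid units, and order values are all hypothetical assumptions.

```python
from dataclasses import dataclass

# Hypothetical master data record; field names are illustrative only.
@dataclass
class MaterialRecord:
    material_id: str
    description: str
    unit_of_measure: str
    price: float

# Business rules as named checks on key data attributes (assumed examples).
RULES = {
    "description_present": lambda r: bool(r.description.strip()),
    "valid_unit": lambda r: r.unit_of_measure in {"EA", "KG", "M"},
    "positive_price": lambda r: r.price > 0,
}

def failed_rules(record):
    """Return the names of all business rules the record violates."""
    return [name for name, check in RULES.items() if not check(record)]

def value_at_risk(records, transactions):
    """Sum the value of transactions that consume flagged master data."""
    flagged = {r.material_id for r in records if failed_rules(r)}
    return sum(t["value"] for t in transactions if t["material_id"] in flagged)

materials = [
    MaterialRecord("M-100", "Leaf chain", "EA", 24.90),
    MaterialRecord("M-200", "", "EA", 0.0),  # violates two rules
]
orders = [
    {"material_id": "M-100", "value": 500.0},
    {"material_id": "M-200", "value": 1200.0},
]
print(failed_rules(materials[1]))        # ['description_present', 'positive_price']
print(value_at_risk(materials, orders))  # 1200.0
```

The point of the sketch is the linkage: each rule failure on a master data record propagates to every transaction that consumes it, which is what turns a data defect into a quantified economic exposure.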
VD-DQM provides extensive multivariable reports, primarily on economic risk and impact, but it also allows data defects to be clustered into quality dimensions such as completeness, conformity, consistency, integrity, accuracy, uniqueness, and timeliness.
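The clustering into quality dimensions can be illustrated with a simple mapping from rule failures to dimensions. The rule names and their dimension assignments below are assumed for illustration; an actual deployment would define its own rule catalog.

```python
# Hypothetical mapping from business-rule failures to quality dimensions.
DIMENSION_OF_RULE = {
    "description_present": "completeness",
    "valid_unit": "conformity",
    "positive_price": "accuracy",
}

def cluster_by_dimension(failures_per_record):
    """Count defects per quality dimension across all records."""
    counts = {}
    for failures in failures_per_record:
        for rule in failures:
            dim = DIMENSION_OF_RULE[rule]
            counts[dim] = counts.get(dim, 0) + 1
    return counts

print(cluster_by_dimension([
    ["description_present"],
    ["description_present", "positive_price"],
]))  # {'completeness': 2, 'accuracy': 1}
```

Aggregating defects this way yields the dimension-level view (e.g. "most of our economic exposure comes from completeness defects") that the reports described above are built on.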
After identifying potential defects and their associated risk and economic impact, VD-DQM helps initiate remediation processes on master data objects, improving an organization's data lifecycle management and preventing further data errors, which saves the time and money otherwise spent on reactive initiatives.
In summary, VD-DQM is a holistic concept for meeting current challenges in Data Quality Management. It enables decision-making based on monetary facts and helps organizations reduce the risk of data quality issues that cause operational inefficiency and economic loss. VD-DQM also prepares organizations for the era of digital transformation by updating traditional Data Quality Management models.
Cai, L. & Zhu, Y. (2015). The Challenges of Data Quality and Data Quality Assessment in the Big Data Era. Data Science Journal, 14, p. 2.