New Paper: Understanding the Financial Value of Data Quality Improvement

February 2, 2011
Filed under: Data Quality, Metrics, Performance Measures 

I have just finished a paper sponsored by Informatica titled “Understanding the Financial Value of Data Quality Improvement,” which looks at bridging the communications gap between technologists and business people regarding the value of data quality improvements. Here is a summary:

Often the biggest challenge in beginning a data quality program is not the technical work of data validation and cleansing but effectively communicating the business value of data quality improvement. A well-defined process for considering the different types of costs and risks of low-quality data not only provides a framework for putting data quality expectations into a business context, it also enables the definition of clear metrics linking data quality to business performance. For example, it is easy to speculate that data errors impede up-selling and cross-selling, but to justify a data quality improvement effort, quantifying the number of sales affected or the total dollar value of the missed opportunities is far more effective at exposing the value gap.
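To make that kind of quantification concrete, here is a minimal sketch in Python; the function and all figures below are hypothetical illustrations, not numbers from the paper:

```python
# A hypothetical estimate of the revenue missed when data errors
# block up-sell or cross-sell offers. All inputs are illustrative.

def missed_opportunity_value(total_offers: int,
                             error_rate: float,
                             conversion_rate: float,
                             avg_deal_value: float) -> float:
    """Estimate annual revenue lost to data errors that block offers.

    total_offers    -- offers attempted per year
    error_rate      -- fraction of offers blocked by bad customer data
    conversion_rate -- fraction of clean offers that would have converted
    avg_deal_value  -- average revenue per converted offer
    """
    blocked_offers = total_offers * error_rate
    return blocked_offers * conversion_rate * avg_deal_value

# Example: 100,000 offers/year, 4% blocked by data errors,
# 12% conversion rate, $250 average deal value.
print(missed_opportunity_value(100_000, 0.04, 0.12, 250.0))  # 120000.0
```

Even a rough figure like this ($120,000 per year in the illustration) gives the business side a concrete value gap to weigh against the cost of the improvement effort.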

The paper examines different classifications of financial impacts and the corresponding performance measures, which together enable a process for evaluating the relationship between acceptable performance and quality information. It is targeted at analysts looking to connect high-quality information with optimal business performance in order to make a quantifiable case for data quality improvement.
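The paper's actual impact taxonomy is not reproduced in this summary, but as an illustration of the idea, a simple scorecard might roll measured error rates up into an estimated annual impact per category. Everything below (the categories, metrics, and dollar figures) is hypothetical:

```python
# A hypothetical scorecard linking data quality metrics to financial
# impact categories. The categories and figures are illustrative only.

from dataclasses import dataclass

@dataclass
class ImpactMeasure:
    category: str            # e.g. "revenue", "cost", "risk"
    metric: str              # the data quality metric being tracked
    annual_exposure: float   # estimated dollars at stake per year
    error_rate: float        # measured rate of records failing the metric

def estimated_impact(measures: list[ImpactMeasure]) -> dict[str, float]:
    """Roll up estimated annual financial impact per category."""
    totals: dict[str, float] = {}
    for m in measures:
        totals[m.category] = totals.get(m.category, 0.0) \
                             + m.annual_exposure * m.error_rate
    return totals

measures = [
    ImpactMeasure("revenue", "invalid customer contact data", 2_000_000, 0.03),
    ImpactMeasure("cost",    "duplicate customer records",      500_000, 0.08),
    ImpactMeasure("risk",    "incomplete regulatory fields",  1_000_000, 0.01),
]
print(estimated_impact(measures))
# {'revenue': 60000.0, 'cost': 40000.0, 'risk': 10000.0}
```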

Quality of Master Data and Data Governance Maturity

November 9, 2010
Filed under: Data Quality, Master Data, Performance Measures 

For a current project I am looking at performance criteria associated with implementing data governance for master data management. One aspect is defining performance measures relating to the lifecycle of master data, so one area under consideration is the quality of master data, which so far involves two aspects:

1) The extent to which the data in a master repository meets the needs of the downstream consumers, and

2) The degree to which the data in the systems of entry meets the needs of the downstream consumers.

To some extent, these focus on the same requirements, but the difference lies in where data validation is overseen. Compliance of the master data with defined business rules can be assessed after the fact – that is, after the data has been integrated into a unified master view. Enforcing compliance with those rules before the data has been “integrated” into the unified view, however, implies a more comprehensive agreement regarding enterprise validation: each owner of a system of entry must agree to validate the data against the enterprise requirements, not just his or her own application’s expectations. That, in turn, implies a more sophisticated level of data governance maturity across the organization.
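As a rough illustration of these two oversight points, here is a minimal Python sketch (the rules and field names are hypothetical) showing the same enterprise rule set applied both at a system of entry and, after the fact, against the unified master view:

```python
# Hypothetical sketch: one shared enterprise rule set, applied at two
# points -- before integration (system of entry) and after (master view).

from typing import Callable

Record = dict[str, str]
Rule = Callable[[Record], bool]

# Enterprise-wide rules every system of entry agrees to enforce,
# not just each application's local expectations.
ENTERPRISE_RULES: list[Rule] = [
    lambda r: bool(r.get("customer_id")),           # identifier present
    lambda r: "@" in r.get("email", ""),            # minimally well-formed email
    lambda r: len(r.get("country_code", "")) == 2,  # two-letter country code
]

def failed_rules(record: Record) -> list[int]:
    """Return the indexes of the enterprise rules a record violates."""
    return [i for i, rule in enumerate(ENTERPRISE_RULES) if not rule(record)]

# At a system of entry: reject or flag before the record reaches the hub.
candidate = {"customer_id": "C-1001", "email": "pat@example.com",
             "country_code": "US"}
assert failed_rules(candidate) == []

# After integration: audit the unified master view with the same rule set.
master_view = [candidate,
               {"customer_id": "", "email": "bad", "country_code": "USA"}]
for rec in master_view:
    violations = failed_rules(rec)
    if violations:
        print("non-compliant master record:", rec, "rules:", violations)
```

The design point is that both validation paths share one rule set; the governance question is purely who enforces it, and when.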

As I flesh out the performance criteria, I will update this site with more information. In the meantime, you can read more about data quality maturity by following the link to the free chapter from my book.