Filed under: Business Impacts, Business Intelligence, Data Analysis, Data Quality
The Practitioner’s Guide to Data Quality Improvement has been included in the IT feed at Alltop!
Filed under: Business Impacts, Data Quality, Master Data, Metadata
A few months back I shared a post about properly scoping a master data management activity to focus on a smaller subset of business activities that (a) are impacted by the absence of a unified view of data or (b) can be measurably improved through a unified view of data. Part of actualizing that unified view involves selecting the right business activities and then developing a better understanding of their data needs.
Filed under: Business Impacts, Data Quality, Identity Resolution, Master Data
Yesterday I shared some thoughts about the differences between data validity and data correctness, and why validity is a good start but ultimately is not the right measure for quality. Today I am still ruminating about what data correctness or accuracy really means.
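The distinction between validity and correctness can be made concrete with a small sketch. The example below is hypothetical: a US ZIP code passes a simple format rule (so it is *valid*), yet it does not match the independently verified value for that customer (so it is not *correct*).

```python
import re

def is_valid_us_zip(zip_code: str) -> bool:
    """Validity check: does the value match the expected 5-digit format?"""
    return re.fullmatch(r"\d{5}", zip_code) is not None

# Hypothetical verified truth for one customer record.
verified_zip = "02138"

# The value actually stored in the system.
recorded_zip = "90210"

print(is_valid_us_zip(recorded_zip))   # True  -> the value is valid
print(recorded_zip == verified_zip)    # False -> but it is not correct
```

The point of the sketch is that a validity rule can only reject malformed values; it cannot tell you whether a well-formed value reflects reality.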
For example, I have been thinking for a long time about the existence (or, more accurately, nonexistence) of benchmarks for data quality methods and tools, especially when it comes to data accuracy. I often see both vendors and their customers reporting “accuracy percentages” (e.g., “our customer data is 99% accurate”), and I wonder what is meant by accuracy and how those percentages are calculated and verified.
As an author, I get royalty payments for each of the books that I have written (more precisely, I get royalties when people actually buy the books). And twice a year, I get a statement that details the accounting for the book sales (accompanied by a check for a relatively small amount of money – buy more books!). Since I am prolific and have written a number of books, my royalty statement lists each book in each publication format (e.g., print vs. electronic), the number of copies sold, and what my royalty amount is.
Filed under: Business Impacts, Data Governance, Data Quality
Coincidentally, similar issues came up with different clients over the last week or so, both focusing on ensuring the quality of elements of a specific data domain. In both environments, information flowed in from a number of source systems across the organization, usually starting with some customer-facing process and, in other cases, with data feeds coming from external sources. What was obvious was that as these different systems had come online, there had been little documentation of how the processes acquired, read, or modified the data. As a result, when an error occurred, it manifested itself in a downstream application, and it took a long time to figure out where the error had originated and how it was related to the negative impact(s).