Filed under: Analytics, Business Intelligence, Data Governance, Data Quality, Performance Measures
This recent article about analytics suggests that IT executives worry too much about data quality. However, the provocative headline (well, at least provocative to me) is a conclusion drawn from survey results, and that interpretation may reflect some flawed thinking.
Filed under: Data Governance, Data Quality, Performance Measures
Thanks to my friends at Informatica, I have been able to allocate some time to organizing and documenting some ideas about understanding different types of business impacts related to data quality improvement and how they can be isolated, organized, measured, and communicated to a variety of stakeholders across the organization.
There are four papers:
Increasing Confidence and Satisfaction Through Improved Data Quality (link coming soon)
Last week I attended the Data Warehousing Institute’s World Conference in Las Vegas, teaching a class on Practical Data Quality Management. As part of the discussion on critical data elements, I suggested some alternatives for qualifying a data element as “critical,” including “presence on a published report.” In turn, I pointed out an interesting notion that often does not occur to many data analysts, but does require some attention: monitoring the quality of non-persistent data.
I have just finished a paper sponsored by Informatica titled “Understanding the Financial Value of Data Quality Improvement,” which looks at bridging the communications gap between technologists and business people regarding the value of data quality improvements. Here is a summary:
As opposed to the technical aspects of data validation and cleansing, often the biggest challenge in beginning a data quality program is effectively communicating the business value of data quality improvement. But using a well-defined process for considering the different types of costs and risks of low-quality data not only provides a framework for putting data quality expectations into a business context, it also enables the definition of clear metrics linking data quality to business performance. For example, it is easy to speculate that data errors impede up-selling and cross-selling, but to really justify the need for a data quality improvement effort, a more comprehensive quantification of the number of sales impacted or of the total dollar amount for the missed opportunity can be much more effective at showing the value gap.
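To make the point concrete, here is a minimal sketch of the kind of quantification described above. All names and figures are hypothetical, invented purely for illustration; a real analysis would draw these parameters from the organization's own sales and data-profiling numbers.

```python
# Hypothetical illustration: estimating the revenue "value gap" created when
# data errors block cross-sell or up-sell offers. Every figure is invented.

def missed_opportunity(total_prospects, error_rate, offer_rate, avg_deal_value):
    """Estimate the dollar value of offers blocked by unusable records."""
    blocked = total_prospects * error_rate      # records too flawed to contact
    missed_offers = blocked * offer_rate        # offers that would have converted
    return missed_offers * avg_deal_value       # revenue left on the table

# Example: 100,000 prospects, 8% with unusable contact data,
# 5% historical conversion rate, $250 average sale.
gap = missed_opportunity(100_000, 0.08, 0.05, 250)
print(f"Estimated value gap: ${gap:,.2f}")  # Estimated value gap: $100,000.00
```

Even a rough model like this turns "data errors impede cross-selling" into a dollar figure that can be compared against the cost of the improvement effort.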
This article looks at different classifications of financial impacts and corresponding performance measures that enable a process for evaluating the relationship between acceptable performance and quality information. It is targeted at analysts looking to connect high-quality information to optimal business performance in order to make a quantifiable case for data quality improvement.
Filed under: Business Impacts, Business Intelligence, Data Analysis, Data Quality, Master Data, Metrics, Performance Measures
It would be unusual for there to be a company that does not use some physical facility from which business is conducted. Even the leaders and managers of home-based and virtual businesses have to sit down at some point, whether to access the internet, make a phone call, check email, or pack an order and arrange for its delivery. Consequently, every company eventually must incur some overhead and administrative costs associated with running the business, such as rent and facility maintenance, as well as telephones, internet, furniture, hardware, and software purchase/leasing and maintenance.
Today’s thoughts are about that last item: the costs associated with building, furniture, machinery, software, and grounds maintenance. Effective asset maintenance requires a balance – one would like to optimize the program so that the most judicious allocation of resources provides the longest lifetime for acquired or managed assets.
As an example, how often do offices need to be painted? When you deal with one or two rooms, that is not a significant question, but when you manage a global corporation with hundreds of office buildings in scores of countries, the “office painting schedule” influences a number of other decisions regarding bulk purchasing of required materials (e.g. paint and brushes), competitive engagement of contractors to do the work, temporary office space for staff as offices are being painted, etc., all of which present wide opportunities for cost reduction and increased productivity.
And data quality fits in through the data associated with both the inventory of assets requiring maintenance and the information used to manage the maintenance program. In fact, this presents an interesting master data management opportunity, since it involves the consolidation of a significant amount of data from potentially many sources regarding commonly-used and shared data concepts such as “Asset.” The “Asset” concept can be hierarchically organized in relation to the different types of assets, each of which exists in a variety of representations and each of which is subject to analysis for maintenance optimization. Here are some examples:
- Fixed assets (real property, office buildings, grounds, motor vehicles, large manufacturing machinery, other plant/facility items)
- Computer assets (desktops, printers, laptops, scanners)
- Telephony (PBX, handsets, mobile phones)
- Furniture (desks, bookcases, chairs, couches, tables)
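The hierarchical organization of the “Asset” concept described above can be sketched as a simple type tree. This is a minimal illustration, not a master data model; the class and method names are my own invention.

```python
from dataclasses import dataclass, field

# Minimal sketch of a hierarchical "Asset" concept: each asset type can
# contain subtypes, mirroring the categories listed above. Names are
# illustrative only.

@dataclass
class AssetType:
    name: str
    subtypes: list["AssetType"] = field(default_factory=list)

    def add(self, name: str) -> "AssetType":
        """Attach a child asset type and return it for further nesting."""
        child = AssetType(name)
        self.subtypes.append(child)
        return child

    def all_names(self) -> list[str]:
        """Walk the hierarchy depth-first, collecting every type name."""
        names = [self.name]
        for sub in self.subtypes:
            names.extend(sub.all_names())
        return names

# Build a fragment of the hierarchy from the examples above.
asset = AssetType("Asset")
fixed = asset.add("Fixed assets")
fixed.add("Office buildings")
fixed.add("Motor vehicles")
computer = asset.add("Computer assets")
computer.add("Laptops")
```

Consolidating many source systems into one shared hierarchy like this is exactly where the master data management opportunity lies: every maintenance analysis then runs against a single agreed-upon representation of “Asset.”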
I think you see where I am going here: errors in asset data lead to improper analyses with respect to maintenance of those assets, such as arranging for a delivery truck’s oil to be changed twice in the same week, or painting some offices twice in a six-month period while other offices remain unpainted for years. Therefore, there is a direct dependence between the quality of asset data and the costs associated with asset maintenance.