I just finished reading a very interesting book on the evolution of Prohibition in the US in the late 1800s and early 1900s. The book, “Last Call” by Daniel Okrent, follows the temperance movement from its start with a group of men pledging to stop drinking, through its alignment with the women’s suffrage movement, to the passage of the Prohibition amendment and its eventual repeal. One revelation to me was that, according to the author, the political processes that enabled the passage of Prohibition essentially created the modern methods of political lobbying and the ability of minority parties to significantly sway majority rule. It also suggests that when you mandate behavioral changes, you should keep four things in mind:
- Your value proposition must be appealing enough to convince those you are trying to regulate that it is in their best interests to comply;
- You should ensure that you have adequate resources for inspection, monitoring, and enforcement;
- You should not allow so many loopholes that ad hoc classes or parties can blatantly evade compliance; and
- You should not reward illicit behavior.
I am honored to be one of the co-moderators of the upcoming TDWI Solution Summit on Master Data, Quality, and Governance, to be held March 4-6 in Savannah, GA. When I attended one of these events in the past, I met a lot of people engaged in launching an MDM project; this year, I have helped to find case studies that treat master data as an outcome of good data quality and governance practices, not the driver. Please go to the web site and check it out, then apply to be a delegate! Hope to see you there!
Just got back from the Data Governance and Data Quality conference in San Diego, and here are some quick reflections:
If you believe that there is business value in creating a unified view of a particular data concept and therefore create a master data asset for it, what would the first step be? I suggest that you would need to propose the creation of the master data domain to a data governance committee, who would evaluate the costs and benefits prior to initiating the sequence of tasks needed to materialize a master repository.
I made up that word, and it stands for “funding conundrum.” I actually started thinking about this after my other recent post on data quality and dieting. It is the problem of establishing continuity in good data quality practices long after the original issues have been resolved.
Let’s say that our company is directly impacted by a number of data quality issues where errors lead to unexpected or undesirable results. After quantifying the negative impact and doing a cost/benefit analysis for an investment in data quality consultation, processes, and tools, we are able to determine that there is value in finding and fixing the failed processes that introduce errors into the environment. We are good to go…
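To make that cost/benefit step concrete, here is a minimal sketch of the kind of back-of-the-envelope calculation involved. All of the figures, rates, and function names below are illustrative assumptions of mine, not numbers from any actual analysis:

```python
# Hypothetical cost/benefit sketch for a data quality investment.
# Every figure here is an invented assumption for illustration only.

def annual_error_cost(error_rate: float, records_per_year: int,
                      cost_per_error: float) -> float:
    """Estimated yearly cost of errors flowing through a process."""
    return error_rate * records_per_year * cost_per_error

def net_benefit(current_cost: float, residual_cost: float,
                investment: float, years: int) -> float:
    """Savings over the planning horizon minus the one-time investment."""
    return (current_cost - residual_cost) * years - investment

# Assumed: 1M records/year, 2% error rate today, $5 impact per error.
baseline = annual_error_cost(0.02, 1_000_000, 5.0)    # $100,000/year
# Assumed: remediation cuts the error rate to 0.5%.
improved = annual_error_cost(0.005, 1_000_000, 5.0)   # $25,000/year

# A $150,000 investment evaluated over a 3-year horizon.
print(net_benefit(baseline, improved, 150_000, 3))    # 75000.0
```

Under these made-up assumptions the investment clears by $75,000 over three years, so "we are good to go" — but note that the calculation only justifies the initial fix, which is exactly why sustaining funding afterward is the conundrum.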