Filed under: Business Intelligence, Data Analysis, Data Governance, Data Integration, information strategy, Metadata
Although in practice the dividing lines for data management are not always well defined, it is possible to organize the different aspects of information management into a virtual stack that suggests the interfaces and dependencies across functional layers, which we will examine from the bottom up.
Filed under: Business Intelligence, Data Analysis, Data Integration, Metadata, Recommendations, Replication
Over time, organizations have employed a variety of strategies for managing data assets in accordance with the specific needs of the different business applications in operation. For the most part, applications were designed to achieve specific objectives within each business function. Correspondingly, any data necessary for the business function would be managed locally, while any data deemed critical to the organization would be subsumed into a centralized repository.
Yet this approach to centralizing data has come under scrutiny.
Filed under: Business Intelligence, Data Analysis, Data Governance, Data Integration, Data Quality
In my last post, we started to discuss the need for fundamental processes and tools for institutionalizing data testing. While software development practice has embraced testing as a critical gating factor for releasing newly developed capabilities, this testing often centers on functionality, sometimes to the exclusion of a broad-based survey of the underlying data assets to ensure that data values did not (or will not) change incorrectly as a result.
In fact, the need for testing existing production data assets goes beyond the scope of newly developed software. Modifications are constantly applied within an organization: acquired applications are upgraded, internal operating environments are enhanced and updated, additional functionality is turned on and deployed, hardware systems are swapped in and out, and internal processes change. Yet our ability to verify that the interoperable components that create, touch, or modify data remain unaffected is limited. Maintaining consistency across the application infrastructure is daunting enough, let alone assuring consistency in the information results.
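One lightweight way to institutionalize this kind of data testing is to profile a production data asset before a change (an upgrade, a hardware swap, a process change) and assert that the profile still holds afterward. The sketch below is a minimal illustration, not a prescribed tool; the record layout, column names, and choice of statistics are all hypothetical:

```python
import hashlib
import json

def profile(records, key_columns):
    """Capture simple summary statistics for a data asset:
    row count, per-column null counts, and an order-insensitive
    checksum over the key columns."""
    null_counts = {c: sum(1 for r in records if r.get(c) is None)
                   for c in key_columns}
    digest = hashlib.sha256()
    # Sort serialized rows so the checksum ignores row ordering.
    for row_key in sorted(json.dumps([r.get(c) for c in key_columns],
                                     default=str)
                          for r in records):
        digest.update(row_key.encode())
    return {"rows": len(records),
            "nulls": null_counts,
            "checksum": digest.hexdigest()}

def assert_unchanged(baseline, current):
    """Fail loudly if the data asset drifted after a change."""
    for field in ("rows", "nulls", "checksum"):
        if baseline[field] != current[field]:
            raise AssertionError(f"data drift detected in {field}: "
                                 f"{baseline[field]!r} != {current[field]!r}")

# Usage: profile before the environment change, re-profile after, compare.
before = profile([{"id": 1, "amount": 10.0}, {"id": 2, "amount": None}],
                 ["id", "amount"])
after = profile([{"id": 1, "amount": 10.0}, {"id": 2, "amount": None}],
                ["id", "amount"])
assert_unchanged(before, after)
```

In practice the baseline profile would be persisted (and the statistics drawn from the organization's data quality expectations), so that any upgrade or infrastructure change can be gated on the comparison passing.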
After reading Jay Stanley’s ACLU article on “Eight Problems with Big Data,” it is worth reflecting on what could be construed as a fear-mongering indictment of the use of big data analytics and the implication that big data analytics and its implementation of data mining algorithms are tantamount to all-out invasion of privacy. What is interesting, though, is the presumption that privacy advocates have been “grappling” with data mining since “not long after 9/11,” yet data mining was already quite a mature discipline by that point in time, as was the general use of customer data for marketing, sales, and other business purposes. Raising an alarm about “big data” and “data mining” today is akin to shutting the barn door decades after the horses have bolted.
Filed under: Data Analysis, Data Governance, Data Profiling, Data Quality
I am currently working on my presentation slides for the upcoming DataFlux IDEAS conference, and the topic I am discussing is building a business justification for data quality improvement. I intend to walk through the assessment process to understand the specific business impacts of data flaws, then evaluate alternatives for improvement, and then develop a cost/benefit analysis.
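The cost/benefit step at the end of that process comes down to simple arithmetic: weigh the annualized business impact a flaw is expected to cause against the one-time and recurring costs of fixing it, over some planning horizon. A minimal sketch, where all the dollar figures and the reduction rate are hypothetical illustrations:

```python
def cost_benefit(annual_flaw_cost, remediation_cost,
                 annual_maintenance, expected_reduction, years):
    """Net benefit of a data quality improvement over a planning horizon.

    annual_flaw_cost:   estimated yearly business impact of the data flaw
    remediation_cost:   one-time cost of the improvement
    annual_maintenance: recurring cost to sustain the improvement
    expected_reduction: fraction of the flaw cost the improvement removes
    """
    benefit = annual_flaw_cost * expected_reduction * years
    cost = remediation_cost + annual_maintenance * years
    return benefit - cost

# Hypothetical example: a flaw costing $250k/year, a $150k fix with
# $20k/year upkeep that eliminates 75% of the impact, over 3 years:
# benefit = 250000 * 0.75 * 3 = 562500; cost = 150000 + 20000 * 3 = 210000
net = cost_benefit(250_000, 150_000, 20_000, 0.75, 3)  # -> 352500.0
```

The hard part, of course, is not the arithmetic but the assessment that produces defensible inputs for `annual_flaw_cost` and `expected_reduction`, which is exactly what the impact-assessment walkthrough is meant to address.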