Thoughts from Informatica World 2016

This past May I attended Informatica’s annual conference, Informatica World, and now that some time has passed, it is worth reflecting on three aspects of the experience. First, I shared a presentation with Robert Shields on the criticality of data protection, and in particular on the importance of integrating data protection techniques within the framework of data governance and data stewardship. I summarized some of those same points in an article I later wrote for TechTarget searchCompliance.

Second, I attended an executive briefing in which the new senior executives shared their expectations for Informatica’s progress over the coming year. Since Informatica was recently taken private by a private equity firm, it was good to gain some visibility into how the company intends to continue developing products and services that enable data utilization, especially beyond the enterprise firewall as more organizations extend their application frameworks into the cloud.

Lastly, I had a brief chance to chat with Informatica CEO Anil Chakravarthy. It is refreshing to see a C-level executive so directly engaged in both driving the corporate product landscape and setting high-level direction for the global organization. It was also interesting to see how the company is realigning its messaging with the big data and analytics communities. The information economy is clearly growing as more organizations adopt newer data management and computation technologies like Hadoop, yet in our upcoming survey report on Hadoop productionalization, respondents at all types of companies still rate integrating Hadoop with established enterprise components and the enterprise data architecture as challenging, if not very challenging. As a result, we suggest that data management vendors continue to expand their product catalogs with tools that simplify big data application development, and Informatica’s trajectory is aligned with that sentiment.

Best Practices for Data Integration Testing Series – Instituting Good Practices for Data Testing

August 3, 2012
Filed under: Data Governance, Data Profiling, Data Quality, Metadata

I have been asked by folks at Informatica to share some thoughts about best practices for data integration, and this is the first of a series on data testing.

It is rare, if not impossible, to develop software that is completely free of errors and bugs. Early in my career as a software engineer, I spent a significant amount of time on “bug duty”: reviewing the list of reported product errors one by one, identifying the cause of each bug, and then planning a correction so that the application error was eliminated. Over time, the software development process has come under significant scrutiny when it comes to product quality assurance.

In fact, the state of software quality and testing is quite mature. Well-defined processes have been accepted as general best practices, and there are organized methods for evaluating software quality methodology capabilities and maturity. Yet given that every application is a combination of programs applied to input data to generate output information, it is curious that testing practices for data integration and sharing remain largely ungoverned, manual procedures. Read more
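To make that contrast concrete, here is a minimal sketch of what a repeatable, automated data integration test might look like. The checks, column names, and thresholds are hypothetical illustrations, not a reference to any particular vendor’s tooling.

```python
# Illustrative only: a minimal, repeatable data integration test in plain Python.
# Real pipelines would drive these checks from test metadata rather than hard-coding them.

def reconcile_row_counts(source_rows, target_rows):
    """Verify that no records were dropped or duplicated during the load."""
    return len(source_rows) == len(target_rows)

def null_rate(rows, column):
    """Fraction of records with a missing value in the given column."""
    if not rows:
        return 0.0
    missing = sum(1 for r in rows if r.get(column) in (None, ""))
    return missing / len(rows)

def run_tests(source_rows, target_rows, critical_columns, max_null_rate=0.01):
    """Return (test_name, passed) pairs that can be logged and audited after each load."""
    results = [("row_count_reconciliation", reconcile_row_counts(source_rows, target_rows))]
    for col in critical_columns:
        results.append((f"null_rate:{col}", null_rate(target_rows, col) <= max_null_rate))
    return results

if __name__ == "__main__":
    # Hypothetical source and target extracts for a customer load
    source = [{"customer_id": "C1", "email": "a@example.com"},
              {"customer_id": "C2", "email": ""}]
    target = [{"customer_id": "C1", "email": "a@example.com"},
              {"customer_id": "C2", "email": ""}]
    for name, passed in run_tests(source, target, ["customer_id", "email"]):
        print(f"{name}: {'PASS' if passed else 'FAIL'}")
```

The point is not the specific checks but that they are codified, repeatable, and produce an auditable result, which is exactly what is missing when data testing is left as an ad hoc manual procedure.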

TDWI Solution Summit: Master Data, Quality, and Governance

December 15, 2011
Filed under: Data Governance, Data Profiling, Data Quality, Master Data

I am honored to be one of the co-moderators of the upcoming TDWI Solution Summit on Master Data, Quality, and Governance, to be held March 4-6 in Savannah, GA. I attended one of these events in the past and met a lot of people engaged in launching MDM projects, but this year I have helped to find some case studies that treat master data as a result of good data quality and governance techniques, not the driver. Please go to the web site and check it out, then apply to be a delegate! Hope to see you there!

Outline for a Business Justification for Data Quality Improvement

I am currently working on my presentation slides for the upcoming DataFlux IDEAS conference, and the topic I am discussing is building a business justification for data quality improvement. I intend to walk through the assessment process for understanding the specific business impacts of data flaws, then evaluate alternatives for improvement, and then develop a cost/benefit analysis.

Read more
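As a rough illustration of that assessment-to-cost/benefit structure, here is a back-of-the-envelope sketch. Every figure in it is a hypothetical placeholder, not a benchmark from the presentation.

```python
# A back-of-the-envelope sketch of a data quality cost/benefit analysis.
# All figures are hypothetical placeholders, not benchmarks.

annual_records_processed = 1_000_000
error_rate = 0.05                  # assumed share of records with a data flaw
cost_per_flawed_record = 5.00      # rework, missed revenue, compliance exposure, etc.

# Step 1: quantify the annual business impact of data flaws
annual_impact = annual_records_processed * error_rate * cost_per_flawed_record

# Step 2: evaluate improvement alternatives
# (name, one-time cost, annual cost, expected reduction in flaw-related impact)
alternatives = [
    ("manual cleansing effort", 0, 60_000, 0.40),
    ("automated data quality tooling", 75_000, 20_000, 0.70),
]

# Step 3: compare net benefit of each alternative
print(f"Estimated annual impact of data flaws: ${annual_impact:,.0f}")
for name, one_time_cost, annual_cost, reduction in alternatives:
    first_year_net = annual_impact * reduction - (one_time_cost + annual_cost)
    print(f"{name}: first-year net benefit ${first_year_net:,.0f}")
```

The structure matters more than the numbers: quantify the impact first, then let the comparison of alternatives drive the justification.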

Is Data Profiling a Commodity?

Here are some quick thoughts about the basic functionality of data profiling that make me wonder about the degree to which it has become a commodity capability. If so, a few observations at the end should make folks think about what they are actually using profiling for.

Read more
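For context, the kind of “basic functionality” I have in mind is column-level profiling along these lines. This is only an illustrative sketch using the standard library, not any vendor’s implementation.

```python
# A sketch of the basic column profiling most tools provide out of the box:
# record counts, null counts, cardinality, min/max, and top frequent values.
from collections import Counter

def profile_column(rows, column):
    values = [r.get(column) for r in rows]
    non_null = [v for v in values if v not in (None, "")]
    freq = Counter(non_null)
    return {
        "count": len(values),
        "nulls": len(values) - len(non_null),
        "distinct": len(freq),
        "min": min(non_null) if non_null else None,
        "max": max(non_null) if non_null else None,
        "top_values": freq.most_common(3),
    }

if __name__ == "__main__":
    rows = [{"state": "MD"}, {"state": "MD"}, {"state": "NY"}, {"state": ""}]
    print(profile_column(rows, "state"))
```

If that is all profiling is used for, it is hard to argue it is not a commodity; the more interesting question is what happens with the results downstream.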
