Filed under: Business Intelligence, Data Analysis, Data Integration, Metadata, Recommendations, Replication
Over time, organizations have employed a variety of strategies for managing data assets according to the needs of each business application in operation. For the most part, applications were designed to achieve specific objectives within a single business function. Correspondingly, any data necessary for that function would be managed locally, while any data deemed critical to the organization as a whole would be subsumed into a centralized repository.
Yet this approach to centralizing data has come under scrutiny. Read more
Filed under: Business Intelligence, Data Analysis, Data Governance, Data Integration, Data Quality
In my last post, we started to discuss the need for fundamental processes and tools for institutionalizing data testing. While the software development discipline has embraced testing as a critical gate for releasing newly developed capabilities, that testing often centers on functionality, sometimes to the exclusion of a broad-based survey of the underlying data assets to ensure that values have not (or will not have) changed incorrectly as a result.
In fact, the need for testing existing production data assets goes beyond the scope of newly developed software. Modifications are constantly applied within an organization – acquired applications are upgraded, internal operating environments are enhanced and updated, additional functionality is turned on and deployed, hardware systems are swapped out and in, and internal processes may change. Yet our ability to verify that interoperable components that create, touch, or modify data have not been adversely affected remains limited. The challenge of maintaining consistency across the application infrastructure can be daunting, let alone assuring consistency in the information results. Read more
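As a minimal sketch of what institutionalized data testing might look like (this example is illustrative, not drawn from the post), one simple technique is to fingerprint a production table before and after an environment change and assert that the fingerprints match. The table name and schema below are hypothetical:

```python
import hashlib
import sqlite3

def table_fingerprint(conn, table):
    """Return a row count and an order-independent checksum for a table.

    Comparing fingerprints taken before and after an upgrade or migration
    flags unintended data changes without diffing every row.
    """
    rows = conn.execute(f"SELECT * FROM {table}").fetchall()
    digest = 0
    for row in rows:
        # XOR of per-row hashes makes the checksum insensitive to row order
        digest ^= int.from_bytes(
            hashlib.sha256(repr(row).encode()).digest()[:8], "big")
    return len(rows), digest

# Demo with a hypothetical in-memory table: the fingerprint changes
# when a value is (perhaps unintentionally) modified.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO customers VALUES (?, ?)",
                 [(1, "Acme"), (2, "Globex")])
before = table_fingerprint(conn, "customers")
conn.execute("UPDATE customers SET name = 'ACME' WHERE id = 1")
after = table_fingerprint(conn, "customers")
print(before == after)  # False – the change is detected
```

A real regime would run such checks per table after every infrastructure change, alongside column-level profiling of ranges and null counts.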
Filed under: Business Intelligence, Data Analysis, Data Quality
On August 25, I will be presenting a Webinar on the concepts of geography, spatial analysis, location intelligence, and other aspects of integrating location information into a business intelligence and analytics program. I will cover the basics of geographic data and spatial analysis, and how to extract knowledge from geographic information and exploit it to drive efficiencies and profitability.
Live Webinar: Using Geographic and Spatial Data to Improve Business Analytics
When: August 25, 2011, Noon ET, 9:00 a.m. PT
You will learn:
- About geographic data and spatial analysis
- Processes for integrating geographic data
- Best practices for data integration and data quality
This Webinar has limited capacity and will fill up quickly, so register early to ensure your spot.
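As a small illustration of the kind of spatial analysis the webinar will introduce (this example is my own, not taken from the webinar material), the haversine formula computes the great-circle distance between two latitude/longitude points, a basic building block for proximity analytics such as finding customers near a store:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two (lat, lon) points."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Example: New York to Los Angeles, on the order of 3,900 km
d = haversine_km(40.7128, -74.0060, 34.0522, -118.2437)
```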
Filed under: Business Intelligence, Data Analysis, Data Governance, Data Profiling, Data Quality, Master Data, Metadata
We are pleased to announce that our teaming partner CORMAC has been awarded the five-year Data Management (DM) IDIQ contract by the Centers for Medicare and Medicaid Services (CMS), an agency under the Department of Health and Human Services (HHS). As a teaming partner, Knowledge Integrity will work with CORMAC in support of data administration task orders. The total contract value is $350 million.
The purpose of this contract is to provide the following services for CMS:
1. Data Administration
2. Database Administration
3. Middleware and Message Queuing Administration
4. Business Intelligence (BI) and Extract-Transformation-Load (ETL) Tool Administration and Production Support
5. Data and Information Products Production Support
6. BI Tool, Data and Information Products, and BI-Related ETL Build and Deploy Services
7. Training/User Support
8. EDG Legacy Application and Database Maintenance Services
9. IDR ETL Development, View Development and Database Administration
Filed under: Business Impacts, Business Intelligence, Data Analysis, Data Quality
The Practitioner’s Guide to Data Quality Improvement has been included in the IT feed at Alltop!