
BUILDING IN DATA QUALITY

Most IS/IT organizations are motivated and measured by developing (or purchasing) and installing whatever computer applications any part of the enterprise requests. This inevitably produces significant amounts of redundantly stored data, and even more redundant program code. Data quality is also endemically poor because of inconsistent data definitions and inconsistent edit/validation logic: different Create, Update, and Delete transactions in different applications typically apply different edit and validation rules to the data residing in each application's data stores. The IRM approach emphasizes building shared, single-source, subject-oriented databases, in which data quality (relative to the typical IS/IT approach) is far easier to optimize and control.
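To make the problem concrete, here is a minimal, hypothetical sketch (the field name and the rules are invented for illustration, not taken from the seminar materials) of two applications applying different validation logic to the same logical data element:

```python
# Hypothetical illustration: two applications both store a customer's age,
# but each enforces its own edit/validation rules, so the same value can
# be valid in one application's data store and invalid in the other's.

def app_a_validate_age(age):
    # Application A: accepts any non-negative integer
    return isinstance(age, int) and age >= 0

def app_b_validate_age(age):
    # Application B: requires a plausible adult age
    return isinstance(age, int) and 18 <= age <= 120

age = 7
print(app_a_validate_age(age))  # True  -> stored in A's database
print(app_b_validate_age(age))  # False -> rejected by B
# The two data stores now disagree about what a valid age is.
```

With a shared, single-source database there is one validation rule for the element, so this divergence cannot arise.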

The three main sources of poor data quality are:

- Incorrect or inconsistent data definitions
- Inconsistent edit/validation logic across Create, Update, and Delete transactions
- Performance measures ("gradecards") that punish, rather than reward, the accurate capture, update, and deletion of data

This seminar details how to remedy these three main sources. Detailed examples of correct data definitions in the Conceptual Data Model (entity definitions), the Logical Data Model (data element definitions), and the Physical Data Model (data field/column definitions) are presented, along with the specifications of the corresponding Conceptual Create, Update, and Delete transactions in the Conceptual Transaction Model, the Logical Transaction Model, and the Physical Transaction Model. The principle of factoring out shared logic between Create and Update transactions is demonstrated. Finally, the seminar demonstrates the principle of properly designing the "gradecards" of individuals who are the creators, updaters, and deleters of data, so that they are rewarded for the correct and accurate capture, update, and deletion of data, not punished for it.
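The factoring-out principle can be sketched as follows. This is a hypothetical example (the function and field names are invented, not drawn from the seminar's transaction models): both the Create and Update transactions delegate to one shared validation routine, so their rules cannot drift apart.

```python
# Sketch of factoring shared edit/validation logic out of the Create and
# Update transactions so that both enforce identical rules.

def validate_customer(record):
    """Single-source validation shared by Create and Update."""
    errors = []
    if not record.get("name", "").strip():
        errors.append("name is required")
    age = record.get("age")
    if not isinstance(age, int) or not (0 <= age <= 120):
        errors.append("age must be an integer between 0 and 120")
    return errors

database = {}  # stand-in for the shared, single-source data store

def create_customer(cust_id, record):
    errors = validate_customer(record)   # shared logic
    if errors:
        raise ValueError(errors)
    database[cust_id] = dict(record)

def update_customer(cust_id, changes):
    merged = {**database[cust_id], **changes}
    errors = validate_customer(merged)   # the same shared logic
    if errors:
        raise ValueError(errors)
    database[cust_id] = merged
```

Because both transactions call the same routine, a rule change is made once and applies everywhere, which is the quality guarantee the shared, single-source approach provides.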

TOPICAL OUTLINE

DURATION: 1 day with no workshop; additional days may be added for a client-specified, real-life workshop

TARGETED AUDIENCES:

PREREQUISITES: (Recommended)




© 2013 WILLIAM G. SMITH