Data Quality Management: Abstraction
Introduction:
Data quality must be managed at two points in the customer data lifecycle: when the data is first acquired and entered into the system, and throughout its ongoing maintenance.
Acquisition of data
Data quality management begins when data first enters the system; capturing accurate, complete information at the point of entry keeps errors from propagating into every downstream process.
Maintenance of data
Once entered in the system, customer data must be maintained. At this point, what does data quality management involve? Essentially, you are trying to keep the associates who work with the data regularly from introducing errors, and you are striving to make sure that changes to contact information are properly recorded so that you do not lose track of customers when they move. Data hygiene services that verify current data and check for duplicates are vital at this point of data management.
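As a minimal sketch of such a duplication check, assuming simple contact records with name and email fields (the field names and normalization rules are illustrative, not a prescribed schema), records can be compared after normalizing them:

    # Minimal duplicate check for customer contact records.
    # Field names (name, email) are assumed for illustration only.
    def normalize(record):
        # Normalize a record so near-identical duplicates compare equal.
        return (record["name"].strip().lower(), record["email"].strip().lower())

    def find_duplicates(records):
        # Group records that normalize to the same key.
        seen = {}
        for record in records:
            seen.setdefault(normalize(record), []).append(record)
        return [group for group in seen.values() if len(group) > 1]

    customers = [
        {"name": "Ada Lovelace", "email": "ada@example.com"},
        {"name": "ada lovelace ", "email": "ADA@example.com"},
    ]
    print(find_duplicates(customers))  # both records collapse to one key

A real hygiene service would add fuzzy matching and address verification on top of this, but duplication checking ultimately reduces to comparing normalized keys.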
Many DQM systems follow a similar procedural design, although they use different terms. At the root of DQM are its people, often including:
• A data steward responsible for focused, day-to-day data management, such as defining data rules and standards.
• A data manager or analyst responsible for the nuts and bolts: qualifying data needs and quantifying them into the processes and systems by which they are measured.
All the basic operations of a business run quickly and efficiently when its data has been managed properly. High quality data enhances decision making at all levels of operations and management.
Low quality data means that an organization’s resources, including its finances, are used inefficiently. When businesses maintain data quality through DQM practices, they avoid this waste and achieve bigger and better results.
Competitive advantage
High quality data is also a source of competitive advantage: an organization with reliable insight into its customers and markets can act faster and decide better than competitors working from poor data.
A good DQM makes use of a system with various features that help improve the trustworthiness of organizational data. Let us outline the features of a good DQM:
Data quality metrics
Data quality metrics are very important in assessing the efforts made to increase the quality of your data. They must be top-notch and clearly defined. The key dimensions to look out for are accuracy, consistency, completeness, integrity, and timeliness. Let us discuss what each of these categories covers:
Accuracy
Data accuracy refers to the degree to which the data correctly reflects the event or object it describes.
Completeness
Data completeness measures whether all the required data is present; a record is complete when none of its mandatory fields are missing.
Consistency
Data consistency specifies that two data values retrieved from separate data sets should not conflict with each other. Note, however, that consistency does not necessarily imply that the data is correct.
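As an illustration, a minimal cross-data-set consistency check, assuming two hypothetical sources keyed by a shared customer ID, could look like this:

    # Compare the same attribute for the same key across two data sets
    # and report conflicts. Sources and field names are illustrative.
    billing = {"C001": {"postcode": "90210"}, "C002": {"postcode": "10001"}}
    shipping = {"C001": {"postcode": "90210"}, "C002": {"postcode": "10002"}}

    def find_conflicts(left, right, field):
        # Return keys present in both data sets whose values disagree.
        conflicts = []
        for key in left.keys() & right.keys():
            if left[key][field] != right[key][field]:
                conflicts.append((key, left[key][field], right[key][field]))
        return conflicts

    print(find_conflicts(billing, shipping, "postcode"))
    # [('C002', '10001', '10002')] -- consistent data would return []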
Integrity
Also referred to as data validation, data integrity refers to structurally testing data to ensure compliance with an organization’s data procedures. Valid data contains no unintended errors and corresponds to its expected data types.
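A minimal sketch of such a structural validation pass, assuming a hypothetical schema of expected field types, might look like this:

    # Validate records against expected types; the schema is illustrative,
    # not an organizational standard.
    schema = {"customer_id": int, "email": str, "age": int}

    def validate(record, schema):
        # Return a list of integrity errors found in one record.
        errors = []
        for field, expected_type in schema.items():
            if field not in record:
                errors.append(f"missing field: {field}")
            elif not isinstance(record[field], expected_type):
                errors.append(f"{field}: expected {expected_type.__name__}")
        return errors

    print(validate({"customer_id": "42", "email": "a@example.com"}, schema))
    # ['customer_id: expected int', 'missing field: age']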
Timeliness
When your data isn’t ready when users need it, it fails to fulfill the data quality
dimension of timeliness.
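One simple way to monitor timeliness is to compare each data set’s last refresh time against an agreed freshness threshold; the 24-hour threshold and data set names below are assumptions for illustration:

    from datetime import datetime, timedelta, timezone

    # Flag data sets whose last refresh is older than an assumed 24-hour SLA.
    FRESHNESS_SLA = timedelta(hours=24)

    last_refreshed = {
        "orders": datetime.now(timezone.utc) - timedelta(hours=2),
        "customers": datetime.now(timezone.utc) - timedelta(days=3),
    }

    now = datetime.now(timezone.utc)
    stale = {name for name, ts in last_refreshed.items()
             if now - ts > FRESHNESS_SLA}
    print(stale)  # {'customers'}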
Some examples of data metrics that help an organization measure its data quality efforts include:
Ratio of data to errors
This metric tracks the number of known errors within a data set relative to the actual size of the data set.
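As a sketch with illustrative numbers, the metric is simply the count of known errors divided by the size of the data set:

    # Ratio of known errors to total records; the numbers are illustrative.
    def error_ratio(known_errors, total_records):
        return known_errors / total_records

    print(error_ratio(known_errors=42, total_records=10_000))  # 0.0042

Tracked over time, a falling ratio as the data set grows suggests that data quality efforts are working.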
Number of empty values
This metric counts the number of empty fields within a data set. Empty values usually indicate missing information or information recorded in the wrong field.
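A minimal sketch of this count, over illustrative records, treats both missing (None) and blank values as empty:

    # Count empty fields across a data set; the records are illustrative.
    records = [
        {"name": "Ada", "email": "ada@example.com", "phone": ""},
        {"name": "", "email": None, "phone": "555-0100"},
    ]

    empty_values = sum(
        1
        for record in records
        for value in record.values()
        if value is None or value == ""
    )
    print(empty_values)  # 3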
Data time-to-value
This metric evaluates how long it takes to gain meaningful insights from a data
set.
Data transformation error rate
This metric tracks how often a data transformation operation fails.
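As a sketch, assuming a hypothetical transformation that parses numeric amounts from strings, the error rate is the share of inputs on which the transformation fails:

    # Track how often a transformation fails; the parse rule is illustrative.
    def transform(raw):
        return float(raw)  # example transformation: parse a numeric amount

    raw_values = ["10.5", "abc", "7", "", "3.2"]
    failures = 0
    for raw in raw_values:
        try:
            transform(raw)
        except ValueError:
            failures += 1

    print(f"error rate: {failures / len(raw_values):.0%}")  # error rate: 40%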
Data storage costs
If an organization stores data without using it, this can be a sign that the data is of low quality. Conversely, if the organization’s data storage costs decline while its data operations stay the same or continue to grow, the quality of the data is most likely improving.
Conclusion
Research organizations worldwide now use data on research input and output, that is, publications, patents, and research data, for a wide variety of purposes, such as evaluation, reporting, and visualization of a researcher’s or research organization’s expertise. This places high demands on the quality of the data gathered for these purposes, demands which in most cases have largely outgrown the initial intentions behind the systems that collect the data. Moreover, the research world has evolved in a global, dynamic manner in which research data are increasingly used to monitor the efficiency of research processes and research productivity, and even to support strategic decision making. To safeguard correct data analysis, research-related data must be assessed on all relevant quality dimensions, and inaccuracies must be addressed through data quality improvement trajectories as discussed in this chapter. Integrating a data quality management policy is the only way to ensure the fitness for use of research-related data across the various applications and business processes of the research world, as inaccurate data can have tremendous effects on a researcher’s or research organization’s future prospects.
Prepared by:
Bilgian Munos
Gemil Aleonar