Ahmad Tarabaih
Assistant Professor
Community Services
Lesson Plan
• Subject 3: Reliability and validity of data
– Training and calibrating examiners
– Duplicate examinations
Intra-examiner Reproducibility
• The examiner should then determine how
consistently he/she can apply the diagnostic
criteria by examining a group of 25 subjects twice,
ideally on successive days, or with a time interval
of at least 30 minutes between examinations.
• The subjects should be pre-selected so that they
collectively represent the full range of conditions
expected to be assessed in the actual survey.
Inter-examiner Reproducibility
• Assessment of the consistency between
examiners.
• Each examiner should first practice the
examination on a group of 10 subjects, then
independently examine the same group of 20
or more subjects, and compare his/her findings
with those of the other examiners in the team.
• In case of major discrepancies, subjects are re-examined
so that inter-examiner differences are reviewed and resolved
by group discussion.
• The group must examine with reasonable consistency
using a common standard.
• The consistency level should range between 85% and 95%.
• Exclude from the survey team any person whose findings
differ consistently and significantly from those of the
majority of the team.
• Failure to assess in a consistent manner
will lead to missed or wrongly
interpreted disease prevalence or severity.
• In the actual survey, examiners should all examine
similar proportions of each major subgroup of
the sample population, because there will
always be some variation between examiners.
Estimating reproducibility of recordings
• Inter- and intra-examiner consistency can be
assessed in a number of ways:
– The percentage of agreement between scores
• Number of agreeing scores / total scores
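The percentage-of-agreement calculation above can be sketched in Python; the function name and the two example score lists are illustrative, not part of any real survey data:

```python
def percent_agreement(scores_1, scores_2):
    """Proportion of subjects on which two sets of scores agree."""
    if len(scores_1) != len(scores_2):
        raise ValueError("Both examiners must score the same subjects")
    agreements = sum(a == b for a, b in zip(scores_1, scores_2))
    return agreements / len(scores_1)

# Example: two examiners score 5 subjects and agree on 4 of them.
exam_a = ["Yes", "No", "No", "Yes", "Yes"]
exam_b = ["Yes", "No", "Yes", "Yes", "Yes"]
print(percent_agreement(exam_a, exam_b))  # 0.8
```

The same function works for intra-examiner consistency: pass the same examiner's first and second round of scores.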
• Example question:
– The following hypothetical data come from a medical test
in which two radiographers rated 50 images for needing
further study. The raters (A and B) said either Yes (for
further study) or No (no further study needed).
• 20 images were rated Yes by both.
• 15 images were rated No by both.
• Overall, examiner A said Yes to 25 images and No to 25.
• Overall, examiner B said Yes to 30 images and No to 20.
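The four counts above fix the remaining cells of the 2×2 agreement table; a quick consistency check (variable names are illustrative):

```python
total = 50
both_yes = 20        # rated Yes by both A and B
both_no = 15         # rated No by both A and B
a_yes_total = 25     # all images examiner A rated Yes
b_yes_total = 30     # all images examiner B rated Yes

a_yes_b_no = a_yes_total - both_yes  # images A rated Yes but B rated No
a_no_b_yes = b_yes_total - both_yes  # images B rated Yes but A rated No

# The four cells must account for every image.
assert both_yes + both_no + a_yes_b_no + a_no_b_yes == total
print(a_yes_b_no, a_no_b_yes)  # 5 10
```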
• Step 1:
– Calculate po (the observed proportional agreement):
20 images were rated Yes by both.
15 images were rated No by both.
So po = (20 + 15) / 50 = 0.70.
• Step 2:
– Find the probability that the examiners would randomly
both say Yes.
Examiner A said Yes to 25/50 images, or 50% (0.5).
Examiner B said Yes to 30/50 images, or 60% (0.6).
So the probability that both randomly say Yes is 0.5 × 0.6 = 0.30.
• Step 5:
– Insert your calculations into the kappa formula and solve:
κ = (po − pe) / (1 − pe) = (0.70 − 0.50) / (1 − 0.50) = 0.40
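The full chance-agreement and kappa calculation for this example can be sketched in Python (variable names are illustrative; pe combines the chance of both saying Yes with the chance of both saying No):

```python
po = (20 + 15) / 50            # observed agreement: 0.70
p_yes = (25 / 50) * (30 / 50)  # chance both randomly say Yes: 0.30
p_no = (25 / 50) * (20 / 50)   # chance both randomly say No:  0.20
pe = p_yes + p_no              # expected chance agreement: 0.50

kappa = (po - pe) / (1 - pe)   # Cohen's kappa
print(round(kappa, 2))  # 0.4
```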
• The Kappa statistic varies from 0 to 1, where:
– 0 = agreement equivalent to chance.
– 0.01 – 0.20 = slight agreement.
– 0.21 – 0.40 = fair agreement.
– 0.41 – 0.60 = moderate agreement.
– 0.61 – 0.80 = substantial agreement.
– 0.81 – 0.99 = near-perfect agreement.
– 1 = perfect agreement.
(Negative values indicate agreement worse than chance.)
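The interpretation scale above can be expressed as a small lookup function; the function name and label strings are illustrative:

```python
def interpret_kappa(k):
    """Return the agreement label for a kappa value between 0 and 1."""
    if not 0 <= k <= 1:
        raise ValueError("expected a kappa value between 0 and 1")
    if k == 1:
        return "perfect agreement"
    if k >= 0.81:
        return "near-perfect agreement"
    if k >= 0.61:
        return "substantial agreement"
    if k >= 0.41:
        return "moderate agreement"
    if k >= 0.21:
        return "fair agreement"
    if k > 0:
        return "slight agreement"
    return "agreement equivalent to chance"

# The worked example above gave kappa = 0.40.
print(interpret_kappa(0.40))  # fair agreement
```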