School of Health Systems and Public Health

HME 712 Week 1 Fact sheet bos_2

Measuring agreement – Stata commands

Agreement vs. Association:

Agreement requires that both measures are of the same variable, in the same units, on the same individuals.

Association requires only that both variables are measured on the same individuals; the variables need not be the same, and the units need not be the same.

If variables agree they are also associated. However, variables that are associated do not necessarily agree (although they may).

For Stata the data must be in wide format, with each individual's paired readings/observations side by side in separate variables (one row per study unit). This applies to both Kappa and the BA plot.

A table of "tests": Association tests vs. Agreement tests

We might use the following tests or methods to evaluate association and to evaluate agreement. The choice of tests is broadly dependent on the type of data variable (categorical or numerical).

                              Categorical data               Numerical data
Methods or tests for          Odds ratio;                    Pearson's correlation coefficient (r);
association                   Relative risk;                 Spearman's rank correlation coefficient (rs);
                              Incidence rate ratio;          Slope of a fitted line
                              Relative proportion;
                              Prevalence odds ratio;
                              Chi-square tests
Methods for assessing         Kappa statistic;               Method of Bland and Altman
agreement                     Weighted Kappa statistic;
                              McNemar's test

In this first week we focused on the Kappa statistic, the weighted Kappa statistic, and the Method of Bland and Altman.
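Behind Stata's kap command is Cohen's kappa, kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed proportion of agreement and p_e is the agreement expected by chance from the marginal totals. A minimal sketch of the calculation, using a made-up 2x2 table of two readers' yes/no ratings (the counts are illustrative, not from the course data):

```python
# Made-up 2x2 agreement table: rows = reader_1 (yes/no), columns = reader_2 (yes/no)
table = [[20, 5],    # reader_1 "yes": 20 pairs agree, 5 disagree
         [10, 15]]   # reader_1 "no" : 10 pairs disagree, 15 agree

n = sum(sum(row) for row in table)                    # total number of pairs
p_o = (table[0][0] + table[1][1]) / n                 # observed agreement (diagonal)
row = [sum(r) for r in table]                         # row totals
col = [table[0][j] + table[1][j] for j in range(2)]   # column totals
p_e = sum(row[i] * col[i] for i in range(2)) / n**2   # chance-expected agreement

kappa = (p_o - p_e) / (1 - p_e)
print(f"p_o = {p_o:.2f}, p_e = {p_e:.2f}, kappa = {kappa:.2f}")
# prints "p_o = 0.70, p_e = 0.50, kappa = 0.40"
```

With these counts the readers agree on 70% of pairs, but 50% agreement was expected by chance, so kappa credits only the agreement beyond chance.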

Stata commands for measures of agreement

Procedure        Stata commands                         Notes

Kappa            kap reader_1 reader_2                  The order of the variables is unimportant
Weighted Kappa   kap reader_1 reader_2, wgt(w)          Do not use the "kappa" command; it will give the wrong answer here
                 kap reader_1 reader_2, wgt(w2)         w gives linear weights; w2 gives quadratic (squared) weights
BA plot          gen diff = reader_1 - reader_2
                 gen average = (reader_1 + reader_2)/2
                 sum diff                               Gives the mean and SD (s) of the differences
                 di r(mean) - 1.96*r(sd)                Lower limit of agreement; sum stores the mean and SD as r(mean) and r(sd)
                 di r(mean) + 1.96*r(sd)                Upper limit of agreement
                 scatter diff average                   Customise using the Graph Editor
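The BA steps above can also be sketched outside Stata. A small Python version with made-up paired readings (the limits of agreement are mean difference +/- 1.96 times the SD of the differences):

```python
from statistics import mean, stdev

# Hypothetical paired readings from two raters on the same 5 subjects
reader_1 = [100, 102, 98, 101, 99]
reader_2 = [98, 103, 97, 100, 100]

diff = [a - b for a, b in zip(reader_1, reader_2)]           # differences (y-axis of BA plot)
average = [(a + b) / 2 for a, b in zip(reader_1, reader_2)]  # pair means (x-axis of BA plot)

d_bar = mean(diff)        # mean difference (bias)
s = stdev(diff)           # sample SD of the differences
lower = d_bar - 1.96 * s  # lower limit of agreement
upper = d_bar + 1.96 * s  # upper limit of agreement
print(f"bias = {d_bar:.2f}, limits of agreement = ({lower:.2f}, {upper:.2f})")
```

Plotting diff against average and overlaying horizontal lines at d_bar, lower, and upper reproduces the BA plot that the scatter command draws in Stata.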
