COHEN’S KAPPA
A MEASURE OF INTER-RATER RELIABILITY
FOR QUALITATIVE RESEARCH INVOLVING NOMINAL CODING
WHAT IS COHEN’S KAPPA?
Cohen's kappa measures agreement between two raters, corrected for the agreement that would be expected by chance.

Pr(e) = likelihood that agreement is attributable to chance
N = total number of rated items, also called "cases"
THE EQUATION FOR K

K = (Pr(a) - Pr(e)) / (1 - Pr(e))

where Pr(a) = observed proportion of agreement between the two raters, and Pr(e) = likelihood that agreement is attributable to chance.
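The equation can be sketched as a small Python helper (a hypothetical function, not part of the original slides); the example values come from the raters 1 & 2 worked example later in the deck:

```python
def cohens_kappa(pr_a: float, pr_e: float) -> float:
    """Cohen's kappa from observed agreement Pr(a) and chance agreement Pr(e)."""
    if pr_e == 1.0:
        raise ValueError("chance agreement of 1 makes kappa undefined")
    return (pr_a - pr_e) / (1.0 - pr_e)

# Worked example: Pr(a) = 0.70, Pr(e) = 0.56
print(round(cohens_kappa(0.70, 0.56), 2))  # -> 0.32
```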
CALCULATING K BY HAND USING A CONTINGENCY TABLE

                         RATER 1
                    A                      B                      C
RATER 2   A    # of agreements on A   disagreement           disagreement
          B    disagreement           # of agreements on B   disagreement
          C    disagreement           disagreement           # of agreements on C

The size of the table is determined by how many coding categories you have. This example assumes that your units can be sorted into three categories, hence a 3x3 grid. The highlighted diagonal represents agreement (where the two raters both mark the same thing).
Item #    1   2   3   4   5   6   7   8   9   10
Rater 1   R   R   R   R   R   R   R   R   R   S
Rater 2   S   R   R   O   R   R   R   R   O   S
Rater 3   R   R   R   O   R   R   O   O   R   S
Rater 4   R   R   R   R   R   R   R   R   R   S
Rater 5   S   R   R   O   R   O   O   R   R   S
CALCULATING K FOR RATERS 1 & 2

                         RATER 1
                    R    S    O    Total
RATER 2   R         6    0    0    6      (Items 2, 3, 5-8)
          S         1    1    0    2      (Item 1; Item 10)
          O         2    0    0    2      (Items 4 & 9)
          Total     9    1    0    10

Add rows & columns. Since we have 10 items, the totals should add up to 10 for each rater.
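The contingency table above can also be built programmatically. A minimal Python sketch (variable names are my own, not from the slides):

```python
from collections import Counter

# Item codes for raters 1 and 2, transcribed from the 10-item table
rater1 = list("RRRRRRRRRS")
rater2 = list("SRRORRRROS")
categories = ["R", "S", "O"]

# Cell (row, col) counts items where Rater 2 chose `row` and Rater 1 chose `col`
table = Counter(zip(rater2, rater1))

for row in categories:
    counts = [table[(row, col)] for col in categories]
    print(row, counts, "row total:", sum(counts))
```

The row totals (6, 2, 2) and column totals (9, 1, 0) match the hand-built table.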
CALCULATING K
COMPUTING SIMPLE AGREEMENT

Add the values of the diagonal cells and divide by the total number of items:

Pr(a) = (6 + 1 + 0) / 10 = 0.7
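Simple agreement is just the fraction of items on which the two raters match, which is equivalent to summing the diagonal. A quick Python check:

```python
rater1 = list("RRRRRRRRRS")
rater2 = list("SRRORRRROS")

# Observed (simple) agreement: fraction of items where both raters match
agreements = sum(a == b for a, b in zip(rater1, rater2))
pr_a = agreements / len(rater1)
print(pr_a)  # -> 0.7
```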
THE EQUATION FOR K:
RATERS 1 & 2

Expected chance agreement for each category (row total x column total / N):

R: (6 x 9) / 10 = 5.4
S: (2 x 1) / 10 = 0.2
O: (2 x 0) / 10 = 0

5.4 + 0.2 + 0 = 5.6, so Pr(e) = 5.6 / 10 = 0.56

K = (Pr(a) - Pr(e)) / (1 - Pr(e)) = (0.7 - 0.56) / (1 - 0.56) = 0.14 / 0.44 ≈ 0.32
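The whole calculation, from raw codes to kappa, can be sketched end to end in Python (a hypothetical script, assuming the same raters 1 & 2 data as above). Chance agreement here is computed from each rater's marginal proportions, which is the same arithmetic as the row-total-times-column-total step:

```python
rater1 = list("RRRRRRRRRS")
rater2 = list("SRRORRRROS")
categories = ["R", "S", "O"]
n = len(rater1)

# Observed agreement: fraction of items where the raters match
pr_a = sum(a == b for a, b in zip(rater1, rater2)) / n

# Chance agreement: product of each rater's marginal proportions, summed
pr_e = sum((rater1.count(c) / n) * (rater2.count(c) / n) for c in categories)

kappa = (pr_a - pr_e) / (1 - pr_e)
print(round(pr_e, 2), round(kappa, 2))  # -> 0.56 0.32
```

For real projects, `sklearn.metrics.cohen_kappa_score` computes the same statistic directly from the two code lists.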
http://faculty.vassar.edu/lowry/kappa.html