Solved – Strange values of Cohen’s kappa

cohens-kappa

I have two raters who agree on 93% of the cases (two options: yes or no). However, when I calculate Cohen's kappa through Crosstabs in SPSS, I get really strange outcomes, like -0.42 with a significance of 0.677.

How can such high percentage agreement result in such a strange kappa?

I don't get it.

Best Answer

Suppose there are 100 ratings. In 98 of them, both raters say "yes". In 1, rater A says "no" and rater B says "yes"; in 1, the opposite happens. Kappa is then near 0 (in fact slightly below 0), even though agreement is 98%. If you increase the number of disagreements, you lower both kappa and agreement.
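To make the arithmetic concrete, here is a minimal Python sketch (not the SPSS procedure, just the textbook formula) that computes kappa for this hypothetical 98/1/1/0 table:

```python
def cohens_kappa(table):
    """Cohen's kappa from a square contingency table where
    table[i][j] = count of cases rated i by rater A and j by rater B."""
    n = sum(sum(row) for row in table)
    # Observed agreement: proportion of cases on the diagonal.
    p_o = sum(table[i][i] for i in range(len(table))) / n
    # Expected agreement under independence, from the marginal totals.
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    p_e = sum(r * c for r, c in zip(row_tot, col_tot)) / (n * n)
    return (p_o - p_e) / (1 - p_e)

table = [[98, 1],   # rater A "yes": 98 both say yes, 1 where B says no
         [1,  0]]   # rater A "no":  1 where B says yes, 0 both say no
print(cohens_kappa(table))  # ≈ -0.0101: 98% agreement, kappa slightly below 0
```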

Kappa is, in a way, designed to do this: it is intended to discount agreement that could have arisen by chance alone and therefore isn't indicative of anything.
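Concretely, kappa compares the observed agreement $p_o$ with the agreement $p_e$ expected by chance from the raters' marginal totals:

$$\kappa = \frac{p_o - p_e}{1 - p_e}, \qquad p_e = \sum_k p_{A,k}\, p_{B,k}.$$

In the table above, $p_o = 0.98$ and $p_e = 0.99 \times 0.99 + 0.01 \times 0.01 = 0.9802$, so the numerator is slightly negative and the denominator is tiny; that is why almost-perfect raw agreement can still yield a kappa at or below zero when one category dominates.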
