Abstract
Kappa statistics, unweighted or weighted, are widely used for assessing interrater agreement. The weights of the weighted kappa statistics in particular are defined in terms of absolute and squared distances in ratings between raters. It is proposed that these weights can themselves be used to assess interrater agreement. Closed-form expectations and variances of the resulting agreement statistics, referred to as AI1 and AI2 and defined as functions of the absolute and squared distances in ratings between two raters, respectively, are obtained. AI1 and AI2 are compared with the weighted and unweighted kappa statistics in terms of Type I error rate, bias, and statistical power using Monte Carlo simulations. The AI1 agreement statistic performs better than the other agreement statistics.
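The linear and quadratic weighting schemes the abstract refers to are the standard weighted-kappa weights built from absolute and squared rating distances. The AI1 and AI2 statistics themselves are defined in the paper; as a hedged sketch of the underlying weights only, the following computes weighted Cohen's kappa for two raters (the function name, category coding 0..k-1, and max-distance normalization are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def weighted_kappa(r1, r2, k, weights="linear"):
    """Weighted Cohen's kappa for two raters on a k-category ordinal scale.

    weights="linear" uses absolute distance |i - j|;
    weights="quadratic" uses squared distance (i - j)**2.
    Categories are assumed coded 0..k-1 (an illustrative convention).
    """
    r1, r2 = np.asarray(r1), np.asarray(r2)
    # Observed joint proportion table over the two raters' ratings.
    obs = np.zeros((k, k))
    for a, b in zip(r1, r2):
        obs[a, b] += 1
    obs /= obs.sum()
    # Expected table under independence of the raters' marginals.
    exp = np.outer(obs.sum(axis=1), obs.sum(axis=0))
    # Disagreement weights from the inter-rating distance, scaled to [0, 1].
    i, j = np.indices((k, k))
    d = np.abs(i - j) if weights == "linear" else (i - j) ** 2
    w = d / d.max()
    # Kappa: 1 minus observed weighted disagreement over expected.
    return 1.0 - (w * obs).sum() / (w * exp).sum()
```

Perfect agreement yields kappa of 1 under either weighting; complete ordinal reversal drives it negative, which is the behavior the simulation comparisons in the paper rely on.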
Original language | English (US) |
---|---|
Pages (from-to) | 205-222 |
Number of pages | 18 |
Journal | Journal of Modern Applied Statistical Methods |
Volume | 7 |
Issue number | 1 |
DOIs | |
State | Published - May 2008 |
Keywords
- Bias
- Interrater agreement
- Kappa statistic
- Statistical power
- Type I Error rate
ASJC Scopus subject areas
- Statistics and Probability
- Statistics, Probability and Uncertainty