Article

Equivalences of weighted kappas for multiple raters

Journal

STATISTICAL METHODOLOGY
Volume 9, Issue 3, Pages 407-422

Publisher

ELSEVIER SCIENCE BV
DOI: 10.1016/j.stamet.2011.11.001

Keywords

Inter-rater reliability; Ordinal agreement; g-agreement; Multiple raters; Cohen's kappa; Cohen's weighted kappa; Hubert's kappa; Mielke, Berry and Johnston's weighted kappa

Funding

  1. Netherlands Organisation for Scientific Research [451-11-026]

Abstract

Cohen's unweighted kappa and weighted kappa are popular descriptive statistics for measuring agreement between two raters on a categorical scale. With m >= 3 raters, there are several views in the literature on how to define agreement. We consider a family of weighted kappas for multiple raters using the concept of g-agreement (g = 2, 3, ..., m) which refers to the situation in which it is decided that there is agreement if g out of m raters assign an object to the same category. Given m raters, we may formulate m - 1 weighted kappas in this family, one for each type of g-agreement. We show that the m - 1 weighted kappas coincide if we use the weighting scheme proposed by Mielke et al. (2007) [31]. (C) 2011 Elsevier B.V. All rights reserved.
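The abstract gives no formulas, so a minimal numerical sketch may help fix ideas. The Python below illustrates the standard chance-corrected form kappa = (P_o - P_e) / (1 - P_e) behind Cohen's unweighted kappa, together with one natural reading of g-agreement ("at least g of the m raters assign an object to the same category"). The function names and the at-least-g counting rule are illustrative assumptions, not the definitions or the weighting scheme used in the paper.

```python
import numpy as np

def cohen_kappa(r1, r2):
    """Cohen's unweighted kappa for two raters on a categorical scale."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    cats = np.union1d(r1, r2)
    p_o = np.mean(r1 == r2)  # observed proportion of agreement
    # chance-expected agreement from the two raters' marginal distributions
    p_e = sum(np.mean(r1 == c) * np.mean(r2 == c) for c in cats)
    return (p_o - p_e) / (1 - p_e)

def g_agreement(ratings, g):
    """Observed proportion of objects on which at least g of the m raters
    assign the same category (an illustrative reading of g-agreement)."""
    ratings = np.asarray(ratings)  # shape (n_objects, m_raters)
    hits = 0
    for row in ratings:
        _, counts = np.unique(row, return_counts=True)
        if counts.max() >= g:
            hits += 1
    return hits / len(ratings)

# Toy data: 4 objects rated by m = 3 raters on a 3-category scale.
ratings = np.array([[1, 1, 2],
                    [2, 2, 2],
                    [1, 3, 2],
                    [3, 3, 3]])
print(cohen_kappa(ratings[:, 0], ratings[:, 1]))  # pairwise kappa, raters 1 and 2
print(g_agreement(ratings, g=2))  # 0.75: 3 of 4 objects show 2-agreement
print(g_agreement(ratings, g=3))  # 0.50: 2 of 4 objects show 3-agreement
```

A weighted kappa in the family the paper studies would chance-correct such g-agreement proportions under a chosen weighting scheme; the sketch stops at the observed proportions and the familiar two-rater case.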
