Journal
STATISTICAL METHODOLOGY
Volume 9, Issue 3, Pages 407-422
Publisher
ELSEVIER SCIENCE BV
DOI: 10.1016/j.stamet.2011.11.001
Keywords
Inter-rater reliability; Ordinal agreement; g-agreement; Multiple raters; Cohen's kappa; Cohen's weighted kappa; Hubert's kappa; Mielke, Berry and Johnston's weighted kappa
Funding
- Netherlands Organisation for Scientific Research [451-11-026]
Abstract
Cohen's unweighted kappa and weighted kappa are popular descriptive statistics for measuring agreement between two raters on a categorical scale. With m >= 3 raters, there are several views in the literature on how to define agreement. We consider a family of weighted kappas for multiple raters using the concept of g-agreement (g = 2, 3, ... , m), which refers to the situation in which it is decided that there is agreement if g out of m raters assign an object to the same category. Given m raters, we may formulate m - 1 weighted kappas in this family, one for each type of g-agreement. We show that the m - 1 weighted kappas coincide if we use the weighting scheme proposed by Mielke et al. (2007) [31]. (C) 2011 Elsevier B.V. All rights reserved.
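The notion of g-agreement described in the abstract can be illustrated with a small sketch. The function below is a hypothetical helper (not the paper's weighted kappa, which also involves a chance-correction term and a weighting scheme): it simply computes the observed proportion of objects on which at least g of the m raters assign the same category.

```python
from collections import Counter

def g_agreement_proportion(ratings, g):
    """Fraction of objects for which at least g raters agree on a category.

    `ratings` is a list of rows, one per object, each row holding the
    m category labels assigned by the m raters. This is an illustrative
    observed-agreement measure only, without chance correction.
    """
    agree = sum(1 for row in ratings if max(Counter(row).values()) >= g)
    return agree / len(ratings)

# 4 objects rated by m = 3 raters on categories {A, B, C}
ratings = [
    ["A", "A", "A"],  # all three raters agree
    ["A", "A", "B"],  # only 2-agreement
    ["A", "B", "C"],  # no agreement among any two raters
    ["B", "B", "B"],  # all three raters agree
]
print(g_agreement_proportion(ratings, 2))  # 0.75
print(g_agreement_proportion(ratings, 3))  # 0.5
```

Note how the choice of g changes the verdict on the same data: requiring only pairwise agreement (g = 2) counts three of the four objects as agreed upon, while requiring unanimity (g = 3) counts only two. The paper's result is that, under the Mielke, Berry and Johnston weighting scheme, the m - 1 chance-corrected kappas built from these m - 1 notions of agreement all coincide.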