Article

Agreement between an isolated rater and a group of raters

Journal

STATISTICA NEERLANDICA
Volume 63, Issue 1, Pages 82-100

Publisher

WILEY-BLACKWELL PUBLISHING, INC
DOI: 10.1111/j.1467-9574.2008.00412.x

Keywords

kappa coefficient; nominal scale; ordinal scale; expert group

The agreement between two raters judging items on a categorical scale is traditionally assessed by Cohen's kappa coefficient. We introduce a new coefficient for quantifying the degree of agreement between an isolated rater and a group of raters on a nominal or ordinal scale. The group of raters is regarded as a whole, a reference or gold-standard group with its own heterogeneity. The coefficient, defined on a population-based model, requires a specific definition of the concept of perfect agreement. It has the same properties as Cohen's kappa coefficient and reduces to the latter when the group contains only one rater. The new approach overcomes the problem of consensus within the group of raters and generalizes Schouten's index. The method is illustrated on published syphilis data and on data from a study assessing medical students' diagnostic reasoning against expert knowledge.
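For context, the baseline that the new coefficient reduces to is Cohen's kappa for two raters, κ = (p_o − p_e) / (1 − p_e), where p_o is the observed proportion of agreement and p_e the proportion expected by chance from the raters' marginal distributions. Below is a minimal Python sketch of this standard two-rater computation; the function name and data layout are illustrative assumptions, and the paper's coefficient for an isolated rater versus a group is not specified in this abstract.

```python
import numpy as np

def cohens_kappa(ratings_a, ratings_b, categories):
    """Cohen's kappa for two raters judging the same items on a nominal scale.

    Illustrative helper, not code from the paper.
    """
    idx = {c: i for i, c in enumerate(categories)}
    table = np.zeros((len(categories), len(categories)))
    for a, b in zip(ratings_a, ratings_b):
        table[idx[a], idx[b]] += 1          # joint classification counts
    p = table / table.sum()                 # joint proportions
    p_o = np.trace(p)                       # observed agreement
    p_e = p.sum(axis=1) @ p.sum(axis=0)     # chance agreement from the marginals
    return (p_o - p_e) / (1 - p_e)

# Example: two raters classifying six items into three nominal categories.
a = ["pos", "pos", "neg", "doubt", "neg", "pos"]
b = ["pos", "neg", "neg", "doubt", "neg", "pos"]
print(cohens_kappa(a, b, ["pos", "neg", "doubt"]))  # ~0.74
```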
