Journal
ADVANCES IN DATA ANALYSIS AND CLASSIFICATION
Volume 13, Issue 2, Pages 481-493
Publisher
SPRINGER HEIDELBERG
DOI: 10.1007/s11634-018-0319-0
Keywords
Interrater reliability; Interobserver agreement; Category coefficients; 2 x 2 tables; Cohen's kappa
Cohen's kappa is the most widely used coefficient for assessing interobserver agreement on a nominal scale. An alternative coefficient for quantifying agreement between two observers is Bangdiwala's B. To interpret an agreement coefficient properly, one must first understand its meaning. The properties of the kappa coefficient have been studied extensively and are well documented; the properties of coefficient B have been studied, but far less extensively. In this paper, various new properties of B are presented. Category B-coefficients are defined that are the basic building blocks of B. We study how coefficient B, Cohen's kappa, the observed agreement and the associated category coefficients may be related. It turns out that these relationships are quite different for 2 x 2 tables than for agreement tables with three or more categories.
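For readers who want to compute the quantities discussed in the abstract, the following is a minimal Python sketch, assuming the standard definitions: observed agreement p_o = sum_i n_ii / n, Cohen's kappa = (p_o - p_e) / (1 - p_e) with chance-expected agreement p_e = sum_i n_i+ n_+i / n^2, and Bangdiwala's B = sum_i n_ii^2 / sum_i n_i+ n_+i. The function name and the example counts are illustrative, not taken from the paper.

import numpy as np

def agreement_coefficients(table):
    # table: square agreement table; rows index observer 1, columns observer 2
    n = np.asarray(table, dtype=float)
    total = n.sum()
    row = n.sum(axis=1)                        # marginal totals of observer 1
    col = n.sum(axis=0)                        # marginal totals of observer 2
    diag = np.diag(n)                          # agreement counts per category
    p_o = diag.sum() / total                   # observed agreement
    p_e = (row * col).sum() / total ** 2       # chance-expected agreement
    kappa = (p_o - p_e) / (1.0 - p_e)          # Cohen's kappa
    b = (diag ** 2).sum() / (row * col).sum()  # Bangdiwala's B
    return p_o, kappa, b

# Illustrative 2 x 2 agreement table (made-up counts)
p_o, kappa, b = agreement_coefficients([[40, 10], [5, 45]])
print(f"observed agreement = {p_o:.3f}, kappa = {kappa:.3f}, B = {b:.3f}")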