Article

Better to be in agreement than in bad company: A critical analysis of many kappa-like tests

Journal

BEHAVIOR RESEARCH METHODS
Volume 55, Issue 7, Pages 3326-3347

Publisher

SPRINGER
DOI: 10.3758/s13428-022-01950-0

Keywords

Agreement coefficient; Contingency table; Categorical data analysis; Inter-rater reliability


We assessed several agreement coefficients applied to 2x2 contingency tables, which are common in research because of dichotomization. We not only studied specific estimators but also developed a general method for evaluating any candidate estimator as an agreement measurement. The method is implemented in open-source R code and is available to researchers. We tested it by verifying the performance of several traditional estimators over all possible table configurations with sizes ranging from 1 to 68 (1,028,789 tables in total). Cohen's kappa showed deficient behavior similar to Pearson's r, Yule's Q, and Yule's Y. Scott's pi and Shankar and Bangdiwala's B seem to assess situations of disagreement better than agreement between raters. Krippendorff's alpha emulates, without any advantage, Scott's pi in the case of nominal variables and two raters. Dice's F1 and McNemar's chi-squared incompletely assess the information in the contingency table, showing the poorest performance of all. We concluded that Cohen's kappa is a measurement of association and that McNemar's chi-squared assesses neither association nor agreement; the only two authentic agreement estimators are Holley and Guilford's G and Gwet's AC1. These two estimators also showed the best performance over the range of table sizes and should be considered the first choices for agreement measurement in 2x2 contingency tables. All procedures and data were implemented in R and are available for download from Harvard Dataverse: https://doi.org/10.7910/DVN/HMYTCK.
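To make the comparison concrete, the estimators singled out in the abstract can be computed directly from the four cells of a 2x2 table. The sketch below uses the standard textbook formulas for Cohen's kappa, Holley and Guilford's G, and Gwet's AC1 for two raters on a dichotomous variable; it is an illustration in Python, not the authors' R implementation, and the cell labels (a, b, c, d) are an assumed convention.

```python
def agreement_2x2(a, b, c, d):
    """Agreement coefficients for a 2x2 contingency table.

    Assumed cell convention (not taken from the paper):
      a = both raters say 'yes', d = both say 'no',
      b, c = the two disagreement cells.
    Returns (Cohen's kappa, Holley-Guilford G, Gwet's AC1)
    using the standard textbook formulas.
    """
    n = a + b + c + d
    po = (a + d) / n                     # observed agreement
    p1 = (a + b) / n                     # rater 1's 'yes' proportion
    p2 = (a + c) / n                     # rater 2's 'yes' proportion

    # Cohen's kappa: chance agreement from each rater's own marginals
    pe_kappa = p1 * p2 + (1 - p1) * (1 - p2)
    kappa = (po - pe_kappa) / (1 - pe_kappa)

    # Holley and Guilford's G: chance agreement fixed at 1/2
    g = 2 * po - 1

    # Gwet's AC1: chance term from the mean 'yes' proportion
    pi_mean = (p1 + p2) / 2
    pe_ac1 = 2 * pi_mean * (1 - pi_mean)
    ac1 = (po - pe_ac1) / (1 - pe_ac1)

    return kappa, g, ac1
```

For example, `agreement_2x2(40, 10, 5, 45)` evaluates a table with 85% observed agreement; kappa and G agree here because the marginals happen to be near-balanced, while AC1 differs slightly through its distinct chance-correction term.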
