Article

Category kappas for agreement between fuzzy classifications

Journal

NEUROCOMPUTING
Volume 194, Pages 385-388

Publisher

ELSEVIER SCIENCE BV
DOI: 10.1016/j.neucom.2016.02.038

Keywords

Cohen's kappa; Fuzzy classification; Category statistics; MRI; Brain tissue; Interobserver agreement


The kappa statistic is widely used as a measure for quantifying agreement between two nominal classifications. The statistic has been extended to the case of two normalized fuzzy classifications. In this paper we define category kappas for quantifying agreement on a particular category of two normalized fuzzy classifications. The overall fuzzy kappa is a weighted average of the proposed category kappas. Since the value of the overall kappa lies between the minimum and maximum values of the category kappas, the overall kappa, in a way, summarizes the agreement reflected in the category kappas. The overall kappa meaningfully reflects the degree of agreement between the fuzzy classifications if the category kappas are approximately equal. If this is not the case, it is more informative to report the category kappas. (C) 2016 Elsevier B.V. All rights reserved.
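The weighted-average relationship described in the abstract can be illustrated numerically. The sketch below is not the paper's exact construction: it assumes the product operator for observed fuzzy agreement and a particular per-category denominator, (ā + b̄)/2 − ā·b̄, chosen so that the resulting weights are nonnegative and sum to one, which is what makes the overall kappa land between the minimum and maximum category kappas.

```python
import numpy as np

def category_kappas(A, B):
    """Category kappas and an overall fuzzy kappa for two normalized
    fuzzy classifications A and B (n objects x c categories; each row
    sums to 1).

    Illustrative sketch only: observed per-category agreement uses the
    product operator, and the denominator (a_bar + b_bar)/2 - a_bar*b_bar
    is an assumed choice, not necessarily the paper's definition.
    """
    o = (A * B).mean(axis=0)            # observed agreement per category
    a_bar = A.mean(axis=0)              # mean memberships, first classification
    b_bar = B.mean(axis=0)              # mean memberships, second classification
    e = a_bar * b_bar                   # chance agreement per category
    m = (a_bar + b_bar) / 2             # per-category ceiling; sums to 1
    kappa_cat = (o - e) / (m - e)       # category kappas
    w = (m - e) / (1 - e.sum())         # nonnegative weights summing to 1
    kappa_overall = (o.sum() - e.sum()) / (1 - e.sum())
    return kappa_cat, w, kappa_overall
```

With these definitions the identity is exact: summing w · kappa_cat telescopes to (O − E)/(1 − E), the overall kappa, and since the weights are nonnegative (by the AM-GM inequality, (ā + b̄)/2 ≥ ā·b̄ for means in [0, 1]) the overall value is bounded by the smallest and largest category kappas, as the abstract states.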

