Article

Category kappas for agreement between fuzzy classifications

Journal

NEUROCOMPUTING
Volume 194, Pages 385-388

Publisher

ELSEVIER SCIENCE BV
DOI: 10.1016/j.neucom.2016.02.038

Keywords

Cohen's kappa; Fuzzy classification; Category statistics; MRI; Brain tissue; Interobserver agreement

Abstract

The kappa statistic is widely used as a measure for quantifying agreement between two nominal classifications. The statistic has been extended to the case of two normalized fuzzy classifications. In this paper we define category kappas for quantifying agreement on a particular category of two normalized fuzzy classifications. The overall fuzzy kappa is a weighted average of the proposed category kappas. Since the value of the overall kappa lies between the minimum and maximum values of the category kappas, the overall kappa, in a way, summarizes the agreement reflected in the category kappas. The overall kappa meaningfully reflects the degree of agreement between the fuzzy classifications if the category kappas are approximately equal. If this is not the case, it is more informative to report the category kappas. (C) 2016 Elsevier B.V. All rights reserved.
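The structure described in the abstract can be sketched in code. This is a minimal illustration, not the paper's exact formulas: it assumes agreement between membership degrees is measured by their product (the paper's actual agreement function is not given here), and the function name `category_kappas` is hypothetical. Under these assumptions, the overall kappa is exactly the weighted average of the category kappas, with weights given by each category's maximum-minus-chance denominator, so it necessarily lies between the smallest and largest category kappa.

```python
import numpy as np

def category_kappas(A, B):
    """Per-category kappas and overall kappa for two normalized
    fuzzy classifications A and B of shape (n_objects, n_categories),
    where each row sums to 1.

    Assumption for illustration: agreement between two membership
    degrees is their product; the paper's agreement function may differ.
    """
    obs = (A * B).mean(axis=0)               # observed agreement per category
    pA, pB = A.mean(axis=0), B.mean(axis=0)  # marginal membership per category
    chance = pA * pB                         # chance agreement per category
    denom = (pA + pB) / 2 - chance           # maximum-minus-chance weight
    kappas = (obs - chance) / denom          # category kappas
    # Overall kappa; equals sum(denom * kappas) / sum(denom), i.e. the
    # weighted average of the category kappas with weights `denom`
    # (sum((pA + pB) / 2) = 1 because the rows of A and B sum to 1).
    overall = (obs.sum() - chance.sum()) / (1 - chance.sum())
    return kappas, overall
```

Because the weights `denom` are positive, the weighted-average identity directly yields the property stated in the abstract: the overall kappa always falls between the minimum and maximum category kappas, and reporting it alone is informative only when the category kappas are close to one another.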
