Article

Interobserver agreement issues in radiology

Journal

DIAGNOSTIC AND INTERVENTIONAL IMAGING
Volume 101, Issue 10, Pages 639-641

Publisher

ELSEVIER MASSON, CORP OFF
DOI: 10.1016/j.diii.2020.09.001

Keywords

Reproducibility of results; Interobserver agreement; Radiology; Kappa test; Intraclass correlation coefficient

Abstract

Agreement between observers (i.e., inter-rater agreement) can be quantified with various criteria, but their appropriate selection is critical. When the measure is qualitative (nominal or ordinal), the proportion of agreement or the kappa coefficient should be used to evaluate inter-rater consistency (i.e., inter-rater reliability). The kappa coefficient is more meaningful than the raw percentage of agreement, because the latter does not account for agreements due to chance alone. When the measures are quantitative, the intraclass correlation coefficient (ICC) should be used to assess agreement, but this should be done with care because there are different ICCs, so it is important to describe the model and type of ICC being used. The Bland-Altman method can be used to assess consistency and conformity, but its use should be restricted to the comparison of two raters. © 2020 Société française de radiologie. Published by Elsevier Masson SAS. All rights reserved.
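
The abstract's point that kappa corrects the raw percentage of agreement for chance can be illustrated with a short calculation. The sketch below is not from the article: the two raters' ratings are hypothetical, and the computation is the standard Cohen's kappa for two raters rating a nominal (binary) finding.

```python
# Illustrative sketch (hypothetical data, not from the article):
# raw percentage of agreement vs. Cohen's kappa for two raters.
from collections import Counter

rater_a = ["present", "absent", "present", "present", "absent", "present", "present", "absent"]
rater_b = ["present", "absent", "absent", "present", "absent", "present", "present", "present"]

n = len(rater_a)

# Observed proportion of agreement: ignores agreement expected by chance alone.
p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n

# Agreement expected by chance, from each rater's marginal category frequencies.
count_a, count_b = Counter(rater_a), Counter(rater_b)
labels = set(rater_a) | set(rater_b)
p_expected = sum((count_a[k] / n) * (count_b[k] / n) for k in labels)

# Cohen's kappa: observed agreement corrected for chance agreement.
kappa = (p_observed - p_expected) / (1 - p_expected)

print(f"Raw agreement: {p_observed:.2f}")   # 0.75 for these made-up ratings
print(f"Cohen's kappa: {kappa:.2f}")        # about 0.47, lower once chance is removed
```

With these made-up ratings, raw agreement is 0.75 while kappa is about 0.47, which is why the abstract describes kappa as more meaningful than the raw percentage of agreement.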
