Article

Step-wise integration of deep class-specific learning for dermoscopic image segmentation

Journal

PATTERN RECOGNITION
Volume 85, Pages 78-89

Publisher

ELSEVIER SCI LTD
DOI: 10.1016/j.patcog.2018.08.001

Keywords

Dermoscopic; Melanoma; Segmentation; Fully convolutional networks (FCN)

Funding

  1. Australian Research Council (ARC) [LP140100686]


The segmentation of abnormal regions in dermoscopic images is an important step in the automated computer-aided diagnosis (CAD) of skin lesions. Recent methods based on fully convolutional networks (FCN) have been very successful for dermoscopic image segmentation. However, they tend to overfit to the visual characteristics of the dominant non-melanoma studies and therefore perform poorly on the complex visual characteristics exhibited by melanoma studies, which typically consist of fuzzy boundaries and heterogeneous textures. In this paper, we propose a new method for automated skin lesion segmentation that overcomes these limitations via a novel deep class-specific learning approach, which learns the important visual characteristics of the skin lesions of each individual class (melanoma vs. non-melanoma) separately. We also introduce a new probability-based, step-wise integration to combine the complementary segmentation results derived from the individual class-specific learning models. We achieved an average Dice coefficient of 85.66% on the ISBI 2017 Skin Lesion Challenge (SLC), 91.77% on the ISBI 2016 SLC and 92.10% on the PH2 dataset, with corresponding Jaccard indices of 77.73%, 85.92% and 85.90%, respectively. Our experiments on these three well-established public benchmark datasets demonstrate that our method is more effective than other state-of-the-art methods for skin lesion segmentation. (C) 2018 Elsevier Ltd. All rights reserved.
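The abstract does not specify the exact fusion rule, but the idea of a probability-based, step-wise integration of two class-specific segmentation outputs can be sketched as follows. This is a minimal illustration, assuming each class-specific FCN emits a per-pixel lesion probability map; the two-step rule and the thresholds (`hi`, `lo`) are hypothetical choices, not the authors' published method.

```python
import numpy as np

def stepwise_integrate(p_mel, p_nonmel, hi=0.8, lo=0.5):
    """Combine two class-specific lesion probability maps step-wise.

    Step 1: accept pixels that either class-specific model is
            highly confident about (>= hi).
    Step 2: for the remaining pixels, fall back to the averaged
            probability map at a lower threshold (>= lo).

    Both thresholds are illustrative; the paper's actual
    integration rule may differ.
    """
    confident = (p_mel >= hi) | (p_nonmel >= hi)   # step 1
    fused = 0.5 * (p_mel + p_nonmel)               # step 2
    mask = confident | (fused >= lo)
    return mask.astype(np.uint8)

# Toy 2x2 probability maps from the two class-specific models.
p_mel = np.array([[0.9, 0.3], [0.6, 0.1]])
p_nonmel = np.array([[0.2, 0.4], [0.7, 0.2]])
print(stepwise_integrate(p_mel, p_nonmel))  # binary lesion mask
```

The step-wise structure lets a strongly confident class-specific model decide on its own, while ambiguous pixels are resolved jointly, which is one way complementary class-specific predictions could be combined.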
