Article

ML-FOREST: A Multi-Label Tree Ensemble Method for Multi-Label Classification

Journal

IEEE Transactions on Knowledge and Data Engineering

Publisher

IEEE COMPUTER SOC
DOI: 10.1109/TKDE.2016.2581161

Keywords

Multi-label classification; label dependency; label transfer; tree classifier; ensemble methods

Funding

  1. Guangzhou Key Laboratory of Robotics and Intelligent Software [15180007]
  2. National Natural Science Foundation of China (NSFC) [61005061]
  3. Guangdong Natural Science Foundation [2016A030313479]
  4. Fundamental Research Funds for the Central Universities [D215048w, 2015ZZ029]
  5. HKRGC CRF [C1007-15GF]
  6. HKRGC GRF HKBU [12302715]

Abstract

Multi-label classification deals with the problem where each example is associated with multiple class labels. Since labels are often dependent on one another, exploiting label dependencies can significantly improve multi-label classification performance. In existing studies, the label dependency is often given as prior knowledge or learned from the labels only. However, in many real applications such prior knowledge may not be available, or the labeled information may be very limited. In this paper, we propose a new algorithm, called ML-FOREST, that learns an ensemble of hierarchical multi-label classifier trees to reveal the intrinsic label dependencies. In ML-FOREST, we construct a set of hierarchical trees and develop a label transfer mechanism to identify the multiple relevant labels in a hierarchical way. In general, the relevant labels at higher levels of the trees capture label concepts that are easier to discriminate, and they are transferred to lower-level child nodes whose concepts are harder to discriminate. The relevant labels in the hierarchy are then aggregated to compute the label dependency and make the final prediction. Our empirical study shows encouraging results for the proposed algorithm in comparison with state-of-the-art multi-label classification algorithms under the Friedman test and post-hoc Nemenyi test.
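The abstract describes the general idea of exploiting label dependency by transferring labels identified at easier, higher-level nodes down to harder, lower-level nodes. The sketch below is only a rough illustration of that idea, not the authors' ML-FOREST algorithm: it uses off-the-shelf scikit-learn random forests in a two-stage stacked setup, where first-stage label predictions are appended as extra features for a second stage. The synthetic dataset and all parameters are hypothetical placeholders.

# Illustrative sketch only (not the authors' ML-FOREST implementation):
# it mimics the "label transfer" idea by feeding first-stage label
# predictions back in as extra features for a second stage, so that
# harder labels can exploit dependencies on easier ones.
import numpy as np
from sklearn.datasets import make_multilabel_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import hamming_loss
from sklearn.model_selection import train_test_split

# Hypothetical synthetic multi-label data (Y is a binary indicator matrix).
X, Y = make_multilabel_classification(n_samples=500, n_features=20,
                                      n_classes=5, random_state=0)
X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.3, random_state=0)

# Stage 1: predict each label independently with a tree ensemble.
stage1 = [RandomForestClassifier(n_estimators=50, random_state=0).fit(X_tr, Y_tr[:, j])
          for j in range(Y_tr.shape[1])]
P_tr = np.column_stack([m.predict(X_tr) for m in stage1])
P_te = np.column_stack([m.predict(X_te) for m in stage1])

# Stage 2 ("label transfer"): append stage-1 label predictions as features,
# so each label can exploit its dependencies on the other labels.
stage2 = [RandomForestClassifier(n_estimators=50, random_state=0)
          .fit(np.hstack([X_tr, P_tr]), Y_tr[:, j])
          for j in range(Y_tr.shape[1])]
Y_hat = np.column_stack([m.predict(np.hstack([X_te, P_te])) for m in stage2])

print("Hamming loss:", hamming_loss(Y_te, Y_hat))

In ML-FOREST itself, as described in the abstract, the transfer happens within each hierarchical tree and the transferred relevant labels are aggregated across the ensemble; the sketch only mirrors the feature-augmentation aspect of that mechanism.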
