Journal
NEUROCOMPUTING
Volume 168, Issue -, Pages 92-103
Publisher
ELSEVIER
DOI: 10.1016/j.neucom.2015.06.010
Keywords
Feature selection; Multi-label learning; Mutual information; Max-dependency and min-redundancy
Funding
- National Program on Key Basic Research Project of China [2013CB329304]
- National Natural Science Foundation of China [61303131, 61222210, 61379021]
- Program for New Century Excellent Talents in University [NCET-12-0399]
- Department of Education of Fujian Province [JA14192]
Multi-label learning deals with data that belong to multiple labels simultaneously. As in traditional supervised learning, feature selection plays an important role in multi-label data mining, information retrieval, and machine learning. In this paper, we first consider two factors in multi-label feature selection: feature dependency and feature redundancy. In particular, dependency measures the degree to which a candidate feature contributes to each label, and redundancy measures the information overlap between the candidate feature and the already-selected features across all labels. We then propose an evaluation measure that combines mutual information with a max-dependency and min-redundancy criterion, which allows us to select a superior feature subset for multi-label learning. Extensive experiments show that the proposed method effectively selects a good feature subset and outperforms some state-of-the-art approaches. (C) 2015 Elsevier B.V. All rights reserved.
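The max-dependency and min-redundancy idea described in the abstract can be sketched as a greedy procedure: score each candidate feature by its summed mutual information with all labels (dependency), penalized by its average mutual information with the features selected so far (redundancy). The sketch below is an illustrative assumption, not the paper's exact evaluation measure; it assumes discrete-valued features and binary labels, and the function names (`mutual_info`, `select_features`) are hypothetical.

```python
import numpy as np
from collections import Counter

def mutual_info(x, y):
    """Mutual information I(X;Y) in nats for two discrete 1-D arrays."""
    n = len(x)
    px, py, pxy = Counter(x), Counter(y), Counter(zip(x, y))
    mi = 0.0
    for (a, b), c in pxy.items():
        # p(a,b) * log( p(a,b) / (p(a) * p(b)) ), with counts folded in
        mi += (c / n) * np.log(c * n / (px[a] * py[b]))
    return mi

def select_features(X, Y, k):
    """Greedy max-dependency min-redundancy feature selection (sketch).

    X: (n_samples, n_features) discrete feature matrix
    Y: (n_samples, n_labels) binary label matrix
    Returns the indices of k selected features, in selection order.
    """
    n_feat = X.shape[1]
    # Dependency: total MI between each candidate feature and every label.
    dep = np.array([sum(mutual_info(X[:, f], Y[:, l])
                        for l in range(Y.shape[1]))
                    for f in range(n_feat)])
    selected = [int(np.argmax(dep))]          # start from the most dependent
    while len(selected) < k:
        best, best_score = None, -np.inf
        for f in range(n_feat):
            if f in selected:
                continue
            # Redundancy: average MI with the already-selected features.
            red = np.mean([mutual_info(X[:, f], X[:, s]) for s in selected])
            if dep[f] - red > best_score:
                best, best_score = f, dep[f] - red
        selected.append(best)
    return selected
```

With this criterion, an exact copy of an already-selected feature scores zero (its dependency is cancelled by its redundancy), so the procedure prefers a feature that is informative about a different label even when its raw dependency is no higher.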