Article

Multilabel Dimensionality Reduction via Dependence Maximization

Journal

ACM Transactions on Knowledge Discovery from Data

Publisher

Association for Computing Machinery (ACM)
DOI: 10.1145/1839490.1839495

Keywords

Dimensionality reduction; multilabel learning

Funding

  1. National Science Foundation of China [60635030, 60721002]
  2. Jiangsu Science Foundation [BK2008018]
  3. National Fundamental Research Program of China [2010CB327903]
  4. Jiangsu 333 High-Level Talent Cultivation Program

Abstract

Multilabel learning deals with data associated with multiple labels simultaneously. Like other data mining and machine learning tasks, multilabel learning also suffers from the curse of dimensionality. Although dimensionality reduction has been studied for many years, multilabel dimensionality reduction remains almost untouched. In this article, we propose a multilabel dimensionality reduction method, MDDM, with two kinds of projection strategies, which attempts to project the original data into a lower-dimensional feature space that maximizes the dependence between the original feature description and the associated class labels. Based on the Hilbert-Schmidt Independence Criterion, we derive an eigen-decomposition problem that enables the dimensionality reduction process to be efficient. Experiments validate the performance of MDDM.
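
For readers who want a concrete picture of the eigen-decomposition step mentioned in the abstract, the sketch below shows one way an HSIC-style dependence objective can be maximized by a linear projection, assuming linear kernels on both the projected features and the label vectors and an orthonormal projection matrix. The function name mddm_projection and all variable names are illustrative assumptions, not the authors' reference implementation.

    import numpy as np

    def mddm_projection(X, Y, d):
        """Sketch of dependence maximization with linear kernels.

        Maximizes an HSIC-style measure tr(K H L H), where K is the kernel of
        the linearly projected features and L is a label kernel, by taking the
        top-d eigenvectors of X^T H L H X.

        X : (n, p) feature matrix
        Y : (n, q) binary label matrix
        d : target dimensionality
        Returns a (p, d) projection matrix with orthonormal columns.
        """
        n = X.shape[0]
        H = np.eye(n) - np.ones((n, n)) / n       # centering matrix
        L = Y @ Y.T                               # linear kernel over label vectors
        M = X.T @ H @ L @ H @ X                   # dependence matrix to diagonalize
        M = (M + M.T) / 2                         # symmetrize for numerical stability
        eigvals, eigvecs = np.linalg.eigh(M)      # eigenvalues in ascending order
        top = np.argsort(eigvals)[::-1][:d]       # indices of the d largest eigenvalues
        return eigvecs[:, top]

    # Usage (illustrative):
    #   P = mddm_projection(X, Y, d=10)
    #   Z = X @ P   # lower-dimensional representation of the data

Under these assumptions the objective has a closed-form maximizer, which is what makes the reduction step efficient: a single symmetric eigen-decomposition of a p-by-p matrix, followed by one matrix product, yields the lower-dimensional representation.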

Authors

Yin Zhang; Zhi-Hua Zhou
