Article

Multi-instance multi-label learning

Journal

ARTIFICIAL INTELLIGENCE
Volume 176, Issue 1, Pages 2291-2320

Publisher

ELSEVIER
DOI: 10.1016/j.artint.2011.10.002

Keywords

Machine learning; Multi-instance multi-label learning; MIML; Multi-label learning; Multi-instance learning

Funding

  1. National Fundamental Research Program of China [2010CB327903]
  2. National Science Foundation of China [61073097, 61021062]


In this paper, we propose the MIML (Multi-Instance Multi-Label learning) framework, in which an example is described by multiple instances and associated with multiple class labels. Compared to traditional learning frameworks, the MIML framework is more convenient and natural for representing complicated objects that have multiple semantic meanings. To learn from MIML examples, we propose the MIMLBOOST and MIMLSVM algorithms, based on a simple degeneration strategy, and experiments show that solving problems involving complicated objects with multiple semantic meanings in the MIML framework can lead to good performance. Considering that the degeneration process may lose information, we propose the D-MIMLSVM algorithm, which tackles MIML problems directly in a regularization framework. Moreover, we show that even when we do not have access to the real objects, and thus cannot capture more information from them by using the MIML representation, MIML is still useful. We propose the INSDIF and SUBCOD algorithms: INSDIF works by transforming single-instance examples into the MIML representation for learning, while SUBCOD works by transforming single-label examples into the MIML representation for learning. Experiments show that in some tasks they achieve better performance than learning from the single-instance or single-label examples directly. (C) 2011 Elsevier B.V. All rights reserved.
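To make the abstract's core ideas concrete, the sketch below shows (a) how an MIML example pairs a bag of instances with a set of labels, and (b) one possible degeneration step in the spirit of MIMLSVM: mapping each bag to a single feature vector of Hausdorff distances to a few reference bags, after which any standard multi-label learner can be applied. The function names, toy data, and the choice of reference bags are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only: a toy MIML dataset and a MIMLSVM-style
# degeneration from bags to single-instance feature vectors.

def hausdorff(bag_a, bag_b):
    """Classic (max-min) Hausdorff distance between two bags of
    instance vectors; each instance is a tuple of floats."""
    def dist(u, v):
        return sum((x - y) ** 2 for x, y in zip(u, v)) ** 0.5
    def directed(a, b):
        return max(min(dist(u, v) for v in b) for u in a)
    return max(directed(bag_a, bag_b), directed(bag_b, bag_a))

def degenerate(bags, reference_indices):
    """Map each MIML bag to a single instance: its vector of
    Hausdorff distances to a few reference bags (MIMLSVM uses
    cluster medoids; here the references are chosen by hand).
    The result can be fed to any multi-label learner."""
    refs = [bags[i] for i in reference_indices]
    return [[hausdorff(bag, r) for r in refs] for bag in bags]

# A tiny MIML dataset: each example is (bag of instances, label set),
# e.g. image regions paired with the scene's multiple semantic labels.
miml_data = [
    ([(0.0, 0.0), (0.1, 0.2)], {"lion", "grassland"}),
    ([(5.0, 5.0), (4.8, 5.1)], {"ocean"}),
    ([(0.2, 0.1), (5.1, 4.9)], {"lion", "ocean"}),
]
bags = [bag for bag, _ in miml_data]
features = degenerate(bags, reference_indices=[0, 1])
print(features[0])  # first bag's distance profile to the two references
```

Note the trade-off the abstract mentions: this degeneration collapses each bag into one vector, which is simple but can discard instance-level information; that loss is what motivates the direct regularization approach of D-MIMLSVM.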

