Article

Multiview Latent Space Learning With Feature Redundancy Minimization

Journal

IEEE TRANSACTIONS ON CYBERNETICS
Volume 50, Issue 4, Pages 1655-1668

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TCYB.2018.2883673

Keywords

Dictionaries; Redundancy; Correlation; Minimization; Sparse matrices; Machine learning; Data models; Complementary information; Hilbert-Schmidt independence criterion (HSIC); latent space; multiview learning; redundancy minimization

Funding

  1. China Post-Doctoral Science Foundation [2016M601597]
  2. NSFC of China [61876107, 6151101179, 61602337, 61602246]
  3. 973 Plan of China [2015CB856004]
  4. NSF of Jiangsu Province [BK20171430]
  5. Fundamental Research Funds for the Central Universities [30918011319]
  6. Open Project of State Key Laboratory of Integrated Services Networks (Xidian University) [ISN19-03]
  7. Summit of the Six Top Talents Program [DZXX-027]
  8. Lift Program for Young Talents of Jiangsu Province
  9. CAST Lift Program for Young Talents

Abstract

Multiview learning has received extensive research interest and has demonstrated promising results in recent years. Despite this progress, two significant challenges remain. First, some existing methods directly use the original features to reconstruct data points without considering feature redundancy. Second, existing methods cannot fully exploit the complementary information across multiple views while preserving view-specific properties, which leads to degraded learning performance. To address these issues, we propose a novel multiview latent space learning framework with feature redundancy minimization. We aim to learn a latent space that mitigates feature redundancy and to use the learned representation to reconstruct every original data point. More specifically, we first project the original features from multiple views onto a latent space, and then learn a shared dictionary and view-specific dictionaries to, respectively, exploit the correlations across multiple views and preserve the view-specific properties. Furthermore, the Hilbert-Schmidt independence criterion is adopted as a diversity constraint to explore the complementarity of multiview representations, which further ensures diversity across views and preserves the local structure of the data in each view. Experimental results on six public datasets demonstrate the effectiveness of our approach against other state-of-the-art methods.
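
The diversity constraint mentioned above is the Hilbert-Schmidt independence criterion (HSIC). As a rough illustrative sketch only (not the authors' implementation; the paper defines its own dictionaries, kernels, and trade-off weights), the Python snippet below computes the standard empirical HSIC estimator tr(KHLH)/(n-1)^2 with linear kernels between two view-specific latent representations. The function name hsic and the toy data are assumptions made for illustration.

import numpy as np

def hsic(Z1, Z2):
    """Empirical HSIC between two view-specific representations.

    Z1, Z2 : arrays of shape (n_samples, d1) and (n_samples, d2).
    Linear kernels are assumed here for simplicity. Smaller values
    indicate weaker statistical dependence between the two
    representations.
    """
    n = Z1.shape[0]
    K = Z1 @ Z1.T                          # Gram matrix of view 1 (linear kernel)
    L = Z2 @ Z2.T                          # Gram matrix of view 2 (linear kernel)
    H = np.eye(n) - np.ones((n, n)) / n    # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

# Toy usage: two random "latent representations" of the same 100 samples.
rng = np.random.default_rng(0)
Z_view1 = rng.standard_normal((100, 20))
Z_view2 = rng.standard_normal((100, 30))
print(hsic(Z_view1, Z_view2))              # near zero for independent views

In a framework of this kind, such a term is typically added to the reconstruction objective with a trade-off weight and minimized, penalizing statistical dependence between view-specific representations so that each view contributes complementary information.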
