Article

DMIB: Dual-Correlated Multivariate Information Bottleneck for Multiview Clustering

Journal

IEEE Transactions on Cybernetics
Volume 52, Issue 6, Pages 4260-4274

Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/TCYB.2020.3025636

Keywords

Correlation; Clustering algorithms; Clustering methods; Convergence; Bayes methods; Cybernetics; Reliability; Multivariate information bottleneck (MIB); multiview clustering (MVC); unsupervised learning

Funding

  1. National Natural Science Foundation of China [61772475]
  2. National Key Research and Development Plan Advanced Rail Transit [2018YFB1201403]

Abstract

In this study, a novel method for multiview clustering is proposed, which can explore both interfeature correlations and intercluster correlations to improve clustering performance.
Multiview clustering (MVC) has recently been the focus of much attention due to its ability to partition data from multiple views via view correlations. However, most MVC methods only learn either interfeature correlations or intercluster correlations, which may lead to unsatisfactory clustering performance. To address this issue, we propose a novel dual-correlated multivariate information bottleneck (DMIB) method for MVC. DMIB is able to explore both interfeature correlations (the relationship among multiple distinct feature representations from different views) and intercluster correlations (the close agreement among clustering results obtained from individual views). For the former, we integrate both view-shared feature correlations discovered by learning a shared discriminative feature subspace and view-specific feature information to fully explore the interfeature correlation. This allows us to attain multiple reliable local clustering results for different views. Following this, we explore the intercluster correlations by learning the shared mutual information over different local clusterings for an improved global partition. By integrating both correlations, we formulate the problem as a unified information maximization function and further design a two-step method for optimization. Moreover, we theoretically prove the convergence of the proposed algorithm and discuss the relationships between our method and several existing clustering paradigms. The experimental results on multiple datasets demonstrate the superiority of DMIB over several state-of-the-art clustering methods.
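The abstract describes a two-stage pipeline: local clustering within each view (driven by interfeature correlations), followed by a global partition that aggregates the agreement among the local results (intercluster correlations). The sketch below is only a minimal, hypothetical illustration of that pipeline, not the authors' DMIB objective or optimization: it substitutes k-means for the per-view step and a co-association spectral cut for the mutual-information-based fusion, and the toy data, cluster counts, and variable names are assumptions for demonstration.

# Minimal sketch of the two-stage idea in the abstract (assumed stand-ins,
# NOT the DMIB algorithm): per-view local clustering, then a consensus
# ("global") partition built from the agreement among the local clusterings.
import numpy as np
from sklearn.cluster import KMeans, SpectralClustering
from sklearn.metrics import normalized_mutual_info_score

rng = np.random.default_rng(0)

# Toy multiview data: two feature views of the same 150 samples (hypothetical).
n_samples, n_clusters = 150, 3
views = [rng.normal(size=(n_samples, 20)), rng.normal(size=(n_samples, 10))]

# Step 1: local clustering of each view (stand-in for the interfeature step).
local_labels = [
    KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(v)
    for v in views
]

# Step 2: co-association matrix = fraction of views that place each pair of
# samples in the same cluster, then a spectral cut on it (stand-in for the
# mutual-information-based intercluster fusion).
coassoc = np.mean(
    [(lab[:, None] == lab[None, :]).astype(float) for lab in local_labels],
    axis=0,
)
global_labels = SpectralClustering(
    n_clusters=n_clusters, affinity="precomputed", random_state=0
).fit_predict(coassoc)

# How much information the global partition shares with each local clustering.
for i, lab in enumerate(local_labels):
    print(f"view {i}: NMI(local, global) = "
          f"{normalized_mutual_info_score(lab, global_labels):.3f}")

In DMIB itself, both stages are instances of a single information maximization objective solved in two steps; the code above only mirrors the data flow (views -> local partitions -> agreement -> global partition) so the structure of the abstract is easier to follow.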

