Article

Flexible Affinity Matrix Learning for Unsupervised and Semisupervised Classification

Journal

IEEE Transactions on Neural Networks and Learning Systems

Publisher

Institute of Electrical and Electronics Engineers (IEEE)
DOI: 10.1109/TNNLS.2018.2861839

Keywords

Affinity matrix; clustering; low-rank representation (LRR); sparse representation

Funding

  1. National Key R&D Program of China [2018YFB1003201]
  2. Key Research Program of Frontier Sciences, CAS [QYZDY-SSW-JSC044]
  3. Natural Science Foundation of China [61772141, 61773328, 61761130079]
  4. Hong Kong Polytechnic University [G-YBD9]
  5. Guangdong Provincial Natural Science Foundation [17ZK0422, 2018B030311007]
  6. Guangzhou Science and Technology Planning Project [201804010347]


In this paper, we propose a unified model called flexible affinity matrix learning (FAML) for unsupervised and semisupervised classification that exploits both the relationships among data and the clustering structure simultaneously. To capture the relationships among data, we exploit the self-expressiveness property of data to learn a structured matrix whose structure is induced by different norms. A rank constraint is imposed on the Laplacian matrix of the desired affinity matrix so that the number of connected components in the data graph is exactly equal to the number of clusters; the clustering structure is therefore explicit in the learned affinity matrix. By making the estimated affinity matrix approximate the structured matrix during learning, FAML adaptively adjusts the affinity matrix so that it captures both the relationships among data and the clustering structure, giving FAML the potential to outperform related methods. We derive optimization algorithms to solve the corresponding problems. Extensive unsupervised and semisupervised classification experiments on both synthetic data and real-world benchmark data sets show that the proposed FAML consistently outperforms state-of-the-art methods.
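The rank constraint mentioned in the abstract rests on a standard result from spectral graph theory: the unnormalized Laplacian L = D − S of an affinity matrix S has exactly c zero eigenvalues when the graph of S has c connected components. The following is a minimal illustrative sketch of that property (not the authors' FAML code), using NumPy and a hypothetical toy block-diagonal affinity matrix:

```python
import numpy as np

def laplacian(S):
    """Unnormalized graph Laplacian L = D - S of a symmetric affinity matrix."""
    return np.diag(S.sum(axis=1)) - S

# Toy affinity matrix with 3 disconnected all-ones blocks (3 "clusters").
# This block structure is an assumption made purely for illustration.
blocks = [np.ones((3, 3)), np.ones((2, 2)), np.ones((4, 4))]
n = sum(B.shape[0] for B in blocks)
S = np.zeros((n, n))
idx = 0
for B in blocks:
    k = B.shape[0]
    S[idx:idx + k, idx:idx + k] = B
    idx += k

L = laplacian(S)
eigvals = np.linalg.eigvalsh(L)          # eigenvalues in ascending order
num_zero = int(np.sum(np.abs(eigvals) < 1e-10))
print(num_zero)                          # prints 3: one zero eigenvalue per component
```

Constraining rank(L) = n − c during learning therefore forces the learned affinity graph to have exactly c connected components, which is why the clustering structure becomes explicit in the affinity matrix itself.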

