Article

Probabilistic learning vector quantization on manifold of symmetric positive definite matrices

Journal

NEURAL NETWORKS
Volume 142, Pages 105-118

Publisher

PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.neunet.2021.04.024

Keywords

Probabilistic learning vector quantization; Learning vector quantization; Symmetric positive definite matrices; Riemannian geodesic distances; Riemannian manifold

Funding

  1. National Natural Science Foundation of China [61803369, 51679213]
  2. Natural Science Foundation of Liaoning Province of China [20180520025]
  3. Frontier Science Research Project of the Chinese Academy of Sciences [QYZDY-SSW-JSC005]
  4. National Key Research and Development Program of China [2019YFC1408501]
  5. Basic Public Welfare Research Plan of Zhejiang Province, China [LGF20E090004]
  6. EC [721463]
  7. Marie Curie Actions (MSCA) [721463]

Summary

This paper develops a new classification method for manifold-valued data within the framework of probabilistic learning vector quantization. The algorithm is generalized to symmetric positive definite matrices equipped with the affine-invariant Riemannian metric; using the induced Riemannian distance and Riemannian gradient descent, a probabilistic learning Riemannian space quantization algorithm is derived. The proposed method demonstrates superior performance on synthetic data, image data, and motor imagery EEG data.

Abstract

In this paper, we develop a new classification method for manifold-valued data in the framework of probabilistic learning vector quantization. In many classification scenarios, the data can be naturally represented by symmetric positive definite matrices, which are inherently points living on a curved Riemannian manifold. Due to the non-Euclidean geometry of Riemannian manifolds, traditional Euclidean machine learning algorithms yield poor results on such data. In this paper, we generalize the probabilistic learning vector quantization algorithm to data points living on the manifold of symmetric positive definite matrices equipped with the Riemannian natural metric (affine-invariant metric). By exploiting the induced Riemannian distance, we derive the probabilistic learning Riemannian space quantization algorithm, obtaining the learning rule through Riemannian gradient descent. Empirical investigations on synthetic data, image data, and motor imagery electroencephalogram (EEG) data demonstrate the superior performance of the proposed method. © 2021 The Author(s). Published by Elsevier Ltd.
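As a rough illustration of the geometric machinery the abstract refers to, the minimal Python/NumPy sketch below computes the affine-invariant Riemannian distance between symmetric positive definite (SPD) matrices and moves a prototype along the geodesic toward a data point, the basic building block of Riemannian gradient-descent prototype updates. The function names and the toy usage are illustrative assumptions, not code or notation taken from the paper, and the sketch is not the authors' full probabilistic learning rule (which weights updates by class probabilities derived from the Riemannian distances).

    import numpy as np

    def _spd_power(S, p):
        # Matrix power of a symmetric positive definite matrix via eigendecomposition.
        w, V = np.linalg.eigh(S)
        return (V * w ** p) @ V.T

    def airm_distance(A, B):
        # Affine-invariant Riemannian distance:
        #   d(A, B) = || log(A^{-1/2} B A^{-1/2}) ||_F
        A_is = _spd_power(A, -0.5)
        w = np.linalg.eigvalsh(A_is @ B @ A_is)
        return float(np.sqrt(np.sum(np.log(w) ** 2)))

    def geodesic_step(W, X, t):
        # Move prototype W toward data point X along the connecting geodesic:
        #   W <- W^{1/2} (W^{-1/2} X W^{-1/2})^t W^{1/2},  0 <= t <= 1,
        # i.e. the Riemannian exponential map applied to t times the log map of X at W.
        W_s, W_is = _spd_power(W, 0.5), _spd_power(W, -0.5)
        return W_s @ _spd_power(W_is @ X @ W_is, t) @ W_s

    # Toy usage: stepping toward X strictly decreases the Riemannian distance to X.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((4, 4)); A = A @ A.T + 4 * np.eye(4)
    B = rng.standard_normal((4, 4)); B = B @ B.T + 4 * np.eye(4)
    print(airm_distance(A, B))                          # > 0
    print(airm_distance(geodesic_step(A, B, 0.1), B))   # smaller than the line above

In such schemes the step size t plays the role of a learning rate, and updating prototypes along geodesics (rather than by Euclidean averaging) keeps them on the SPD manifold, which is the motivation for the Riemannian formulation described in the abstract.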
