Article

Demixed Sparse Principal Component Analysis Through Hybrid Structural Regularizers

Journal

IEEE ACCESS
Volume 9, Issue -, Pages 103075-103090

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/ACCESS.2021.3098614

Keywords

Task analysis; Principal component analysis; Optimization; Neurons; Neural activity; Decoding; Matrix decomposition; Dimensionality reduction; sparse principal component analysis; demixed principal component analysis; parallel proximal algorithms

Funding

  1. National Key Research and Development Program of China [2018AAA0100500]
  2. Fundamental Research Funds for the Central Universities
  3. Jiangsu Provincial Key Laboratory of Network and Information Security [BM2003201]
  4. Key Laboratory of Computer Network and Information Integration of Ministry of Education of China [93K-9]

The demixed sparse principal component analysis (dSPCA) approach proposed in this work greatly improves the interpretability of multivariate data and addresses several limitations of traditional SPCA. By optimizing a loss function over marginalized data to demix the dependencies between task parameters and feature responses, and by accelerating the optimization with a parallel proximal algorithm, the method separates neural activity into distinct task parameters and enables visualization of the demixed components.
Recently, the sparse representation of multivariate data has gained great popularity in real-world applications such as neural activity analysis. Many previous analyses of these data use sparse principal component analysis (SPCA) to obtain a sparse representation. However, l0-norm-based SPCA suffers from non-differentiability and local-optimum problems due to its non-convex regularization. Additionally, extracting the dependencies between task parameters and feature responses is essential for further analysis, while SPCA usually generates components without demixing these dependencies. To address these problems, we propose a novel approach, demixed sparse principal component analysis (dSPCA), that relaxes the non-convex constraints into convex regularizers, e.g., the l1-norm and the nuclear norm, and demixes the dependencies of the feature response on various task parameters by optimizing the loss function with marginalized data. The sparse and demixed components greatly improve the interpretability of the multivariate data. We also develop a parallel proximal algorithm to accelerate the optimization for the hybrid regularizers in our method. We provide theoretical analyses of the error bound and convergence. We apply our method to simulated datasets to evaluate its time cost, its ability to explain the demixed information, and its ability to recover sparsity in the reconstructed data. Finally, we successfully separate neural activity into different task parameters, such as stimulus or decision, and visualize the demixed components on a real-world dataset.
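The convex relaxation described in the abstract hinges on the proximal operators of the two regularizers: soft-thresholding for the l1-norm and singular-value thresholding for the nuclear norm, which a parallel proximal scheme can evaluate independently and combine. The sketch below illustrates these two operators and a simple averaged parallel step; it is an illustrative reconstruction under standard definitions, not the authors' implementation, and `parallel_prox_step` is a hypothetical helper name.

```python
import numpy as np

def prox_l1(X, tau):
    """Soft-thresholding: proximal operator of tau * ||X||_1 (promotes sparsity)."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def prox_nuclear(X, tau):
    """Singular-value thresholding: proximal operator of tau * ||X||_* (promotes low rank)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ (np.maximum(s - tau, 0.0)[:, None] * Vt)

def parallel_prox_step(X, prox_tau_pairs):
    """One illustrative parallel proximal update: apply each proximal operator
    to the current iterate independently and average the results."""
    return np.mean([prox(X, tau) for prox, tau in prox_tau_pairs], axis=0)
```

Because each proximal operator acts on the same iterate, the per-regularizer updates can run concurrently, which is the source of the speedup that parallel proximal algorithms offer for hybrid regularizers.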
