4.7 Article

Riemannian gradient methods for stochastic composition problems

Related references

Note: only a subset of the references is listed here; download the original article for the complete reference information.
Review Mathematics, Applied

Distributionally Robust Optimization: A Review on Theory and Applications

Fengming Lin et al.

Summary: This paper surveys the primary research on the theory and applications of distributionally robust optimization (DRO). It reviews the modeling power and computational attractiveness of DRO approaches, summarizes efficient solution methods, performance guarantees, and convergence analysis. Additionally, the paper illustrates applications of DRO in machine learning and operations research, and discusses future research directions.
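
As a concrete instance of the worst-case computations such surveys cover, the minimal sketch below evaluates the worst-case expected loss over a KL-divergence ambiguity ball via its standard dual reformulation, which reduces the problem to a one-dimensional minimization. The duality itself is textbook material; the function name kl_dro_worst_case, the radius rho, and the toy data are illustrative assumptions, not code from the paper.

# Minimal sketch: worst-case mean over a KL ball around the empirical
# distribution, via the dual
#   sup_{KL(Q||P) <= rho} E_Q[loss] = min_{lam > 0} lam*rho + lam*log E_P[exp(loss/lam)]
import numpy as np
from scipy.special import logsumexp
from scipy.optimize import minimize_scalar

def kl_dro_worst_case(losses, rho):
    """Worst-case mean of `losses` over all reweightings within KL radius rho."""
    n = len(losses)
    def dual(lam):
        # lam*rho + lam*log((1/n) * sum_i exp(losses_i / lam)), computed stably
        return lam * rho + lam * (logsumexp(losses / lam) - np.log(n))
    return minimize_scalar(dual, bounds=(1e-6, 1e6), method="bounded").fun

losses = np.random.default_rng(0).normal(loc=1.0, scale=0.5, size=1000)
print(losses.mean(), kl_dro_worst_case(losses, rho=0.1))  # worst case exceeds the mean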

NUMERICAL ALGEBRA, CONTROL AND OPTIMIZATION (2022)

Article Computer Science, Artificial Intelligence

Orthogonal Deep Neural Networks

Shuai Li et al.

Summary: This paper introduces algorithms for Orthogonal Deep Neural Networks (OrthDNNs), connecting them to the recent interest in spectrally regularized deep learning methods in order to improve generalization performance. Theoretical analyses and experiments demonstrate that OrthDNNs attain local isometry on practical data distributions, leading to better optimization of network weights. The proposed algorithms, including strict and approximate OrthDNNs together with the SVB and BBN methods, prove effective and efficient on benchmark image classification tasks.
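
The SVB step mentioned in the summary keeps layer weights near-orthogonal by bounding their singular values around 1 after each update. Below is a minimal numpy sketch of that projection; the tolerance eps and the exact clipping band are illustrative assumptions, not the paper's settings.

import numpy as np

def singular_value_bounding(W, eps=0.05):
    # Project the singular values of W into [1/(1+eps), 1+eps], so the layer
    # acts as a near-isometry; in SVB-style training this runs after each update.
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    s = np.clip(s, 1.0 / (1.0 + eps), 1.0 + eps)
    return U @ (s[:, None] * Vt)

W = 0.2 * np.random.default_rng(1).normal(size=(64, 64))
W_svb = singular_value_bounding(W)
print(np.linalg.svd(W_svb, compute_uv=False)[[0, -1]])  # extreme singular values near 1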

IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE (2021)

Article Engineering, Electrical & Electronic

Solving Stochastic Compositional Optimization is Nearly as Easy as Solving Stochastic Optimization

Tianyi Chen et al.

Summary: Stochastic compositional optimization generalizes classic stochastic optimization to the minimization of compositions of functions, with applications in reinforcement learning and meta-learning. The proposed Stochastically Corrected Stochastic Compositional gradient method (SCSC) converges at the same rate as SGD does for non-compositional stochastic optimization and can be accelerated with standard SGD techniques; applying Adam to SCSC achieves state-of-the-art performance in stochastic compositional optimization, as tested on model-agnostic meta-learning tasks.
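
To make the "stochastically corrected" idea concrete, here is a minimal toy sketch in the spirit of SCSC for min_x f(g(x)): a running estimate y of the inner value g(x) is first corrected by the sampled change g(x_new) - g(x_old), evaluated with one shared sample, before being mixed with the fresh sample. The toy problem, step sizes, and exact update ordering are my illustrative assumptions, not the paper's algorithmic details or experiments.

import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(5, 3))              # toy: g(x) = A x + noise, f(y) = 0.5*||y||^2,
x = rng.normal(size=3)                   # so we minimize 0.5*||A x||^2 from noisy oracles
y = A @ x + 0.01 * rng.normal(size=5)    # running estimate of g(x)
alpha, beta = 0.02, 0.5
for _ in range(2000):
    x_new = x - alpha * A.T @ y          # step along grad g(x)^T grad f(y), with grad f(y) = y
    noise = 0.01 * rng.normal(size=5)    # one shared sample for both inner evaluations
    g_new, g_old = A @ x_new + noise, A @ x + noise
    y = (1 - beta) * (y + g_new - g_old) + beta * g_new   # corrected tracking of g(x_new)
    x = x_new
print(np.linalg.norm(x))  # driven toward 0, the minimizer of 0.5*||A x||^2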

IEEE TRANSACTIONS ON SIGNAL PROCESSING (2021)

Article Computer Science, Artificial Intelligence

Gradient-based Learning Methods Extended to Smooth Manifolds Applied to Automated Clustering

Alkis Koudounas et al.

JOURNAL OF ARTIFICIAL INTELLIGENCE RESEARCH (2020)

Article Mathematics, Applied

Riemannian Stochastic Variance Reduced Gradient Algorithm with Retraction and Vector Transport

Hiroyuki Sato et al.

SIAM JOURNAL ON OPTIMIZATION (2019)

Article Computer Science, Information Systems

Complete Dictionary Recovery Over the Sphere II: Recovery by Riemannian Trust-Region Method

Ju Sun et al.

IEEE TRANSACTIONS ON INFORMATION THEORY (2017)

Article Mathematics, Applied

A Broyden Class of Quasi-Newton Methods for Riemannian Optimization

Wen Huang et al.

SIAM JOURNAL ON OPTIMIZATION (2015)

Article Engineering, Electrical & Electronic

Empirical Arithmetic Averaging Over the Compact Stiefel Manifold

Tetsuya Kaneko et al.

IEEE TRANSACTIONS ON SIGNAL PROCESSING (2013)

Article Mathematics, Applied

Low-Rank Matrix Completion by Riemannian Optimization

Bart Vandereycken

SIAM JOURNAL ON OPTIMIZATION (2013)

Article Computer Science, Artificial Intelligence

Extended Hamiltonian Learning on Riemannian Manifolds: Theoretical Aspects

Simone Fiori

IEEE TRANSACTIONS ON NEURAL NETWORKS (2011)

Article Computer Science, Artificial Intelligence

Learning by Natural Gradient on Noncompact Matrix-Type Pseudo-Riemannian Manifolds

Simone Fiori

IEEE TRANSACTIONS ON NEURAL NETWORKS (2010)

Article Computer Science, Artificial Intelligence

Lie-group-type neural system learning by manifold retractions

Simone Fiori

NEURAL NETWORKS (2008)