Article

A Stochastic Quasi-Newton Method for Large-Scale Nonconvex Optimization With Applications

Journal

IEEE Transactions on Neural Networks and Learning Systems

Publisher

Institute of Electrical and Electronics Engineers (IEEE)

DOI: 10.1109/TNNLS.2019.2957843

Keywords

Optimization; Stochastic processes; Convergence; Logistics; Machine learning algorithms; Linear programming; Machine learning; Damped parameter; limited memory BFGS (LBFGS); nonconjugate exponential models; nonconvex optimization; stochastic quasi-Newton (SQN) method; variational inference

Abstract

Ensuring positive definiteness and avoiding ill conditioning of the Hessian update in the stochastic Broyden-Fletcher-Goldfarb-Shanno (BFGS) method are significant challenges in solving nonconvex problems. This article proposes a novel stochastic version of a damped and regularized BFGS method to address these problems. While the proposed regularization strategy helps to prevent the BFGS matrix from approaching singularity, the new damped parameter further ensures that the inner product of each correction pair remains positive. To reduce the computational cost of the stochastic limited memory BFGS (LBFGS) updates and to improve their robustness, the curvature information is updated using averaged iterates at spaced intervals. The effectiveness of the proposed method is evaluated on logistic regression and Bayesian logistic regression problems in machine learning. Numerical experiments are conducted using both a synthetic data set and several real data sets. The results show that the proposed method generally outperforms the stochastic damped LBFGS (SdLBFGS) method. In particular, for problems with small sample sizes, our method shows superior performance and mitigates ill conditioning. Furthermore, our method is more robust to variations of the batch size and memory size than the SdLBFGS method.
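As a rough illustration of the mechanics described in the abstract, the Python sketch below combines three of the named ingredients: a regularization shift of the gradient difference (y <- y + delta*s, standing in for the paper's regularized strategy), a Powell-style damping factor that keeps the inner product of each correction pair positive (standing in for the paper's damped parameter), and curvature updates computed from averaged iterates at spaced intervals, all on top of the classical LBFGS two-loop recursion. This is a minimal sketch under those assumptions, not the authors' published algorithm; the helper names (damped_pair, two_loop, stoch_grad) and all parameter values (delta, eps, the update interval L, memory m, step size) are illustrative choices.

import numpy as np

def damped_pair(s, y, gamma, delta=0.01, eps=0.25):
    # Illustrative damped, regularized correction pair (not the paper's exact rule).
    # The shift y <- y + delta*s nudges the curvature estimate away from
    # singularity; the Powell-style factor theta then enforces
    # s^T y_bar >= eps * gamma * s^T s, keeping the implicit Hessian
    # approximation positive definite.
    y_bar = y + delta * s                       # regularization shift
    sty = s @ y_bar
    sbs = gamma * (s @ s)                       # s^T B0 s with B0 = gamma * I
    if sty < eps * sbs:
        theta = (1.0 - eps) * sbs / (sbs - sty)
        y_bar = theta * y_bar + (1.0 - theta) * gamma * s   # Powell damping
    return s, y_bar

def two_loop(grad, pairs, gamma):
    # Classical LBFGS two-loop recursion: returns H_k @ grad with H0 = I / gamma.
    q = grad.copy()
    stack = []
    for s, y in reversed(pairs):                # newest pair first
        rho = 1.0 / (s @ y)
        a = rho * (s @ q)
        stack.append((rho, a, s, y))
        q -= a * y
    q /= gamma
    for rho, a, s, y in reversed(stack):        # oldest pair first
        b = rho * (y @ q)
        q += (a - b) * s
    return q

# Minimal usage on a synthetic l2-regularized logistic regression.
rng = np.random.default_rng(0)
n, d = 1000, 20
A = rng.normal(size=(n, d))
b = rng.choice([-1.0, 1.0], size=n)

def stoch_grad(x, batch):
    # Mini-batch gradient of mean log(1 + exp(-b_i a_i^T x)) + 1e-3/2 * ||x||^2.
    Ab = A[batch] * b[batch, None]
    p = 1.0 / (1.0 + np.exp(Ab @ x))            # sigmoid of the negative margin
    return -(Ab * p[:, None]).mean(axis=0) + 1e-3 * x

x = np.zeros(d)
pairs, gamma = [], 1.0
L, m, lr = 10, 5, 0.05                          # interval, memory, step size (illustrative)
x_sum, x_avg_prev = np.zeros(d), x.copy()
for k in range(1, 201):
    batch = rng.integers(0, n, size=32)
    g = stoch_grad(x, batch)
    x -= lr * (two_loop(g, pairs, gamma) if pairs else g)
    x_sum += x
    if k % L == 0:                              # spaced-interval curvature update
        x_avg = x_sum / L
        x_sum[:] = 0.0
        s = x_avg - x_avg_prev
        y = stoch_grad(x_avg, batch) - stoch_grad(x_avg_prev, batch)
        if s @ s > 1e-12:
            gamma = max((y @ y) / max(abs(s @ y), 1e-12), 1e-3)  # curvature scale
            pairs.append(damped_pair(s, y, gamma))
            pairs = pairs[-m:]                  # keep only the last m pairs
        x_avg_prev = x_avg

Computing the correction pairs from averaged iterates only every L steps, as in the k % L == 0 branch above, decouples the curvature estimate from the noise of individual stochastic gradients; this is the robustness mechanism the abstract attributes to the spaced-interval updates.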
