Article

Block-cyclic stochastic coordinate descent for deep neural networks

Journal

NEURAL NETWORKS
Volume 139, Pages 348-357

Publisher

PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.neunet.2021.04.001

Keywords

Coordinate descent; Deep neural network; Energy optimization; Stochastic gradient descent

Funding

  1. National Research Foundation of Korea (NRF), Republic of Korea [2017R1A2B4006023, 2019K1A3A1A77074958]
  2. Office of Naval Research (ONR), USA [N00014-19-1-2229]
  3. National Research Foundation of Korea [2019K1A3A1A77074958, 2017R1A2B4006023] Funding Source: Korea Institute of Science & Technology Information (KISTI), National Science & Technology Information Service (NTIS)

Abstract

BCSC is a stochastic first-order optimization algorithm that adds a cyclic constraint to the selection of data and parameters, yielding higher accuracy in image classification. It limits the impact of outliers in the training set and provides better generalization within the same number of update iterations.
We present a stochastic first-order optimization algorithm, named block-cyclic stochastic coordinate descent (BCSC), that adds a cyclic constraint to stochastic block-coordinate descent in the selection of both data and parameters. It uses different subsets of the data to update different subsets of the parameters, thus limiting the detrimental effect of outliers in the training set. Empirical tests in image classification benchmark datasets show that BCSC outperforms state-of-the-art optimization methods in generalization leading to higher accuracy within the same number of update iterations. The improvements are consistent across different architectures and datasets, and can be combined with other training techniques and regularizations. (C) 2021 Elsevier Ltd. All rights reserved.
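The abstract describes the core idea: partition the parameters into blocks and cyclically pair each block with a different subset of the data, so no single (possibly outlier-laden) data subset drives every parameter update. The paper's exact update rule is not reproduced here; the following is a minimal NumPy sketch of that block-cyclic pairing on a toy least-squares problem, where the block count `B`, the learning rate, and the cyclic shift `(b + t) % B` are illustrative assumptions, not the authors' settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy least-squares problem: minimize ||X w - y||^2 over w.
n, d = 200, 8
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.01 * rng.normal(size=n)

B = 4  # number of blocks (assumed hyperparameter)
param_blocks = np.array_split(np.arange(d), B)          # partition of coordinates
data_blocks = np.array_split(rng.permutation(n), B)     # partition of samples

w = np.zeros(d)
lr = 0.01

for epoch in range(200):
    for t in range(B):  # one full cycle of data/parameter pairings
        for b, coords in enumerate(param_blocks):
            # Cyclic constraint: parameter block b is updated with data
            # block (b + t) mod B, so over a cycle every parameter block
            # is touched by every data subset exactly once.
            idx = data_blocks[(b + t) % B]
            Xb, yb = X[idx], y[idx]
            grad = 2.0 * Xb.T @ (Xb @ w - yb) / len(idx)
            w[coords] -= lr * grad[coords]  # update only this block
```

Because each parameter block sees every data subset over a cycle, an outlier-heavy subset influences each block only once per cycle rather than on every step, which is the intuition behind the robustness claim in the abstract.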

