Article

Oracle Complexity Separation in Convex Optimization

Related references

Note: only a partial list of references is shown; see the original article for the complete list.
Article Operations Research & Management Science

Zeroth-order methods for noisy Hölder-gradient functions

Innokentiy Shibaev et al.

Summary: This paper proves new complexity bounds for zeroth-order methods in non-convex optimization with inexact observations of the objective function values. Using the Gaussian smoothing approach, extended to functions with Hölder-continuous gradients and noisy zeroth-order oracles, the paper derives upper bounds on the admissible noise and establishes convergence to stationary points of both the smoothed and the original functions. Bounds on the noise level of the zeroth-order oracle are also discussed, along with an improved dependence on the dimension for ν = 1. (A schematic sketch of the smoothing estimator follows this entry.)

OPTIMIZATION LETTERS (2022)
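
To make the Gaussian smoothing approach mentioned above concrete, here is a minimal Python sketch of a zeroth-order gradient estimator; the function names, the smoothing parameter mu, the batch size, and the quadratic test objective are illustrative assumptions, not taken from the paper (which additionally handles oracle noise and Hölder-continuous gradients).

import numpy as np

def gaussian_smoothing_grad(f, x, mu=1e-4, batch=16, rng=None):
    # Zeroth-order estimate of grad f(x) via Gaussian smoothing:
    # (f(x + mu*u) - f(x)) / mu * u, averaged over u ~ N(0, I), is an
    # unbiased estimate of the gradient of the smoothed function
    # f_mu(x) = E[f(x + mu*u)].
    rng = rng or np.random.default_rng()
    fx = f(x)
    g = np.zeros_like(x)
    for _ in range(batch):
        u = rng.standard_normal(x.shape)
        g += (f(x + mu * u) - fx) / mu * u
    return g / batch

# Illustrative check on a smooth quadratic: the estimate is close to
# the true gradient, which is x itself.
f = lambda z: 0.5 * np.dot(z, z)
print(gaussian_smoothing_grad(f, np.ones(5)))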

Article

An approach for the nonconvex uniformly concave structured saddle point problem

Mohammad S. Alkousa et al.

Computer Research and Modeling (2022)

Article Management

An accelerated directional derivative method for smooth stochastic convex optimization

Pavel Dvurechensky et al.

Summary: This study proposes a new algorithm for smooth stochastic convex optimization with a directional-derivative oracle and provides complexity bounds for both non-accelerated and accelerated methods. The complexity bound of the non-accelerated algorithm matches that of its gradient-based counterpart, while the bound of the accelerated algorithm exceeds that of the accelerated gradient-based algorithm only by a factor of the square root of the problem dimension. (A schematic sketch of such an oracle-driven loop follows this entry.)

EUROPEAN JOURNAL OF OPERATIONAL RESEARCH (2021)
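
As a hedged illustration of the idea behind the entry above, the following Python sketch plugs an unbiased gradient estimate built from a directional-derivative oracle into a plain Nesterov-style momentum loop; the step size, the momentum schedule, and all names are assumptions for illustration, not the paper's exact method.

import numpy as np

def dd_grad_estimate(dir_deriv, x, rng):
    # For e uniform on the unit sphere, E[e e^T] = I / dim, so
    # dim * f'(x, e) * e is an unbiased estimate of grad f(x).
    e = rng.standard_normal(x.size)
    e /= np.linalg.norm(e)
    return x.size * dir_deriv(x, e) * e

def accelerated_dd_method(dir_deriv, x0, L, n_iters=2000, seed=0):
    # Textbook Nesterov momentum driven by the estimator above; the
    # step 1/(dim*L) accounts for the estimator's variance and is a
    # standard adjustment, not the paper's step-size schedule.
    rng = np.random.default_rng(seed)
    x = y = x0.copy()
    for k in range(n_iters):
        x_next = y - dd_grad_estimate(dir_deriv, y, rng) / (x0.size * L)
        y = x_next + k / (k + 3) * (x_next - x)
        x = x_next
    return x

# Illustrative use: f(x) = 0.5 * ||x||^2 has f'(x, e) = <x, e>.
sol = accelerated_dd_method(lambda x, e: np.dot(x, e), np.ones(4), L=1.0)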

Article Mathematics, Applied

Decentralized and parallel primal and dual accelerated methods for stochastic convex programming problems

Darina Dvinskikh et al.

Summary: This study introduces primal and dual stochastic gradient oracle methods for decentralized convex optimization problems that are optimal in the number of communication steps and optimal in the number of oracle calls per node up to a logarithmic factor and smoothness assumptions. Using the mini-batching technique, the proposed methods can additionally be parallelized at each node, making them applicable to many data science and inverse problems. (A schematic single round is sketched after this entry.)

JOURNAL OF INVERSE AND ILL-POSED PROBLEMS (2021)
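
The following Python sketch shows one schematic round of decentralized stochastic gradient descent with mini-batching, the two ingredients highlighted in the summary; the mixing matrix W, the step size, the noise model, and all names are illustrative assumptions, and the paper's accelerated primal and dual schemes are substantially more involved.

import numpy as np

def decentralized_minibatch_step(X, W, stoch_grad, lr, batch, rng):
    # X: (n_nodes, dim) array, row i is node i's iterate.
    # W: doubly stochastic mixing matrix of the communication graph;
    # W @ X is one round of gossip averaging with neighbours.
    # Each node averages `batch` independent stochastic gradients
    # (mini-batching); these calls can run in parallel at each node.
    G = np.zeros_like(X)
    for i in range(X.shape[0]):
        for _ in range(batch):
            G[i] += stoch_grad(X[i], rng)
        G[i] /= batch
    return W @ X - lr * G

# Illustrative use: 3 fully connected nodes, noisy gradients of
# f(x) = 0.5 * ||x||^2; iterates contract towards zero and consensus.
W = np.full((3, 3), 0.25) + 0.25 * np.eye(3)
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 4))
sg = lambda x, rng: x + 0.1 * rng.standard_normal(x.shape)
for _ in range(200):
    X = decentralized_minibatch_step(X, W, sg, lr=0.1, batch=8, rng=rng)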

Article Computer Science, Software Engineering

Inexact model: a framework for optimization and variational inequalities

Fedor Stonyakin et al.

Summary: This paper introduces a general algorithmic framework for first-order methods in optimization that reproduces known methods and allows new ones to be constructed. The framework applies to smooth and non-smooth problems without prior knowledge of the problem's smoothness. (A schematic reading of the model-based template follows this entry.)

OPTIMIZATION METHODS & SOFTWARE (2021)
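
The paper's framework is built on inexact (delta, L)-models and Bregman divergences; the Python sketch below is only a schematic reading of its "minimize a model around the current point" template, showing how a linear model recovers plain gradient descent. All names and the toy objective are illustrative assumptions.

import numpy as np

def inexact_model_method(model_argmin, x0, n_iters=100):
    # Generic loop: each iteration (approximately) minimizes a model
    # of the objective built around the current point; different
    # models recover different known first-order methods.
    x = x0
    for _ in range(n_iters):
        x = model_argmin(x)
    return x

# With the linear model psi(y, x) = <grad f(x), y - x> regularized by
# (L/2) * ||y - x||^2, the model minimizer is the gradient step:
grad_f = lambda x: 2.0 * x          # toy objective f(x) = ||x||^2
L = 2.0
x_star = inexact_model_method(lambda x: x - grad_f(x) / L, np.ones(3))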

Article Operations Research & Management Science

First-Order Methods for Convex Optimization

Pavel Dvurechensky et al.

Summary: This article surveys the importance and development of first-order methods for convex optimization problems, together with extensions and improvements such as the non-Euclidean proximal gradient method, projection methods, and proximal versions of primal-dual schemes. (A minimal proximal gradient sketch follows this entry.)

EURO JOURNAL ON COMPUTATIONAL OPTIMIZATION (2021)
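
As a concrete instance of the proximal gradient methods surveyed in the article, here is a minimal Euclidean sketch for an l1-regularized least-squares problem; the objective, the step size 1/L, and all names are illustrative assumptions rather than the survey's presentation.

import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(grad_f, x0, L, lam, n_iters=500):
    # Proximal gradient method for min_x f(x) + lam * ||x||_1:
    # a gradient step on the smooth part f, followed by the prox
    # of the non-smooth l1 term.
    x = x0.copy()
    for _ in range(n_iters):
        x = soft_threshold(x - grad_f(x) / L, lam / L)
    return x

# Illustrative use: f(x) = 0.5 * ||A x - b||^2, L = ||A||_2^2.
A = np.array([[1.0, 0.0], [0.0, 2.0]])
b = np.array([1.0, 1.0])
x = proximal_gradient(lambda x: A.T @ (A @ x - b), np.zeros(2),
                      L=np.linalg.norm(A, 2) ** 2, lam=0.1)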

Proceedings Paper Automation & Control Systems

Derivative-free method for composite optimization with applications to decentralized distributed optimization

Aleksandr Beznosikov et al.

IFAC PAPERSONLINE (2020)

Article Computer Science, Software Engineering

An optimal randomized incremental gradient method

Guanghui Lan et al.

MATHEMATICAL PROGRAMMING (2018)

Article Computer Science, Theory & Methods

Random Gradient-Free Minimization of Convex Functions

Yurii Nesterov et al.

FOUNDATIONS OF COMPUTATIONAL MATHEMATICS (2017)

Article Mathematics, Applied

Efficiency of the accelerated coordinate descent method on structured optimization problems

Yurii Nesterov et al.

SIAM JOURNAL ON OPTIMIZATION (2017)

Article Computer Science, Software Engineering

Gradient sliding for composite optimization

Guanghui Lan

MATHEMATICAL PROGRAMMING (2016)

Article Mathematics, Applied

Conditional gradient sliding for convex optimization

Guanghui Lan et al.

SIAM JOURNAL ON OPTIMIZATION (2016)

Article Mathematics, Applied

Accelerated, parallel, and proximal coordinate descent

Olivier Fercoq et al.

SIAM JOURNAL ON OPTIMIZATION (2015)

Article Computer Science, Software Engineering

Gradient methods for minimizing composite functions

Yu Nesterov

MATHEMATICAL PROGRAMMING (2013)

Article Mathematics, Applied

An accelerated hybrid proximal extragradient method for convex optimization and its implications to second-order methods

Renato D. C. Monteiro et al.

SIAM JOURNAL ON OPTIMIZATION (2013)

Article Mathematics, Applied

Efficiency of coordinate descent methods on huge-scale optimization problems

Yu Nesterov

SIAM JOURNAL ON OPTIMIZATION (2012)

Article Computer Science, Software Engineering

Smooth minimization of non-smooth functions

Yu Nesterov

MATHEMATICAL PROGRAMMING (2005)