Article

Continuous Newton-like Inertial Dynamics for Monotone Inclusions

Journal

SET-VALUED AND VARIATIONAL ANALYSIS
Volume 29, Issue 3, Pages 555-581

Publisher

SPRINGER
DOI: 10.1007/s11228-020-00564-y

Keywords

Damped inertial dynamics; Hessian damping; Maximally monotone operators; Newton method; Vanishing viscosity; Yosida regularization

Abstract

In a Hilbert framework H we study the convergence properties of a Newton-like inertial dynamical system governed by a general maximally monotone operator A : H → 2^H. When A is the subdifferential of a convex lower semicontinuous proper function, the dynamic corresponds to the introduction of Hessian-driven damping into the continuous version of the accelerated gradient method of Nesterov. As a result, the oscillations are significantly attenuated. Following the technique introduced by Attouch-Peypouquet (Math. Prog. 2019), the maximally monotone operator is replaced by its Yosida approximation with an appropriate adjustment of the regularization parameter. The introduction into the dynamic of the Newton-like correction term (corresponding to the Hessian-driven term in the case of convex minimization) provides a well-posed evolution system for which we obtain the weak convergence of the generated trajectories towards the zeros of A. We also obtain fast convergence of the velocities towards zero. The results tolerate the presence of errors and perturbations. We then specialize our results to the case where the operator A is the subdifferential of a convex lower semicontinuous function and obtain fast optimization results.
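
As a rough illustration (not taken from the paper), the Python sketch below integrates the smooth convex special case, where A = ∇f and the Newton-like correction reduces to the Hessian-driven damping term β ∇²f(x(t)) ẋ(t) in the system ẍ(t) + (α/t) ẋ(t) + β ∇²f(x(t)) ẋ(t) + ∇f(x(t)) = 0. The test function, integrator, step size, and parameter values are illustrative assumptions chosen only to show how the Hessian term attenuates the oscillations of the purely inertial (Nesterov-like) dynamic.

import numpy as np

# Hedged sketch: Hessian-driven damped inertial dynamics for f(x) = 0.5 x^T Q x,
#   x''(t) + (alpha/t) x'(t) + beta * Hess f(x) x'(t) + grad f(x) = 0.
# Illustrative setting, not the paper's exact system (which covers a general
# maximally monotone A via its Yosida approximation).

Q = np.diag([1.0, 100.0])          # ill-conditioned quadratic test problem
grad = lambda x: Q @ x
hess = lambda x: Q                 # constant Hessian for a quadratic

alpha = 3.1                        # vanishing viscous damping alpha/t, alpha > 3
dt, t0, T = 1e-3, 1.0, 20.0        # semi-implicit Euler discretization

def simulate(beta):
    x = np.array([1.0, 1.0])
    v = np.zeros(2)
    t = t0
    while t < T:
        # acceleration read off from the second-order ODE above
        a = -(alpha / t) * v - beta * (hess(x) @ v) - grad(x)
        v += dt * a
        x += dt * v
        t += dt
    return x

print("final |x|, beta=0.0:", np.linalg.norm(simulate(0.0)))   # oscillatory
print("final |x|, beta=0.5:", np.linalg.norm(simulate(0.5)))   # damped

Running the two cases side by side shows the trajectory with beta > 0 settling towards the unique zero of ∇f noticeably more smoothly, which is the qualitative effect the abstract describes.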
