Article

HIGHER-ORDER METHODS FOR CONVEX-CONCAVE MIN-MAX OPTIMIZATION AND MONOTONE VARIATIONAL INEQUALITIES

Journal

SIAM JOURNAL ON OPTIMIZATION
Volume 32, Issue 3, Pages 2208-2229

Publisher

SIAM PUBLICATIONS
DOI: 10.1137/21M1395764

Keywords

variational inequalities; min-max; convex-concave; saddle point problem; zero-sum game; higher-order optimization

Abstract

This study presents improved convergence rates for constrained convex-concave min-max problems and monotone variational inequalities with higher-order smoothness. The proposed algorithm achieves an improved iteration complexity when higher-order derivatives are Lipschitz continuous, improving upon existing first- and second-order methods.
We provide improved convergence rates for constrained convex-concave min-max problems and monotone variational inequalities with higher-order smoothness. In min-max settings where the pth-order derivatives are Lipschitz continuous, we give an algorithm that achieves an iteration complexity of O(1/T^((p+1)/2)) when given access to an oracle for finding a fixed point of a pth-order equation. We give analogous rates for the weak monotone variational inequality problem. For p > 2, our results improve upon the iteration complexity of the first-order Mirror Prox method by Nemirovski [SIAM J. Optim., 15 (2004), pp. 229-251] and the second-order method by Monteiro and Svaiter [SIAM J. Optim., 22 (2012), pp. 914-935]. We further instantiate our entire algorithm in the unconstrained p = 2 case.
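For orientation (this instantiation is ours, not quoted from the paper), plugging small values of p into the O(1/T^((p+1)/2)) rate recovers the known baselines and shows where the improvement begins:

p = 1: O(1/T) (the rate of first-order Mirror Prox)
p = 2: O(1/T^(3/2)) (the rate of the second-order Monteiro-Svaiter method)
p = 3: O(1/T^2) (strictly better, consistent with the claim for p > 2)

As a further point of reference, the following Python is a minimal sketch of the first-order baseline only: Euclidean Mirror Prox (extragradient) applied to the unconstrained bilinear saddle problem min_x max_y x^T A y. It is not the paper's higher-order algorithm, and the function name, step-size choice, and problem instance are assumptions made for this sketch.

import numpy as np

def extragradient_bilinear(A, steps=2000, eta=None, seed=0):
    """Euclidean Mirror Prox (extragradient) for min_x max_y x^T A y.

    The associated monotone operator is F(x, y) = (A y, -A^T x); on this
    problem the iterates approach the saddle point at the origin.
    """
    rng = np.random.default_rng(seed)
    n, m = A.shape
    x, y = rng.standard_normal(n), rng.standard_normal(m)
    if eta is None:
        eta = 0.5 / np.linalg.norm(A, 2)  # step size below 1/L, with L the spectral norm of A
    x_sum, y_sum = np.zeros(n), np.zeros(m)
    for _ in range(steps):
        # Extrapolation step: evaluate the operator at the current point.
        x_half = x - eta * (A @ y)
        y_half = y + eta * (A.T @ x)
        # Update step: re-evaluate the operator at the extrapolated point.
        x = x - eta * (A @ y_half)
        y = y + eta * (A.T @ x_half)
        x_sum += x
        y_sum += y
    return x_sum / steps, y_sum / steps  # averaged iterates, as in Mirror Prox analyses

if __name__ == "__main__":
    A = np.array([[1.0, 2.0], [0.5, -1.0]])
    x_bar, y_bar = extragradient_bilinear(A)
    print("averaged iterates:", x_bar, y_bar)  # both should be close to zero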
