Article

CONSTRAINED, GLOBAL OPTIMIZATION OF UNKNOWN FUNCTIONS WITH LIPSCHITZ CONTINUOUS GRADIENTS

Journal

SIAM JOURNAL ON OPTIMIZATION
Volume 32, Issue 2, Pages 1239-1264

Publisher

SIAM PUBLICATIONS
DOI: 10.1137/20M1380879

Keywords

constrained; global optimization; Lipschitz continuous gradients; covering methods; nonconvex optimization

Funding

  1. Air Force Office of Scientific Research [FA9550-19-1-0005]
  2. National Aeronautics and Space Administration [80NSSC19K0209]

Abstract

We present two first-order, sequential optimization algorithms to solve constrained optimization problems. We consider a black-box setting with a priori unknown, nonconvex objective and constraint functions that have Lipschitz continuous gradients. The proposed algorithms balance the exploration of the a priori unknown feasible space with the pursuit of global optimality within a prespecified finite number of first-order oracle calls. The first algorithm accommodates an infeasible start and provides either a near-optimal global solution or establishes infeasibility. However, the algorithm may produce infeasible iterates during the search. For a strongly convex constraint function and a feasible initial solution guess, the second algorithm returns a near-optimal global solution without any constraint violation. At each iteration, the algorithms identify the next query point by solving a nonconvex, quadratically constrained, quadratic program, characterized by the Lipschitz continuous gradient property of the functions and the oracle responses. In contrast to existing methods, the algorithms compute global suboptimality bounds at every iteration. They can also satisfy user-specified tolerances in the computed solution with near-optimal complexity in oracle calls for a large class of optimization problems. We implement the proposed algorithms using GUROBI, a commercial off-the-shelf solver.
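To make the subproblem structure concrete, the following is a minimal, illustrative sketch (not the paper's exact formulation) of the kind of nonconvex, quadratically constrained quadratic program the abstract describes: each oracle response, combined with the Lipschitz continuous gradient property (the descent lemma), yields a concave quadratic minorant of the objective and of the constraint, and minimizing the resulting lower envelope over a box gives both a candidate next query point and a global lower bound. All data below (query points, values, gradients, Lipschitz constants Lf and Lg, and the 2-D box) are hypothetical placeholders, and the model is built with the gurobipy interface to GUROBI.

```python
import gurobipy as gp
from gurobipy import GRB, quicksum

# Hypothetical oracle data at two query points in the box [-1, 1]^2:
# objective values/gradients (fval, fgrad) and constraint values/gradients (gval, ggrad).
X     = [[0.0, 0.0], [0.5, -0.5]]
fval  = [1.0, 0.7];  fgrad = [[1.0, -2.0], [0.5, 1.0]]
gval  = [-0.2, 0.1]; ggrad = [[0.3, 0.4], [-1.0, 0.2]]
Lf, Lg = 4.0, 3.0    # assumed Lipschitz constants of the gradients

m = gp.Model("lipschitz_lower_envelope")
m.Params.NonConvex = 2                       # the quadratic constraints below are nonconvex
y = m.addVars(2, lb=-1.0, ub=1.0, name="y")  # candidate next query point
t = m.addVar(lb=-GRB.INFINITY, name="t")     # epigraph variable for the objective envelope
m.setObjective(t, GRB.MINIMIZE)

for i in range(len(X)):
    dist2 = quicksum((y[j] - X[i][j]) * (y[j] - X[i][j]) for j in range(2))
    lin_f = fval[i] + quicksum(fgrad[i][j] * (y[j] - X[i][j]) for j in range(2))
    lin_g = gval[i] + quicksum(ggrad[i][j] * (y[j] - X[i][j]) for j in range(2))
    # Descent lemma: each sample yields a concave quadratic minorant of f and g.
    m.addQConstr(t >= lin_f - 0.5 * Lf * dist2)            # lower envelope of the objective
    m.addQConstr(lin_g - 0.5 * Lg * dist2 <= 0)            # relaxed (outer) feasible region

m.optimize()
if m.Status == GRB.OPTIMAL:
    print("candidate next query point:", [y[j].X for j in range(2)])
    print("global lower bound on the constrained minimum:", t.X)
```

Because the epigraph constraints sit above concave quadratics, the subproblem is a nonconvex QCQP, and its optimal value underestimates the true constrained minimum over the box; comparing it with the best feasible oracle value observed so far is one way the per-iteration global suboptimality bounds mentioned above can be obtained.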
