Journal
MATHEMATICAL PROGRAMMING
Volume 192, Issue 1-2, Pages 89-118
Publisher
SPRINGER HEIDELBERG
DOI: 10.1007/s10107-021-01691-6
Keywords
Surrogate relaxation; MINLP; Nonconvex optimization
Funding
- Research Campus MODAL (BMBF) [05M14ZAM, 05M20ZBM]
- Institute for Data Valorization (IVADO)
This work explores the use of a nonconvex relaxation obtained through constraint aggregation for solving MINLPs, highlighting the computational advantages and challenges.
The most important ingredient for solving mixed-integer nonlinear programs (MINLPs) to global epsilon-optimality with spatial branch-and-bound is a tight, computationally tractable relaxation. Due to both theoretical and practical considerations, relaxations of MINLPs are usually required to be convex. Nonetheless, current optimization solvers can often successfully handle a moderate presence of nonconvexities, which opens the door to potentially tighter nonconvex relaxations. In this work, we exploit this fact and make use of a nonconvex relaxation obtained via aggregation of constraints: a surrogate relaxation. These relaxations were actively studied for linear integer programs in the 1970s and 1980s, but they have been scarcely considered since. We revisit them in an MINLP setting and show the computational benefits and challenges they can bring. Additionally, we study a generalization of this relaxation that allows multiple simultaneous aggregations and present the first algorithm capable of computing the best set of aggregations. We propose a multitude of computational enhancements for improving its practical performance and evaluate the algorithm's ability to generate strong dual bounds through extensive computational experiments.
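The abstract's core idea, a surrogate relaxation obtained by aggregating constraints with nonnegative multipliers, can be sketched on a toy binary program. The numbers below are hypothetical and chosen only for illustration; they do not come from the paper. Replacing the constraint system Ax >= b by the single aggregate (u^T A)x >= u^T b enlarges the feasible set, so the minimum over the relaxation is a valid dual (lower) bound, and its quality depends on the choice of u:

```python
from itertools import product

# Toy minimization IP (hypothetical data, for illustration only):
#   min c.x  s.t.  A x >= b,  x in {0,1}^3
c = [4, 3, 5]
A = [[2, 1, 3],
     [1, 2, 1]]
b = [4, 3]

def surrogate_bound(u):
    """Dual bound from the surrogate relaxation with multipliers u >= 0:
    replace A x >= b by the single aggregate (u^T A) x >= u^T b and
    minimize c.x over {0,1}^3 by brute force.  The aggregate constraint
    is implied by the original system, so the feasible set only grows
    and the minimum can only drop: the result bounds the optimum below."""
    agg = [sum(u[i] * A[i][j] for i in range(len(A))) for j in range(len(c))]
    rhs = sum(u[i] * b[i] for i in range(len(b)))
    vals = [sum(cj * xj for cj, xj in zip(c, x))
            for x in product((0, 1), repeat=len(c))
            if sum(aj * xj for aj, xj in zip(agg, x)) >= rhs]
    return min(vals) if vals else float("inf")

# True optimum by enumerating all of {0,1}^3 (attained at x = (0,1,1)).
opt = min(sum(cj * xj for cj, xj in zip(c, x))
          for x in product((0, 1), repeat=len(c))
          if all(sum(A[i][j] * x[j] for j in range(len(c))) >= b[i]
                 for i in range(len(A))))

print(opt)                      # 8
print(surrogate_bound([0, 1]))  # 7: a weak aggregation leaves a gap
print(surrogate_bound([1, 1]))  # 8: this aggregation closes the gap
```

Searching over multipliers u for the tightest such bound is the surrogate dual; the paper's contribution includes generalizing this to several simultaneous aggregations and computing the best set of them, which the brute-force sketch above does not attempt.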