Article

An exact reformulation algorithm for large nonconvex NLPs involving bilinear terms

Journal

JOURNAL OF GLOBAL OPTIMIZATION
Volume 36, Issue 2, Pages 161-189

Publisher

SPRINGER
DOI: 10.1007/s10898-006-9005-4

Keywords

bilinear; convex relaxation; global optimization; NLP; reformulation-linearization technique; RRLT constraints


Many nonconvex nonlinear programming (NLP) problems of practical interest involve bilinear terms and linear constraints, potentially alongside other convex and nonconvex terms and constraints. In such cases, it may be possible to augment the formulation with additional linear constraints (a subset of Reformulation-Linearization Technique (RLT) constraints) that do not affect the feasible region of the original NLP but tighten that of its convex relaxation to the extent that some bilinear terms may be dropped from the problem formulation. We present an efficient graph-theoretical algorithm for effecting such exact reformulations of large, sparse NLPs. The global solution of the reformulated problem using spatial Branch-and-Bound algorithms is usually significantly faster than that of the original NLP. We illustrate this point by applying our algorithm to a set of pooling and blending global optimization problems.
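For context on the convex relaxations the abstract refers to: spatial Branch-and-Bound solvers commonly relax each bilinear term w = x*y over box bounds using the standard McCormick envelope, and RLT-style linear constraints serve to tighten (or, in this paper's exact reformulation, replace) such terms. The sketch below is not the paper's algorithm; it only illustrates the well-known McCormick under- and over-estimators, with all names being illustrative.

```python
# Sketch (not the paper's algorithm): the standard McCormick envelope
# for a bilinear term w = x*y over the box [xL, xU] x [yL, yU].
# These four linear inequalities define the convex relaxation of w = x*y
# that spatial Branch-and-Bound codes typically use.

def mccormick_bounds(x, y, xL, xU, yL, yU):
    """Return (lower, upper) values of the linear envelope of w = x*y at (x, y)."""
    # Under-estimators: from (x - xL)(y - yL) >= 0 and (xU - x)(yU - y) >= 0
    lower = max(xL * y + x * yL - xL * yL,
                xU * y + x * yU - xU * yU)
    # Over-estimators: from (xU - x)(y - yL) >= 0 and (x - xL)(yU - y) >= 0
    upper = min(xU * y + x * yL - xU * yL,
                xL * y + x * yU - xL * yU)
    return lower, upper

# The envelope sandwiches the true product everywhere on the box:
for x in [0.0, 0.5, 1.3, 2.0]:
    for y in [-1.0, 0.2, 3.0]:
        lo, hi = mccormick_bounds(x, y, 0.0, 2.0, -1.0, 3.0)
        assert lo <= x * y + 1e-9 and x * y <= hi + 1e-9
```

The gap between `lower` and `upper` is what the paper's RRLT constraints help close: adding valid linear constraints in the substituted variables can tighten the relaxation enough that some bilinear terms become redundant and can be dropped.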

