Article

An exact reformulation algorithm for large nonconvex NLPs involving bilinear terms

Journal

JOURNAL OF GLOBAL OPTIMIZATION
Volume 36, Issue 2, Pages 161-189

Publisher

SPRINGER
DOI: 10.1007/s10898-006-9005-4

Keywords

bilinear; convex relaxation; global optimization; NLP; reformulation-linearization technique; RRLT constraints


Many nonconvex nonlinear programming (NLP) problems of practical interest involve bilinear terms and linear constraints, as well as, potentially, other convex and nonconvex terms and constraints. In such cases, it may be possible to augment the formulation with additional linear constraints (a subset of Reformulation-Linearization Technique constraints) which do not affect the feasible region of the original NLP but tighten that of its convex relaxation, to the extent that some bilinear terms may be dropped from the problem formulation. We present an efficient graph-theoretical algorithm for effecting such exact reformulations of large, sparse NLPs. The global solution of the reformulated problem using spatial Branch-and-Bound algorithms is usually significantly faster than that of the original NLP. We illustrate this point by applying our algorithm to a set of pooling and blending global optimization problems.
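The core idea in the abstract can be sketched with a toy example (an illustration of the general RLT principle, not the paper's specific graph-theoretical algorithm; the constraint and variable names are chosen here for exposition). Multiplying a linear constraint, say x1 + x2 = 1, by another variable x3 produces a constraint that is linear in the bilinear terms w_ij = x_i * x_j, holds exactly on the original feasible region, and allows one bilinear term to be eliminated:

```python
# Illustrative RLT-style derivation (hypothetical example, not the paper's algorithm).
# Linear constraint:          x1 + x2 = 1
# Bilinear terms:             w13 = x1*x3,  w23 = x2*x3
# Multiplying the constraint by x3 gives the exact linear relation
#     w13 + w23 = x3,
# so w13 can be replaced by x3 - w23, dropping one bilinear term.

def rrlt_residual(x1: float, x2: float, x3: float) -> float:
    """Residual of the derived constraint w13 + w23 - x3 at a feasible point."""
    assert abs(x1 + x2 - 1.0) < 1e-12, "point must satisfy x1 + x2 = 1"
    w13, w23 = x1 * x3, x2 * x3
    return w13 + w23 - x3

# Every point satisfying x1 + x2 = 1 satisfies the derived constraint exactly:
print(rrlt_residual(0.3, 0.7, 5.0))  # → 0.0
```

Because the derived constraint is an identity on the original feasible region, adding it (and substituting out w13) is an exact reformulation, while in the convex relaxation, where w13 and w23 are independent variables, it cuts off relaxation-only points and tightens the bound.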

