Journal
COMPUTERS & ELECTRICAL ENGINEERING
Volume 105, Issue -, Pages -
Publisher
PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.compeleceng.2022.108478
Keywords
OSS function; Parallel algorithm; Down-closed; Frank Wolfe
In this paper, we study the problem of maximizing a nonmonotone, nonnegative one-sided sigma-smooth (OSS) function G(x). For a down-closed polytope constraint, we propose two classes of parallel algorithms under deterministic and stochastic settings. For the deterministic nonmonotone OSS problem, we design the Jump-Start Parallel Frank Wolfe (JSPFW) algorithm, which obtains an (e^{-1} - e^{-2} - O(ε))-approximate solution; the JSPFW algorithm runs in O(log n / ε²) adaptive rounds and makes O(n log n / ε²) queries to the function and its gradient. For the stochastic OSS problem, we design the Stochastic Parallel Frank Wolfe (SPFW) algorithm, which obtains an (e^{-1} - e^{-2})(1 - O(ε))(1 - o(1))-approximate solution; the SPFW algorithm needs O(n²) adaptive rounds and O(n²) gradient queries.
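The abstract does not spell out the JSPFW/SPFW updates, but both build on the classical Frank-Wolfe scheme: repeatedly call a linear maximization oracle (LMO) over the feasible set and move toward the vertex it returns. As a generic, hedged illustration only (not the paper's algorithm), the sketch below runs Frank-Wolfe ascent on a toy concave objective over the down-closed set {x ≥ 0, Σx ≤ 1}; the objective, oracle, and step-size rule are standard textbook choices, not taken from the paper.

```python
import numpy as np

def frank_wolfe_max(grad, lmo, x0, steps=500):
    """Generic Frank-Wolfe ascent: step toward the vertex returned by the LMO."""
    x = x0.copy()
    for t in range(steps):
        v = lmo(grad(x))          # linear maximization oracle over the feasible set
        gamma = 2.0 / (t + 2.0)   # standard diminishing step size
        x = (1 - gamma) * x + gamma * v
    return x

def lmo_capped_simplex(g):
    """LMO for the down-closed set {x >= 0, sum(x) <= 1}: put all mass on the
    coordinate with the largest positive gradient entry, or return 0 if none."""
    v = np.zeros_like(g)
    i = int(np.argmax(g))
    if g[i] > 0:
        v[i] = 1.0
    return v

# Toy concave objective G(x) = -||x - c||^2, with c feasible (sum(c) = 0.8 <= 1),
# so the iterates should approach c.
c = np.array([0.2, 0.5, 0.1])
grad = lambda x: -2.0 * (x - c)
x_star = frank_wolfe_max(grad, lmo_capped_simplex, np.zeros(3))
```

Because every iterate is a convex combination of feasible points, x_star stays in the down-closed set without any projection step, which is the main appeal of Frank-Wolfe over projected gradient methods on polytopes.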