Article

Parallelized maximization of nonmonotone one-sided σ-smooth function

Journal

COMPUTERS & ELECTRICAL ENGINEERING
Volume 105

Publisher

PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.compeleceng.2022.108478

Keywords

OSS function; Parallel algorithm; Down-closed; Frank-Wolfe


In this paper, we study the problem of maximizing a nonmonotone nonnegative one-sided σ-smooth (OSS) function G(x). For a down-closed basis polyhedron constraint, we propose two classes of parallel algorithms, one for the deterministic setting and one for the random setting. For the deterministic nonmonotone OSS problem, we design the Jump-Start Parallel Frank-Wolfe (JSPFW) algorithm, which achieves an approximation ratio of e^{-1} - e^{-2} - O(ε). The JSPFW algorithm runs in O(log n / ε²) adaptive rounds and makes O(n log n / ε²) queries to the function and its gradient. For the stochastic OSS problem, we design the Stochastic Parallel Frank-Wolfe (SPFW) algorithm, which achieves a (e^{-1} - e^{-2})(1 - O(ε))(1 - o(1)) approximation. The SPFW algorithm needs O(n²) adaptive rounds and O(n²) gradient queries.
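For readers unfamiliar with the Frank-Wolfe template that both algorithms build on, the sketch below shows a plain (sequential) Frank-Wolfe ascent over a simple down-closed polytope, the scaled simplex {x ≥ 0 : Σx ≤ budget}. This is a generic illustration only, not the paper's JSPFW or SPFW algorithms, which add low-adaptivity parallel rounds and guarantees for nonmonotone OSS objectives; the objective G(x) = Σx − ½‖x‖² used in the example is an assumption chosen for its simple gradient.

```python
def frank_wolfe_max(grad, n, budget=1.0, steps=200):
    """Generic Frank-Wolfe ascent over {x >= 0 : sum(x) <= budget},
    a simple down-closed polytope. Illustrative sketch only -- NOT the
    paper's JSPFW/SPFW algorithms."""
    x = [0.0] * n
    for t in range(steps):
        g = grad(x)
        # Linear maximization oracle over the polytope: put all mass on
        # the best coordinate, or stay at the origin if every gradient
        # entry is non-positive (the origin is feasible: down-closed set).
        i = max(range(n), key=lambda j: g[j])
        v = [0.0] * n
        if g[i] > 0:
            v[i] = budget
        gamma = 2.0 / (t + 2)  # classical diminishing step size
        x = [(1 - gamma) * xj + gamma * vj for xj, vj in zip(x, v)]
    return x

# Example objective (an assumption for illustration):
# G(x) = sum(x) - 0.5 * ||x||^2, with gradient 1 - x. On the unit
# simplex the maximizer is x = (1/3, 1/3, 1/3), where G = 5/6.
x_star = frank_wolfe_max(lambda x: [1.0 - xj for xj in x], n=3)
```

Each iteration only needs one gradient evaluation and one linear optimization over the constraint set; the parallel variants in the paper reduce the number of such sequential (adaptive) rounds.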

Authors


