Article

Use Coupled LSTM Networks to Solve Constrained Optimization Problems

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TCCN.2022.3228584

Keywords

Optimization; Training; Iterative methods; Government; Unsupervised learning; Supervised learning; Robustness; Optimization method; Resource management; Neural networks


Gradient-based iterative algorithms are widely used to solve optimization problems, including resource sharing and network management. When system parameters change, iterative methods must compute a new solution from scratch, independent of the previous parameter settings. We therefore propose a learning approach that can quickly produce optimal solutions over a range of system parameters for constrained optimization problems. Two Coupled Long Short-Term Memory networks (CLSTMs) are proposed to find the optimal solution. The advantages of this framework are: (1) a near-optimal solution for a given problem instance can be obtained in a few iterations during inference, and (2) enhanced robustness, as the CLSTMs can be trained using system parameters whose distributions differ from those seen during inference. In this work, we analyze the relationship between minimizing the loss functions and solving the original constrained optimization problem for certain parameter settings. Extensive numerical experiments using datasets from Alibaba reveal that the solutions to a set of nonconvex optimization problems obtained by the CLSTMs reach within 90% or better of the corresponding optimum after 11 iterations, while the number of iterations and the CPU time are reduced by 81% and 33%, respectively, compared with gradient descent with momentum.
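The abstract positions the CLSTMs as a learned replacement for hand-designed iterative update rules, with gradient descent with momentum as the baseline. A minimal sketch of that baseline on a projected (constrained) toy problem may make the comparison concrete; the function names, step-size values, and the toy objective below are illustrative assumptions, not details from the paper:

```python
def gd_momentum(grad, project, x0, lr=0.1, beta=0.9, iters=100):
    """Projected gradient descent with momentum (toy sketch).

    grad:    gradient of the objective at a point
    project: projection onto the feasible set (enforces the constraint)
    """
    x, v = x0, 0.0
    for _ in range(iters):
        v = beta * v + grad(x)        # momentum accumulation
        x = project(x - lr * v)       # gradient step, then project back
    return x

# Toy instance: minimize (x - 3)^2 subject to 0 <= x <= 1.
# The constrained optimum sits at the boundary x = 1.
grad = lambda x: 2.0 * (x - 3.0)
project = lambda x: min(max(x, 0.0), 1.0)
x_star = gd_momentum(grad, project, x0=0.0)   # -> 1.0
```

In the paper's framework, the hand-tuned update `beta * v + grad(x)` would instead be produced by the trained CLSTMs, which is what allows near-optimal solutions in a few iterations across varying system parameters.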

