Article

Efficient differentiable quadratic programming layers: an ADMM approach

Journal

Computational Optimization and Applications
Volume 84, Issue 2, Pages 449-476

Publisher

Springer
DOI: 10.1007/s10589-022-00422-7

Keywords

Data-driven stochastic programming; Differentiable neural networks; Quadratic programming; ADMM


This paper presents an alternative network layer architecture based on ADMM for solving medium-sized quadratic programs, demonstrating computational advantages and efficiency over state-of-the-art layers.
Recent advances in neural-network architecture allow for seamless integration of convex optimization problems as differentiable layers in an end-to-end trainable neural network. Integrating medium- and large-scale quadratic programs into a deep neural network architecture, however, is challenging, as solving quadratic programs exactly by interior-point methods has worst-case cubic complexity in the number of variables. In this paper, we present an alternative network layer architecture based on the alternating direction method of multipliers (ADMM) that is capable of scaling to moderately sized problems with 100-1000 decision variables and thousands of training examples. Backward differentiation is performed by implicit differentiation of a customized fixed-point iteration. Simulated results demonstrate the computational advantage of the ADMM layer, which for medium-scale problems is approximately an order of magnitude faster than state-of-the-art layers. Furthermore, our novel backward-pass routine is computationally efficient in comparison with the standard approach based on unrolled differentiation or implicit differentiation of the KKT optimality conditions. We conclude with examples from portfolio optimization in the integrated prediction and optimization paradigm.
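
The abstract's two computational ingredients, an ADMM iteration for the quadratic program and a backward pass obtained by implicit differentiation of its fixed point, can be sketched compactly. The snippet below is a minimal JAX illustration, not the authors' implementation: it assumes an OSQP-style splitting for a QP with box constraints l <= Ax <= u, a fixed penalty rho, a fixed iteration budget, and a plain Neumann-series adjoint solve; the names `admm_map`, `fixed_point`, and `ITERS` are hypothetical.

```python
from functools import partial

import jax
import jax.numpy as jnp

ITERS = 300  # fixed iteration budget, for illustration only


def admm_map(theta, s, rho=1.0):
    """One ADMM iteration (OSQP-style splitting) for
    minimize 1/2 x'Px + q'x  subject to  l <= Ax <= u.
    The state s stacks the splitting variable z and the dual variable y."""
    P, q, A, l, u = theta
    m = A.shape[0]
    z, y = s[:m], s[m:]
    # x-update: a fixed linear system; in practice its factorization is
    # cached, so per-iteration cost is far below an interior-point step.
    x = jnp.linalg.solve(P + rho * A.T @ A, -q + A.T @ (rho * z - y))
    Ax = A @ x
    # z-update: projection onto the box [l, u]
    z_new = jnp.clip(Ax + y / rho, l, u)
    # dual (multiplier) update
    y_new = y + rho * (Ax - z_new)
    return jnp.concatenate([z_new, y_new])


@partial(jax.custom_vjp, nondiff_argnums=(0,))
def fixed_point(T, theta, s0):
    s = s0
    for _ in range(ITERS):
        s = T(theta, s)
    return s


def fixed_point_fwd(T, theta, s0):
    s_star = fixed_point(T, theta, s0)
    return s_star, (theta, s_star)


def fixed_point_bwd(T, res, g):
    theta, s_star = res
    _, vjp_theta = jax.vjp(lambda th: T(th, s_star), theta)
    _, vjp_s = jax.vjp(lambda s: T(theta, s), s_star)
    # Implicit differentiation of s* = T(theta, s*): solve the adjoint
    # system u = g + (dT/ds)^T u, here by a Neumann series, which assumes
    # the map is a contraction; the paper's customized routine differs.
    u = g
    for _ in range(ITERS):
        u = g + vjp_s(u)[0]
    return vjp_theta(u)[0], jnp.zeros_like(s_star)


fixed_point.defvjp(fixed_point_fwd, fixed_point_bwd)

# Toy usage: gradient of a loss on the ADMM fixed point with respect to the
# linear cost q, as needed when the QP sits inside a trainable network.
n, m = 4, 2
P, A = jnp.eye(n), jnp.ones((m, n))
l, u = -jnp.ones(m), jnp.ones(m)


def loss(q):
    s_star = fixed_point(admm_map, (P, q, A, l, u), jnp.zeros(2 * m))
    return jnp.sum(s_star ** 2)


print(jax.grad(loss)(jnp.ones(n)))
```

Because the adjoint system is solved at the fixed point, the backward pass stores only the converged iterate rather than every forward iterate, which is the memory and time advantage over unrolled differentiation that the abstract highlights.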
