Article

Approximation of Lipschitz Functions Using Deep Spline Neural Networks

Publisher

SIAM PUBLICATIONS
DOI: 10.1137/22M1504573

Keywords

deep learning; learnable activations; universality; robustness; Lipschitz continuity; linear splines

Abstract

Although Lipschitz-constrained neural networks have many applications in machine learning, the design and training of expressive Lipschitz-constrained networks is very challenging. Since the popular rectified linear-unit networks have provable disadvantages in this setting, we propose using learnable spline activation functions with at least three linear regions instead. We prove that our choice is universal among all componentwise 1-Lipschitz activation functions in the sense that no other weight-constrained architecture can approximate a larger class of functions. Additionally, our choice is at least as expressive as the recently introduced non-componentwise GroupSort activation function for spectral-norm-constrained weights. The theoretical findings of this paper are consistent with previously published numerical results.
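
As a rough illustration of the architecture the abstract describes, the PyTorch sketch below implements a componentwise learnable linear-spline activation with three linear regions whose slopes are clamped to [-1, 1], composed with spectral-norm-constrained linear layers. It is a minimal sketch under stated assumptions, not the authors' implementation; the names (ThreeRegionSpline, the knot position t, the layer widths) are illustrative choices.

```python
# Minimal sketch (not the authors' code) of a 1-Lipschitz network with
# learnable three-region linear-spline activations and spectral-norm-
# constrained weights, as described in the abstract.
import torch
import torch.nn as nn

class ThreeRegionSpline(nn.Module):
    """Componentwise linear spline with three regions: one slope below -t,
    one on [-t, t], one above t. Clamping each learnable slope to [-1, 1]
    keeps the activation 1-Lipschitz."""
    def __init__(self, num_channels, t=1.0):
        super().__init__()
        self.t = t
        # One learnable slope triple per channel; slopes of 1 give the identity.
        self.slopes = nn.Parameter(torch.ones(num_channels, 3))

    def forward(self, x):
        s = self.slopes.clamp(-1.0, 1.0)           # enforce 1-Lipschitz slopes
        s_left, s_mid, s_right = s[:, 0], s[:, 1], s[:, 2]
        t = self.t
        left = s_left * (x.clamp(max=-t) + t)      # contribution below -t
        mid = s_mid * x.clamp(-t, t)               # contribution on [-t, t]
        right = s_right * (x.clamp(min=t) - t)     # contribution above +t
        return left + mid + right                  # continuous piecewise-linear map

def lipschitz_block(d_in, d_out):
    # spectral_norm rescales the weight to unit spectral norm (power iteration),
    # so each affine layer is 1-Lipschitz in the l2 norm.
    return nn.utils.spectral_norm(nn.Linear(d_in, d_out))

net = nn.Sequential(
    lipschitz_block(2, 32), ThreeRegionSpline(32),
    lipschitz_block(32, 32), ThreeRegionSpline(32),
    lipschitz_block(32, 1),
)
x = torch.randn(8, 2)
print(net(x).shape)  # torch.Size([8, 1])
```

Since a composition of 1-Lipschitz maps is 1-Lipschitz, clamping the per-channel slopes and normalizing each weight matrix to unit spectral norm makes the whole network 1-Lipschitz in the l2 norm, which is the constraint setting the abstract refers to.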
