4.5 Article

A Generalization Gap Estimation for Overparameterized Models via the Langevin Functional Variance

Related references

Note: Only a subset of the references is listed.
Article Computer Science, Hardware & Architecture

Understanding Deep Learning (Still) Requires Rethinking Generalization

Chiyuan Zhang et al.

Summary: Conventional explanations fail to account for the strong generalization of large neural networks: experiments show that state-of-the-art convolutional networks can easily fit completely random labels during training, indicating that some other mechanism must underlie their good performance in practice (see the sketch after this entry).

COMMUNICATIONS OF THE ACM (2021)
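
To make the random-label experiment concrete, here is a minimal PyTorch sketch, not the paper's architectures or datasets: the network size, optimizer, and synthetic data are illustrative assumptions. It shows an over-parameterized network driven to fit labels that carry no signal.

```python
# Minimal sketch (not the paper's setup): a small over-parameterized MLP
# trained on *randomly assigned* labels, illustrating the memorization
# phenomenon reported by Zhang et al. Inputs and labels are synthetic.
import torch
import torch.nn as nn

torch.manual_seed(0)
n, d, k = 256, 32, 10                      # samples, input dim, classes
X = torch.randn(n, d)                      # random inputs
y = torch.randint(0, k, (n,))              # labels carry NO signal

model = nn.Sequential(nn.Linear(d, 512), nn.ReLU(), nn.Linear(512, k))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for step in range(2000):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()

train_acc = (model(X).argmax(dim=1) == y).float().mean().item()
print(f"final loss {loss.item():.4f}, train accuracy {train_acc:.2%}")
# An over-parameterized network typically drives training accuracy toward
# 100% even though the labels are pure noise, so nothing it learned here
# can generalize.
```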

Article Multidisciplinary Sciences

Benign overfitting in linear regression

Peter L. Bartlett et al.

PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA (2020)
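
Bartlett et al. analyze the minimum-ℓ2-norm interpolator in overparameterized linear regression. The NumPy sketch below uses synthetic isotropic data with illustrative dimensions; the paper's results hinge on the covariance spectrum, which this toy setup does not reproduce. It only shows the estimator itself: interpolating the training data while still making predictions on fresh data.

```python
# Minimal sketch (illustrative only): the minimum-l2-norm interpolator
# theta_hat = X^T (X X^T)^{-1} y studied in the benign-overfitting setting,
# where p > n so infinitely many interpolators exist.
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 500                              # overparameterized: p >> n
theta_star = np.zeros(p)
theta_star[:5] = 1.0                        # a few strong signal directions
X = rng.normal(size=(n, p))
y = X @ theta_star + 0.1 * rng.normal(size=n)

# Minimum-norm solution among all theta satisfying X theta = y
theta_hat = X.T @ np.linalg.solve(X @ X.T, y)

print("train residual:", np.linalg.norm(X @ theta_hat - y))  # ~0: interpolates
X_test = rng.normal(size=(1000, p))
y_test = X_test @ theta_star
print("test MSE:", np.mean((X_test @ theta_hat - y_test) ** 2))
# In this isotropic toy case the test risk need not be small; Bartlett et al.
# characterize exactly which covariance spectra make interpolation "benign".
```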

Article Mathematics, Applied

Two Models of Double Descent for Weak Features

Mikhail Belkin et al.

SIAM JOURNAL ON MATHEMATICS OF DATA SCIENCE (2020)
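
The double-descent shape itself can be reproduced in a few lines. The sketch below is an illustrative minimum-norm least-squares setup, not Belkin et al.'s exact weak-features model: it sweeps the number of features used past the interpolation threshold p = n and prints the test risk.

```python
# Minimal sketch (illustrative): test risk of the minimum-norm least-squares
# fit as the number of used features p passes the interpolation threshold
# p = n, tracing a double-descent curve.
import numpy as np

rng = np.random.default_rng(0)
n, d = 40, 200
beta = rng.normal(size=d) / np.sqrt(d)      # "weak features": spread-out signal
X = rng.normal(size=(n, d))
y = X @ beta + 0.5 * rng.normal(size=n)
X_te = rng.normal(size=(2000, d))
y_te = X_te @ beta

for p in [10, 20, 35, 40, 45, 80, 200]:     # p = n = 40 is the threshold
    b = np.linalg.pinv(X[:, :p]) @ y        # min-norm fit on the first p columns
    risk = np.mean((X_te[:, :p] @ b - y_te) ** 2)
    print(f"p={p:4d}  test MSE={risk:8.3f}")
# The risk typically spikes near p = n and falls again for p >> n.
```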

Review Multidisciplinary Sciences

Deep learning

Yann LeCun et al.

NATURE (2015)

Article Computer Science, Artificial Intelligence

Monte-Carlo SURE: A black-box optimization of regularization parameters for general denoising algorithms

Sathish Ramani et al.

IEEE TRANSACTIONS ON IMAGE PROCESSING (2008)
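
Monte-Carlo SURE replaces the divergence term of Stein's unbiased risk estimate with a single random-probe estimate, so the denoiser can be treated as a black box. The sketch below follows that recipe on a toy soft-thresholding denoiser; the denoiser, signal model, and noise level are illustrative assumptions, not the paper's experiments.

```python
# Minimal sketch of Monte-Carlo SURE in the spirit of Ramani et al.:
# SURE(lam) = (1/n)||y - f(y)||^2 - sigma^2 + (2 sigma^2 / n) div f(y),
# with the divergence approximated by a single random probe b:
# div f(y) ~ b^T (f(y + eps*b) - f(y)) / eps.
import numpy as np

rng = np.random.default_rng(0)
n, sigma = 10_000, 0.3
x = np.sign(rng.normal(size=n)) * (rng.random(n) < 0.1)   # sparse clean signal
y = x + sigma * rng.normal(size=n)                        # noisy observation

def denoise(z, lam):
    """Black-box denoiser: soft-thresholding with parameter lam."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def mc_sure(z, lam, eps=1e-4):
    fz = denoise(z, lam)
    b = rng.normal(size=z.size)                           # random probe
    div = b @ (denoise(z + eps * b, lam) - fz) / eps      # MC divergence
    return np.mean((z - fz) ** 2) - sigma**2 + 2 * sigma**2 * div / z.size

for lam in [0.1, 0.3, 0.5, 0.7]:
    true_mse = np.mean((denoise(y, lam) - x) ** 2)
    print(f"lam={lam:.1f}  MC-SURE={mc_sure(y, lam):.5f}  true MSE={true_mse:.5f}")
# MC-SURE tracks the true MSE without access to the clean signal x, which is
# what enables black-box tuning of the regularization parameter.
```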

Article Statistics & Probability

Bayesian measures of model complexity and fit

David J. Spiegelhalter et al.

JOURNAL OF THE ROYAL STATISTICAL SOCIETY SERIES B-STATISTICAL METHODOLOGY (2002)
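
Spiegelhalter et al. define the effective number of parameters p_D = D̄ − D(θ̄) and DIC = D̄ + p_D from the deviance D(θ) = −2 log p(y | θ). A minimal sketch, assuming a toy Gaussian-mean model with known variance and a flat prior (both illustrative choices), computes these quantities from posterior draws:

```python
# Minimal sketch of the quantities in Spiegelhalter et al. (2002):
# p_D = Dbar - D(theta_bar) and DIC = Dbar + p_D, for a toy N(mu, 1) model.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
y = rng.normal(loc=1.0, scale=1.0, size=30)               # observed data

# Conjugate posterior for mu under a flat prior: N(ybar, 1/n)
post = stats.norm(loc=y.mean(), scale=1.0 / np.sqrt(y.size))
mu_draws = post.rvs(size=5000, random_state=rng)

def deviance(mu):
    """D(mu) = -2 log p(y | mu) for the N(mu, 1) likelihood."""
    return -2.0 * stats.norm(loc=mu, scale=1.0).logpdf(y).sum()

D_bar = np.mean([deviance(m) for m in mu_draws])          # posterior mean deviance
D_at_mean = deviance(mu_draws.mean())                     # deviance at posterior mean
p_D = D_bar - D_at_mean                                   # ~1 for one free parameter
print(f"p_D = {p_D:.2f}, DIC = {D_bar + p_D:.2f}")
```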