4.6 Article

SURPRISES IN HIGH-DIMENSIONAL RIDGELESS LEAST SQUARES INTERPOLATION

Related references

Note: only a selection of the references is listed here; consult the original article for the complete bibliography.
Article Mechanics

Generalisation error in learning with random features and the hidden manifold model*

Federica Gerace et al.

Summary: In this study, we focus on generalised linear regression and classification for a synthetically generated dataset, presenting closed-form expressions for the asymptotic generalisation performance obtained with the replica method from statistical physics. We highlight the double-descent behaviour in logistic regression and the advantage of orthogonal projections when learning with random features, and we examine the role of correlations in data generated by the hidden manifold model. This theoretical formalism not only addresses these specific problems but also opens a pathway to more complex tasks.

JOURNAL OF STATISTICAL MECHANICS-THEORY AND EXPERIMENT (2021)

Article Computer Science, Information Systems

Consistent Risk Estimation in Moderately High-Dimensional Linear Regression

Ji Xu et al.

Summary: This paper studies the problem of risk estimation under the moderately high-dimensional asymptotic setting and proves the consistency of three risk estimators that have been successful in numerical studies.

IEEE TRANSACTIONS ON INFORMATION THEORY (2021)

Article Statistics & Probability

THE DISTRIBUTION OF THE LASSO: UNIFORM CONTROL OVER SPARSE BALLS AND ADAPTIVE PARAMETER TUNING

Leo Miolane et al.

Summary: The Lasso is a popular regression method for high-dimensional problems, with statistical properties closely related to soft-thresholding denoisers. The authors characterize the distribution of the Lasso estimate uniformly over sparse balls under Gaussian noise, which in turn allows the performance of data-driven tuning procedures to be evaluated.

ANNALS OF STATISTICS (2021)

Article Mathematics

Deep learning: a statistical viewpoint

Peter L. Bartlett et al.

Summary: The practical success of deep learning has produced theoretical surprises: overparametrization admits interpolating solutions, gradient methods impose implicit regularization, and overfitting can be benign. By examining simple settings, particularly regression problems with quadratic loss, the authors illustrate the principles underlying these phenomena and provide insight into the behavior of deep learning methods.

ACTA NUMERICA (2021)

Article Mechanics

Scaling description of generalization with number of parameters in deep learning

Mario Geiger et al.

JOURNAL OF STATISTICAL MECHANICS-THEORY AND EXPERIMENT (2020)

Article Multidisciplinary Sciences

Benign overfitting in linear regression

Peter L. Bartlett et al.

PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA (2020)
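For context, the object studied in this entry and in the surrounding works is the minimum-norm least squares (ridgeless) interpolator. This is the standard definition, not a quotation from the cited paper:

\[
\hat\beta \;=\; \arg\min_{b}\bigl\{\|b\|_2 \,:\, Xb = y\bigr\}
\;=\; (X^\top X)^{+} X^\top y
\;=\; \lim_{\lambda \to 0^+} (X^\top X + \lambda I)^{-1} X^\top y,
\]

where \((\cdot)^{+}\) denotes the Moore-Penrose pseudoinverse. When \(p \ge n\) and \(X\) has full row rank, \(\hat\beta\) fits the training data exactly, which is the sense in which overfitting can be "benign."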

Article Statistics & Probability

JUST INTERPOLATE: KERNEL RIDGELESS REGRESSION CAN GENERALIZE

Tengyuan Liang et al.

ANNALS OF STATISTICS (2020)

Article Mechanics

Wide neural networks of any depth evolve as linear models under gradient descent*

Jaehoon Lee et al.

JOURNAL OF STATISTICAL MECHANICS-THEORY AND EXPERIMENT (2020)

Article Mathematics, Applied

Two Models of Double Descent for Weak Features

Mikhail Belkin et al.

SIAM JOURNAL ON MATHEMATICS OF DATA SCIENCE (2020)

Article Mathematics, Applied

MEAN FIELD ANALYSIS OF NEURAL NETWORKS: A LAW OF LARGE NUMBERS

Justin Sirignano et al.

SIAM JOURNAL ON APPLIED MATHEMATICS (2020)

Article Statistics & Probability

The spectral norm of random inner-product kernel matrices

Zhou Fan et al.

PROBABILITY THEORY AND RELATED FIELDS (2019)

Article Multidisciplinary Sciences

Reconciling modern machine-learning practice and the classical bias-variance trade-off

Mikhail Belkin et al.

PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA (2019)

Article Physics, Multidisciplinary

A jamming transition from under- to over-parametrization affects generalization in deep learning

S. Spigler et al.

JOURNAL OF PHYSICS A-MATHEMATICAL AND THEORETICAL (2019)

Article Statistics & Probability

HIGH-DIMENSIONAL ASYMPTOTICS OF PREDICTION: RIDGE REGRESSION AND CLASSIFICATION

Edgar Dobriban et al.

ANNALS OF STATISTICS (2018)

Article Multidisciplinary Sciences

A mean field view of the landscape of two-layer neural networks

Song Mei et al.

PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA (2018)

Article Statistics & Probability

Anisotropic local laws for random matrices

Antti Knowles et al.

PROBABILITY THEORY AND RELATED FIELDS (2017)

Article Physics, Mathematical

THE SPECTRUM OF RANDOM INNER-PRODUCT KERNEL MATRICES

Xiuyuan Cheng et al.

RANDOM MATRICES-THEORY AND APPLICATIONS (2013)

Article Statistics & Probability

Eigenvectors of some large sample covariance matrix ensembles

Olivier Ledoit et al.

PROBABILITY THEORY AND RELATED FIELDS (2011)

Article Statistics & Probability

Spectral convergence for a general class of random matrices

Francisco Rubio et al.

STATISTICS & PROBABILITY LETTERS (2011)

Article Statistics & Probability

THE SPECTRUM OF KERNEL RANDOM MATRICES

Noureddine El Karoui

ANNALS OF STATISTICS (2010)