Journal
IEEE TRANSACTIONS ON INFORMATION THEORY
Volume 67, Issue 4, Pages 2539-2553
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TIT.2021.3050469
Keywords
Goodness-of-fit test; manifolds; nested model selection; sequential test
Funding
- National Science Foundation (NSF) [1633196]
- NSF [CCF-1442635, DMS-1938106, DMS-1830210]
- NSF CAREER Award [CCF-1650913]
The research develops a general theory for the goodness-of-fit test to non-linear models, where the residual of the model fit follows a chi-squared distribution related to the model order and problem dimension. A sequential method for selecting model orders is presented, demonstrating broad applications in machine learning and signal processing.
We develop a general theory for the goodness-of-fit test to non-linear models. In particular, we assume that the observations are noisy samples of a submanifold defined by a sufficiently smooth non-linear map. The observation noise is additive Gaussian. Our main result shows that the residual of the model fit, obtained by solving a non-linear least-squares problem, follows a (possibly noncentral) chi-squared distribution. The parameters of the chi-squared distribution are related to the model order and the dimension of the problem. We further present a method to select the model orders sequentially. We demonstrate the broad application of the general theory in machine learning and signal processing, including determining the rank of low-rank (possibly complex-valued) matrices and tensors from noisy, partial, or indirect observations, determining the number of sources in signal demixing, and potential applications in determining the number of hidden nodes in neural networks.
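As a rough illustration of the sequential order-selection idea described in the abstract, the sketch below tests increasing ranks of a noisy low-rank matrix and stops at the first rank whose normalized least-squares residual is consistent with a chi-squared distribution. This is a hypothetical sketch, not the authors' algorithm: it assumes the noise level sigma is known, uses the standard count of r*(m+n-r) free parameters for a rank-r matrix to set the degrees of freedom, and uses a central chi-squared threshold.

```python
# Hypothetical sketch (not the paper's code): sequential rank selection for a
# noisy low-rank matrix via a chi-squared test on the least-squares residual.
# Assumes the noise standard deviation sigma is known.
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(0)
m = n = 30
true_rank, sigma = 3, 0.1

# Noisy observation of a rank-3 matrix
A = rng.standard_normal((m, true_rank)) @ rng.standard_normal((true_rank, n))
Y = A + sigma * rng.standard_normal((m, n))

def rank_r_residual(Y, r):
    """Squared Frobenius residual of the best rank-r fit (truncated SVD)."""
    s = np.linalg.svd(Y, compute_uv=False)
    return float(np.sum(s[r:] ** 2))

alpha = 0.05
selected = None
for r in range(1, min(m, n)):
    # A rank-r model has r*(m+n-r) free parameters; under the null hypothesis
    # that rank r suffices, the normalized residual is approximately
    # chi-squared with the remaining degrees of freedom.
    dof = m * n - r * (m + n - r)
    stat = rank_r_residual(Y, r) / sigma**2
    if stat <= chi2.ppf(1 - alpha, dof):
        selected = r  # first order whose residual is consistent with noise
        break
print("selected rank:", selected)
```

For ranks below the true one, the uncaptured signal energy inflates the residual far beyond the chi-squared quantile, so the test rejects; at the true rank the residual is pure noise and the test accepts.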
Authors