Journal
JOURNAL OF MULTIVARIATE ANALYSIS
Volume 101, Issue 7, Pages 1574-1593
Publisher
ELSEVIER INC
DOI: 10.1016/j.jmva.2010.02.009
Keywords
A diverging number of parameters; Exponential family; Hemodynamic response function; Loss function; Optimal Bayes rule
Funding
- National Science Foundation
Stochastic modeling for large-scale datasets usually involves a varying-dimensional model space. This paper investigates the asymptotic properties of minimum-BD estimators and classifiers when the number of parameters grows with the available sample size, under a broad and important class of Bregman divergence (BD) that encompasses nearly all of the loss functions commonly used in regression analysis, classification procedures, and the machine learning literature. Unlike maximum likelihood estimators, which require the joint likelihood of the observations, minimum-BD estimators are useful for a range of models in which the joint likelihood is unavailable or incomplete. Statistical inference tools developed for this class of large-dimensional minimum-BD estimators and related classifiers are evaluated via simulation studies and illustrated by analysis of a real dataset. (C) 2010 Elsevier Inc. All rights reserved.
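The abstract notes that the Bregman divergence class covers most common loss functions. As a minimal illustrative sketch (not the paper's estimation procedure), a Bregman divergence is generated by a convex function phi via D_phi(x, y) = phi(x) - phi(y) - phi'(y)(x - y); choosing phi(t) = t^2 recovers the squared-error loss, and phi(t) = t log t yields a generalized Kullback-Leibler term:

```python
import math

def bregman_divergence(phi, dphi, x, y):
    """D_phi(x, y) = phi(x) - phi(y) - dphi(y) * (x - y),
    where phi is convex and dphi is its derivative."""
    return phi(x) - phi(y) - dphi(y) * (x - y)

# phi(t) = t^2 recovers squared-error loss: D(x, y) = (x - y)^2
sq = bregman_divergence(lambda t: t * t, lambda t: 2 * t, 3.0, 1.0)
# sq == (3.0 - 1.0) ** 2 == 4.0

# phi(t) = t*log(t) on t > 0 gives the generalized KL term
# x*log(x/y) - x + y
kl = bregman_divergence(lambda t: t * math.log(t),
                        lambda t: math.log(t) + 1.0, 2.0, 1.0)
# kl == 2*log(2) - 2 + 1
```

The function names here are hypothetical; the point is only that one generator `phi` parameterizes a whole family of losses, which is what lets a single minimum-BD framework cover many estimation and classification criteria at once.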