Article

Statistical inference of minimum BD estimators and classifiers for varying-dimensional models

Journal

JOURNAL OF MULTIVARIATE ANALYSIS
Volume 101, Issue 7, Pages 1574-1593

Publisher

ELSEVIER INC
DOI: 10.1016/j.jmva.2010.02.009

Keywords

A diverging number of parameters; Exponential family; Hemodynamic response function; Loss function; Optimal Bayes rule

Funding

  1. National Science Foundation


Stochastic modeling for large-scale datasets usually involves a varying-dimensional model space. This paper investigates the asymptotic properties of minimum-BD estimators and classifiers, when the number of parameters grows with the available sample size, under a broad and important class of Bregman divergence (BD), which encompasses nearly all of the loss functions commonly used in regression analysis, classification procedures and the machine learning literature. Unlike maximum likelihood estimators, which require the joint likelihood of the observations, minimum-BD estimators are useful for a range of models where the joint likelihood is unavailable or incomplete. Statistical inference tools developed for the class of large-dimensional minimum-BD estimators and related classifiers are evaluated via simulation studies, and are illustrated by analysis of a real dataset. (C) 2010 Elsevier Inc. All rights reserved.
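To make the abstract's claim concrete, a Bregman divergence generated by a convex function φ is defined as D_φ(y, μ) = φ(y) − φ(μ) − φ′(μ)(y − μ); choosing φ(u) = u² recovers squared-error loss, while the Bernoulli entropy recovers the deviance loss of logistic regression. The sketch below is illustrative only and is not taken from the paper; the function names are hypothetical.

```python
def bregman(phi, dphi, y, mu):
    """Bregman divergence D_phi(y, mu) = phi(y) - phi(mu) - phi'(mu) * (y - mu),
    induced by a convex generating function phi with derivative dphi."""
    return phi(y) - phi(mu) - dphi(mu) * (y - mu)

# Squared-error loss: phi(u) = u^2, phi'(u) = 2u.
# D(3, 1) = 9 - 1 - 2*(3 - 1) = 4 = (3 - 1)^2, i.e. the squared residual.
sq_loss = bregman(lambda u: u * u, lambda u: 2 * u, 3.0, 1.0)
print(sq_loss)  # 4.0
```

This is why BD unifies many common losses: each choice of the convex generator φ yields a different member of the class, and the minimum-BD estimator is obtained by minimizing the resulting empirical divergence over the parameter space.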
