Article

The quantitative evaluation of functional neuroimaging experiments: Mutual information learning curves

Journal

NEUROIMAGE
Volume 15, Issue 4, Pages 772-786

Publisher

ACADEMIC PRESS INC ELSEVIER SCIENCE
DOI: 10.1006/nimg.2001.1033

Keywords

learning curve; multisubject PET and fMRI studies; macroscopic and microscopic models; generalization error; prediction error; mutual information; cross-validation; sensitivity map

Abstract

Learning curves are presented as an unbiased means for evaluating the performance of models for neuroimaging data analysis. The learning curve measures the predictive performance in terms of the generalization or prediction error as a function of the number of independent examples (e.g., subjects) used to determine the parameters in the model. Cross-validation resampling is used to obtain unbiased estimates of the generalization error of a generic multivariate Gaussian classifier for training set sizes from 2 to 16 subjects. We apply the framework to four different activation experiments, in this case [O-15]water data sets, although the framework is equally valid for multisubject fMRI studies. We demonstrate how the prediction error can be expressed as the mutual information between the scan and the scan label, measured in units of bits. The mutual information learning curve can be used to evaluate the impact of different methodological choices, e.g., classification label schemes or preprocessing choices. Another application of the learning curve is to examine model performance using bias/variance considerations, enabling the researcher to determine whether the model performance is limited by statistical bias or by variance. We furthermore present the sensitivity map as a general method for extracting activation maps from statistical models within the probabilistic framework, and we illustrate relationships between mutual information and pattern reproducibility as derived in the NPAIRS framework described in a companion paper. (C) 2002 Elsevier Science (USA).
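
The procedure the abstract describes (split subjects into training and test sets, fit a generic multivariate Gaussian classifier, and report the test-set prediction error as mutual information in bits, I = H(label) - H(label | scan)) can be sketched briefly. The following is a minimal illustration on simulated data, not the authors' implementation; the data generator, the pooled-covariance classifier, and helper names such as simulate_subject and mutual_info_bits are assumptions made for this example.

```python
# Minimal sketch (not the authors' code): cross-validated learning curve for a
# pooled-covariance Gaussian classifier, with prediction error reported as
# mutual information between scan and scan label, in bits.  All data are
# simulated; all helper names are illustrative.
import numpy as np

rng = np.random.default_rng(0)
N_SUBJECTS, SCANS_PER_COND, N_VOX = 16, 4, 20

def simulate_subject(rng):
    """Return (scans, labels) for one subject: two conditions, small mean shift."""
    shift = np.zeros(N_VOX)
    shift[:5] = 0.5                                   # a few "activated" voxels
    x0 = rng.normal(0.0, 1.0, (SCANS_PER_COND, N_VOX))
    x1 = rng.normal(0.0, 1.0, (SCANS_PER_COND, N_VOX)) + shift
    labels = np.r_[np.zeros(SCANS_PER_COND), np.ones(SCANS_PER_COND)].astype(int)
    return np.vstack([x0, x1]), labels

subjects = [simulate_subject(rng) for _ in range(N_SUBJECTS)]

def fit_gaussian_classifier(X, y):
    """Class means plus pooled (ridge-regularized) covariance."""
    means = np.array([X[y == c].mean(axis=0) for c in (0, 1)])
    resid = np.vstack([X[y == c] - means[c] for c in (0, 1)])
    cov = np.cov(resid, rowvar=False) + 1e-3 * np.eye(N_VOX)
    return means, np.linalg.inv(cov)

def label_posteriors(X, means, precision):
    """p(label | scan) under equal priors, from Gaussian discriminant scores."""
    scores = np.stack(
        [-0.5 * np.einsum('ij,jk,ik->i', X - m, precision, X - m) for m in means],
        axis=1)
    scores -= scores.max(axis=1, keepdims=True)       # numerically stable softmax
    p = np.exp(scores)
    return p / p.sum(axis=1, keepdims=True)

def mutual_info_bits(p_test, y_test):
    """I(scan; label) estimate = prior label entropy - test cross-entropy (bits)."""
    ce = -np.mean(np.log2(p_test[np.arange(len(y_test)), y_test] + 1e-12))
    return 1.0 - ce                                   # H(label) = 1 bit for 2 classes

REPS = 50
for n_train in range(2, N_SUBJECTS):                  # training-set sizes 2..15
    mi = []
    for _ in range(REPS):                             # cross-validation resampling
        order = rng.permutation(N_SUBJECTS)
        train, test = order[:n_train], order[n_train:]
        Xtr = np.vstack([subjects[i][0] for i in train])
        ytr = np.concatenate([subjects[i][1] for i in train])
        Xte = np.vstack([subjects[i][0] for i in test])
        yte = np.concatenate([subjects[i][1] for i in test])
        means, precision = fit_gaussian_classifier(Xtr, ytr)
        mi.append(mutual_info_bits(label_posteriors(Xte, means, precision), yte))
    print(f"{n_train:2d} training subjects: I ≈ {np.mean(mi):+.3f} bits")
```

For a two-class, equiprobable design the mutual information is bounded by 1 bit, so a curve of this kind should rise toward that ceiling as subjects are added; under the bias/variance reading in the abstract, a curve that plateaus well below the ceiling points to model bias, while one still climbing at the largest training-set size points to variance from too few training examples.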
