Article

How much baseline correction do we need in ERP research? Extended GLM model can replace baseline correction while lifting its limits

Journal

PSYCHOPHYSIOLOGY
Volume 56, Issue 12

Publisher

WILEY
DOI: 10.1111/psyp.13451

Keywords

analysis; statistical methods; EEG; ERPs; oscillation; time frequency analyses


Baseline correction plays an important role in past and current methodological debates in ERP research (e.g., the Tanner vs. Maess debate in the Journal of Neuroscience Methods), serving as a potential alternative to strong high-pass filtering. However, the very assumptions that underlie traditional baseline correction also undermine it, implying a reduction in the signal-to-noise ratio. In other words, traditional baseline correction is statistically unnecessary and even undesirable. Including the baseline interval as a predictor in a GLM-based statistical approach allows the data to determine how much baseline correction is needed, including both full traditional baseline correction and no baseline correction as special cases. This reduces the amount of variance in the residual error term and thus has the potential to increase statistical power.
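The core idea of the abstract can be illustrated with a minimal simulation (this sketch and all its parameter values are illustrative assumptions, not taken from the paper): traditional baseline correction subtracts the baseline with an implicit regression weight fixed at 1, whereas the GLM approach enters the baseline interval as a free predictor and lets the data estimate that weight, with 0 (no correction) and 1 (full traditional correction) as special cases.

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials = 200

# Hypothetical single-trial data: a slow drift contaminates both the
# pre-stimulus baseline and the post-stimulus ERP measurement window.
drift = rng.normal(0.0, 2.0, n_trials)            # shared slow noise
condition = rng.integers(0, 2, n_trials).astype(float)  # 0/1 condition
baseline = drift + rng.normal(0.0, 1.0, n_trials)       # baseline interval mean
erp = 1.5 * condition + drift + rng.normal(0.0, 1.0, n_trials)  # true effect = 1.5

# Traditional baseline correction: forces the baseline weight to exactly 1,
# which also subtracts the baseline's own measurement noise into the signal.
corrected = erp - baseline

# GLM approach: intercept + condition + baseline as a free predictor.
# The estimated baseline weight can land anywhere between 0 and 1,
# as the data demand, shrinking the residual error term.
X = np.column_stack([np.ones(n_trials), condition, baseline])
beta, *_ = np.linalg.lstsq(X, erp, rcond=None)
print(beta)  # [intercept, condition effect, estimated baseline weight]
```

With these simulated noise variances the estimated baseline weight falls below 1, reflecting that the baseline interval is itself a noisy measurement of the drift; fixing the weight at 1, as traditional correction does, is the special case the GLM only adopts when the data support it.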

