Article

A combined brain-computer interface based on P300 potentials and motion-onset visual evoked potentials

Journal

JOURNAL OF NEUROSCIENCE METHODS
Volume 205, Issue 2, Pages 265-276

Publisher

ELSEVIER
DOI: 10.1016/j.jneumeth.2012.01.004

Keywords

Event-related potential; P300 potential; Motion-onset visual evoked potentials; Brain-computer interface; Changing stimuli; Adaptive

Funding

  1. National Natural Science Foundation of China [61074113]
  2. Shanghai Leading Academic Discipline Project [B504]
  3. European Commission [ICT-2010-247447]
  4. Fundamental Research Funds for the Central Universities [WH1114038, WH0914028]


Brain-computer interfaces (BCIs) allow users to communicate via brain activity alone. Many BCIs rely on the P300 and other event-related potentials (ERPs) that are elicited when target stimuli flash. Although there has been considerable research exploring ways to improve P300 BCIs, surprisingly little work has focused on new ways of changing visual stimuli to elicit more recognizable ERPs. In this paper, we introduce a combined BCI based on P300 potentials and motion-onset visual evoked potentials (M-VEPs) and compare it with BCIs based on each single approach (P300 and M-VEP). Offline data suggested that performance would be best in the combined paradigm. Online tests with adaptive BCIs confirmed that our combined approach is practical in an online BCI and yielded better performance than the other two approaches (P < 0.05) without annoying or overburdening the subject. The highest mean classification accuracy (96%) and practical bit rate (26.7 bit/s) were obtained in the combined condition. © 2012 Elsevier B.V. All rights reserved.

