Article

Fast and accurate decoding of finger movements from ECoG through Riemannian features and modern machine learning techniques

Journal

JOURNAL OF NEURAL ENGINEERING
Volume 19, Issue 1, Pages -

Publisher

IOP Publishing Ltd
DOI: 10.1088/1741-2552/ac4ed1

Keywords

electrocorticography (ECoG); motor decoding; brain-machine interface (BMI); machine learning; Riemannian geometry; feature extraction

Funding

  1. National Key R&D Program of China [2018YFA0701400]
  2. Key R&D Program of Zhejiang [2022C03011]
  3. Chuanqi Research and Development Center of Zhejiang University
  4. Starry Night Science Fund of Zhejiang University Shanghai Institute for Advanced Study [SN-ZJU-SIAS-002]
  5. Fundamental Research Funds for the Central Universities [2021KYY600403-0001]
  6. Research Project of State Key Laboratory of Mechanical System and Vibration [MSV202115]
  7. EPFL
  8. Cornell University


This work introduces the use of Riemannian-space features and the temporal dynamics of electrocorticography (ECoG) signals, combined with modern machine learning tools, to improve the decoding accuracy of individual finger movements. By selecting informative biomarkers and exploring the temporal concatenation of features, the proposed method achieved significant improvements in both classification and regression tasks. The results show that the approach outperformed previous methods in detecting individual finger movements and in continuous decoding of movement trajectories, with low time complexity.
Objective. Accurate decoding of individual finger movements is crucial for advanced prosthetic control. In this work, we introduce the use of Riemannian-space features and the temporal dynamics of the electrocorticography (ECoG) signal, combined with modern machine learning (ML) tools, to improve motor decoding accuracy at the level of individual fingers.

Approach. We selected a set of informative biomarkers that correlated with finger movements and evaluated the performance of state-of-the-art ML algorithms on the brain-computer interface (BCI) competition IV dataset (ECoG, three subjects) and a second ECoG dataset with a similar recording paradigm (Stanford, nine subjects). We further explored the temporal concatenation of features to effectively capture the history of the ECoG signal, which led to a significant improvement over single-epoch decoding in both classification (p < 0.01) and regression tasks (p < 0.01).

Main results. Using feature concatenation and gradient boosted trees (the top-performing model), we achieved a classification accuracy of 77.0% in detecting individual finger movements (six-class task, including the rest state), improving over the state-of-the-art conditional random fields by 11.7% on the three BCI competition subjects. In continuous decoding of movement trajectory, our approach yielded an average Pearson's correlation coefficient (r) of 0.537 across subjects and fingers, outperforming both the BCI competition winner and the state-of-the-art approach reported on the same dataset (CNN + LSTM). Furthermore, the proposed method has low time complexity, requiring under 17.2 s for training and under 50 ms for inference. This enables roughly a 250× speed-up in training compared to the previously reported deep learning method with state-of-the-art performance.

Significance. The proposed techniques enable fast, reliable, and high-performance prosthetic control through minimally invasive cortical signals.
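The pipeline the abstract describes (Riemannian-space features per epoch, temporal concatenation across epochs, gradient boosted trees as the classifier) can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: it uses log-Euclidean matrix-logarithm features as a simple stand-in for the paper's Riemannian features, synthetic two-class data in place of ECoG recordings, and hypothetical helper names (`riemannian_features`, `concat_history`).

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

def riemannian_features(epochs):
    """Map each epoch (channels x samples) to a log-Euclidean feature
    vector: the upper triangle of logm(covariance).  This is a common
    Riemannian-style feature, used here only as an illustrative proxy."""
    feats = []
    for x in epochs:
        c = np.cov(x) + 1e-6 * np.eye(x.shape[0])  # regularized SPD covariance
        w, v = np.linalg.eigh(c)
        log_c = (v * np.log(w)) @ v.T              # matrix logarithm via eigh
        iu = np.triu_indices(c.shape[0])
        feats.append(log_c[iu])
    return np.array(feats)

def concat_history(feats, k=3):
    """Temporal concatenation: stack each epoch's features with those of
    the k-1 preceding epochs (zero-padded at the sequence start)."""
    n, d = feats.shape
    padded = np.vstack([np.zeros((k - 1, d)), feats])
    return np.hstack([padded[i:i + n] for i in range(k)])

# Synthetic demo: two classes that differ in channel covariance scale.
rng = np.random.default_rng(0)
def make_epochs(n, scale):
    return [scale * rng.standard_normal((4, 100)) for _ in range(n)]

X = riemannian_features(make_epochs(30, 1.0) + make_epochs(30, 2.0))
y = np.array([0] * 30 + [1] * 30)
Xc = concat_history(X, k=3)                        # 3-epoch feature history
clf = GradientBoostingClassifier(n_estimators=50).fit(Xc, y)
print(Xc.shape, clf.score(Xc, y))
```

The key design point mirrored here is that concatenating the current epoch's features with a short history gives the classifier access to signal dynamics without recurrent architecture, which is part of why training stays fast relative to deep models.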
