IncepA-EEGNet: P300 signal detection method based on fusion of Inception network and attention mechanism
Meng XU1, Dan WANG1,*, Zhi-yuan LI1, Yuan-fang CHEN2
1. Faculty of Information Technology, Beijing University of Technology, Beijing 100124, China
2. Beijing Institute of Machinery and Equipment, Beijing 100039, China
Abstract A novel EEGNet variant, called IncepA-EEGNet, was proposed by fusing Inception and attention-mechanism modules in order to achieve more efficient P300 feature extraction. Convolutional layers with different receptive fields were connected in parallel to enhance the network's ability to extract and represent EEG features. An attention mechanism was then introduced to assign weights to the features of different filters and to extract the important information in the P300 signal. The model was validated on the two subjects of BCI Competition III dataset II. Compared with other deep learning models, IncepA-EEGNet reached an average character recognition accuracy of 75.5% after only 5 epochs, and its information transfer rate on subject B reached 33.44 bits/min after 3 epochs. These experimental results demonstrate that IncepA-EEGNet effectively improves the recognition accuracy of the P300 signal, reduces the time spent on repeated trials, and enhances the practicality of the P300 speller.
Received: 30 July 2021
Published: 24 April 2022
Fund: National Natural Science Foundation of China (No. 61672505)
Corresponding Author: Dan WANG
E-mail: xumeng@emails.bjut.edu.cn; wangdan@bjut.edu.cn
Keywords: attention mechanism, Inception network, EEGNet, P300 detection, character spelling
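To make the architecture described in the abstract above concrete, the following is a minimal sketch, assuming PyTorch, of an IncepA-EEGNet-like model: an EEGNet-style temporal/spatial front end, Inception-style parallel convolutions with different receptive fields, and a squeeze-and-excitation style attention block that re-weights the filter outputs before classification. The class names, kernel sizes, channel counts, and classifier head are illustrative assumptions and do not reproduce the paper's exact configuration.

```python
# Minimal sketch (PyTorch assumed); all layer sizes are illustrative, not the paper's.
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """SE-style attention: learns one weight per feature map (filter)."""

    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)      # squeeze: global average per channel
        self.fc = nn.Sequential(                 # excitation: bottleneck MLP + sigmoid
            nn.Linear(channels, channels // reduction),
            nn.ELU(),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w                             # re-weight each filter's feature map


class IncepAEEGNetSketch(nn.Module):
    """Hypothetical IncepA-EEGNet-like model for inputs of shape
    (batch, 1, n_channels, n_samples), e.g. 64 electrodes x 240 samples."""

    def __init__(self, n_channels: int = 64, n_classes: int = 2):
        super().__init__()
        # EEGNet-style front end: temporal conv, then depthwise spatial conv.
        self.frontend = nn.Sequential(
            nn.Conv2d(1, 8, (1, 32), padding=(0, 16), bias=False),
            nn.BatchNorm2d(8),
            nn.Conv2d(8, 16, (n_channels, 1), groups=8, bias=False),  # spatial filters
            nn.BatchNorm2d(16),
            nn.ELU(),
            nn.AvgPool2d((1, 4)),
        )
        # Inception-style parallel branches with different temporal receptive fields.
        self.branches = nn.ModuleList([
            nn.Sequential(nn.Conv2d(16, 8, (1, k), padding=(0, k // 2), bias=False),
                          nn.BatchNorm2d(8), nn.ELU())
            for k in (4, 8, 16)                  # illustrative kernel lengths
        ])
        # Attention assigns weights to the concatenated filter outputs.
        self.attention = ChannelAttention(3 * 8)
        self.classifier = nn.Sequential(
            nn.AdaptiveAvgPool2d((1, 8)),
            nn.Flatten(),
            nn.Linear(3 * 8 * 8, n_classes),     # P300 vs. non-P300
        )

    def forward(self, x):
        x = self.frontend(x)
        x = torch.cat([b(x) for b in self.branches], dim=1)  # fuse parallel features
        x = self.attention(x)
        return self.classifier(x)


if __name__ == "__main__":
    model = IncepAEEGNetSketch()
    eeg = torch.randn(2, 1, 64, 240)             # two fake single-trial EEG epochs
    print(model(eeg).shape)                      # -> torch.Size([2, 2])
```

In this sketch the attention block follows the concatenation of the parallel branches, so the learned channel weights act across all receptive-field scales at once; the paper's actual placement and hyperparameters may differ.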