STM Article Repository

Jiang, Lei and Siriaraya, Panote and Choi, Dongeun and Zeng, Fangmeng and Kuwahara, Noriaki (2022) Electroencephalogram signals emotion recognition based on convolutional neural network-recurrent neural network framework with channel-temporal attention mechanism for older adults. Frontiers in Aging Neuroscience, 14. ISSN 1663-4365

pubmed-zip/versions/1/package-entries/fnagi-14-945024/fnagi-14-945024.pdf - Published Version


Abstract

Reminiscence and conversation between older adults and younger volunteers using past photographs are very effective in improving the emotional state of older adults and alleviating depression. However, the emotional state of the older adult needs to be evaluated while they converse about the past photographs. Although the electroencephalogram (EEG) has a significantly stronger association with emotion than other physiological signals, the challenge is to eliminate muscle artifacts in the EEG during speech and to reduce the number of dry electrodes to improve user comfort while maintaining high emotion recognition accuracy. We therefore propose the CTA-CNN-Bi-LSTM emotion recognition framework. EEG signals from eight channels (P3, P4, F3, F4, F7, F8, T7, and T8) were first processed with the MEMD-CCA method on three brain regions separately (frontal, temporal, parietal) to remove muscle artifacts, and then fed into a channel-temporal attention module to obtain the weights of the channels and temporal points most relevant to positive, negative, and neutral emotions and to recode the EEG data. A convolutional neural network (CNN) module then extracted the spatial information in the recoded EEG data to obtain spatial feature maps, which were sequentially input into a Bi-LSTM module to learn bi-directional temporal information for emotion recognition. Finally, we designed four groups of experiments to demonstrate that the proposed CTA-CNN-Bi-LSTM framework outperforms previous works; the highest average recognition accuracy for the positive, negative, and neutral emotions reached 98.75%.
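The recoding step described above (weighting channels and time points by their relevance before the CNN) can be sketched roughly as follows. This is a minimal NumPy illustration only: it derives channel and temporal scores from mean absolute amplitude via a softmax, whereas the framework in the paper learns the attention weights jointly with the network, and the function and variable names here are invented for this sketch.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

def cta_recode(eeg):
    """Illustrative channel-temporal attention recoding.

    eeg: array of shape (channels, timepoints), e.g. the 8 channels
    (P3, P4, F3, F4, F7, F8, T7, T8) used in the paper.

    Here the per-channel and per-timepoint scores are simply softmax
    weights of mean absolute amplitude (an assumption for this sketch);
    the recoded signal is the input scaled by both sets of weights.
    """
    ch_scores = softmax(np.abs(eeg).mean(axis=1))  # shape (channels,)
    t_scores = softmax(np.abs(eeg).mean(axis=0))   # shape (timepoints,)
    return eeg * ch_scores[:, None] * t_scores[None, :]

# Toy example: 8 channels x 16 time points of synthetic EEG.
rng = np.random.default_rng(0)
x = rng.standard_normal((8, 16))
y = cta_recode(x)
print(y.shape)  # (8, 16)
```

The recoded array keeps the original (channels, timepoints) layout, so it can be passed directly to a downstream convolutional feature extractor as in the CNN-Bi-LSTM pipeline.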

Item Type: Article
Subjects: GO for ARCHIVE > Medical Science
Depositing User: Unnamed user with email support@goforarchive.com
Date Deposited: 10 Oct 2023 05:44
Last Modified: 10 Oct 2023 05:44
URI: http://eprints.go4mailburst.com/id/eprint/1053
