
Multi-Channel EEG Emotion Recognition Based on Parallel Transformer and 3D-Convolutional Neural Network

Authors

  • Jie Sun

    (School of Information and Control Engineering, Qingdao University of Technology, Qingdao 266033, China)

  • Xuan Wang

    (School of Information and Control Engineering, Qingdao University of Technology, Qingdao 266033, China)

  • Kun Zhao

    (School of Information and Control Engineering, Qingdao University of Technology, Qingdao 266033, China)

  • Siyuan Hao

    (School of Information and Control Engineering, Qingdao University of Technology, Qingdao 266033, China)

  • Tianyu Wang

    (School of Information and Control Engineering, Qingdao University of Technology, Qingdao 266033, China)

Abstract

Owing to its covert and real-time properties, electroencephalography (EEG) has long been a preferred signal for emotion recognition research. Current EEG-based emotion recognition exploits the temporal, spatial, or spatiotemporal characteristics of EEG data. Methods that rely solely on temporal or solely on spatial features achieve low accuracy because they ignore the other dimension. Methods that use spatiotemporal features do consider both dimensions, but they extract temporal and spatial information directly from the raw EEG without reconstructing its format, so these properties cannot be extracted efficiently. To address these issues, this paper proposes a multi-channel EEG emotion recognition model based on a parallel transformer and a three-dimensional convolutional neural network (3D-CNN). First, parallel-channel EEG sequence data and position-reconstructed EEG data are generated separately. The temporal and spatial characteristics of the EEG are then extracted by the transformer and the 3D-CNN, respectively. Finally, the features of the two parallel modules are fused to form the final representation for emotion recognition. On the DEAP, DREAMER, and SEED databases, the model achieves higher recognition accuracy than competing methods, demonstrating the effectiveness of the proposed approach.
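
The abstract describes a two-branch (parallel) architecture, but the record carries no code. As a purely illustrative sketch, the PyTorch model below wires up a transformer branch over parallel-channel sequence data and a 3D-CNN branch over position-reconstructed grid data, then fuses the two feature vectors for classification; the channel count, sequence length, 9x9 electrode grid, layer sizes, and number of classes are assumptions loosely modelled on a DEAP-style setup, not the authors' actual configuration.

```python
# Hypothetical sketch (not the authors' code): parallel transformer + 3D-CNN
# feature fusion for multi-channel EEG emotion recognition. All sizes assumed.
import torch
import torch.nn as nn

class ParallelTransformer3DCNN(nn.Module):
    def __init__(self, n_channels=32, d_model=64, n_classes=2):
        super().__init__()
        # Temporal branch: each time step's channel vector becomes one token.
        self.embed = nn.Linear(n_channels, d_model)
        enc_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=4, dim_feedforward=128, batch_first=True)
        self.transformer = nn.TransformerEncoder(enc_layer, num_layers=2)
        # Spatial branch: 3D convolutions over (time, height, width) of the
        # position-reconstructed EEG "video" (electrodes mapped to a 2D grid).
        self.cnn3d = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(2),
            nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),
        )
        self.classifier = nn.Linear(d_model + 32, n_classes)

    def forward(self, x_seq, x_grid):
        # x_seq:  (batch, seq_len, n_channels)     parallel-channel sequences
        # x_grid: (batch, 1, seq_len, grid, grid)  position-reconstructed data
        t = self.transformer(self.embed(x_seq)).mean(dim=1)  # temporal features
        s = self.cnn3d(x_grid).flatten(1)                     # spatial features
        return self.classifier(torch.cat([t, s], dim=1))      # fused prediction

# Smoke test with random tensors shaped like one (assumed) DEAP-style segment.
model = ParallelTransformer3DCNN()
logits = model(torch.randn(4, 128, 32), torch.randn(4, 1, 128, 9, 9))
print(logits.shape)  # torch.Size([4, 2])
```

The point mirrored from the abstract is structural: the two branches see differently organized views of the same EEG segment and are joined only at the feature level, where their outputs are concatenated before the final classifier.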

Suggested Citation

  • Jie Sun & Xuan Wang & Kun Zhao & Siyuan Hao & Tianyu Wang, 2022. "Multi-Channel EEG Emotion Recognition Based on Parallel Transformer and 3D-Convolutional Neural Network," Mathematics, MDPI, vol. 10(17), pages 1-15, September.
  • Handle: RePEc:gam:jmathe:v:10:y:2022:i:17:p:3131-:d:903572

    Download full text from publisher

    File URL: https://www.mdpi.com/2227-7390/10/17/3131/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2227-7390/10/17/3131/
    Download Restriction: no

    References listed on IDEAS

    1. Hao Chao & Liang Dong & Yongli Liu & Baoyun Lu, 2020. "Improved Deep Feature Learning by Synchronization Measurements for Multi-Channel EEG Emotion Recognition," Complexity, Hindawi, vol. 2020, pages 1-15, March.


      Corrections

      All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jmathe:v:10:y:2022:i:17:p:3131-:d:903572. See general information about how to correct material in RePEc.

      If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

      If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

      If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

      For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com .

      Please note that corrections may take a couple of weeks to filter through the various RePEc services.

      IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.