
SMMTM: Motor imagery EEG decoding algorithm using a hybrid multi-branch separable convolutional self-attention temporal convolutional network

Authors

Listed:
  • DianGuo Cao
  • ZhenYuan Yu
  • Jinqiang Wang
  • Yuqiang Wu

Abstract

Motor imagery (MI) is a brain-computer interface (BCI) technology with the potential to change human life in the future. MI signals have been widely applied in various BCI applications, including neurorehabilitation, smart home control, and prosthetic control. However, the limited accuracy of MI signal decoding remains a significant barrier to the broader adoption of BCI applications. In this study, we propose the SMMTM model, which combines spatiotemporal convolution (SC), multi-branch separable convolution (MSC), multi-head self-attention (MSA), a temporal convolutional network (TCN), and multimodal feature fusion (MFF). Specifically, we use the SC module to capture both temporal and spatial features, and we design an MSC module to capture temporal features at multiple scales. In addition, the MSA module extracts valuable global features with long-range dependencies, and the TCN captures higher-level temporal features. The MFF stage combines feature fusion and decision fusion over the SMMTM outputs to improve robustness. The SMMTM was evaluated on the public benchmark BCI Competition IV 2a and 2b datasets; the within-subject classification accuracies were 84.96% and 89.26%, respectively, with kappa values of 0.797 and 0.756, and the cross-subject classification accuracy on the 2a dataset was 69.21%, with a kappa value of 0.584. These results indicate that SMMTM significantly enhances decoding performance, providing a strong foundation for practical BCI implementations.
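The abstract names the SMMTM building blocks only at a high level. As an illustration, the sketch below shows, in PyTorch, how such blocks (a spatiotemporal convolution front end, multi-scale depthwise-separable temporal convolutions, multi-head self-attention, and a dilated temporal convolution stage) are commonly chained for MI-EEG decoding. All class names, channel counts, and kernel sizes here are hypothetical assumptions, not the authors' implementation, and the multimodal feature/decision fusion stage is omitted.

# Illustrative sketch only: a generic MI-EEG pipeline using the kinds of
# building blocks named in the abstract. Layer names, channel counts, and
# kernel sizes are hypothetical and are NOT taken from the SMMTM paper.
import torch
import torch.nn as nn


class MultiBranchSeparableConv(nn.Module):
    """Depthwise-separable temporal convolutions at several kernel scales."""

    def __init__(self, channels=16, kernel_sizes=(15, 31, 63)):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Sequential(
                # depthwise temporal convolution (one filter per channel)
                nn.Conv1d(channels, channels, k, padding=k // 2, groups=channels),
                # pointwise convolution mixes channels
                nn.Conv1d(channels, channels, 1),
                nn.BatchNorm1d(channels),
                nn.ELU(),
            )
            for k in kernel_sizes
        ])

    def forward(self, x):               # x: (batch, channels, time)
        return torch.cat([b(x) for b in self.branches], dim=1)


class EEGDecoderSketch(nn.Module):
    def __init__(self, eeg_channels=22, n_classes=4, embed=16):
        super().__init__()
        # spatiotemporal front end: temporal filter, then spatial filter over electrodes
        self.spatiotemporal = nn.Sequential(
            nn.Conv2d(1, embed, (1, 25), padding=(0, 12)),
            nn.Conv2d(embed, embed, (eeg_channels, 1), groups=embed),
            nn.BatchNorm2d(embed),
            nn.ELU(),
            nn.AvgPool2d((1, 8)),
        )
        self.msc = MultiBranchSeparableConv(channels=embed)
        d_model = embed * 3             # three branches concatenated
        self.attn = nn.MultiheadAttention(d_model, num_heads=4, batch_first=True)
        # dilated temporal convolutions standing in for a TCN stage
        self.tcn = nn.Sequential(
            nn.Conv1d(d_model, d_model, 3, padding=2, dilation=2), nn.ELU(),
            nn.Conv1d(d_model, d_model, 3, padding=4, dilation=4), nn.ELU(),
        )
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x):               # x: (batch, 1, electrodes, time)
        x = self.spatiotemporal(x).squeeze(2)       # -> (batch, embed, time')
        x = self.msc(x)                             # -> (batch, 3*embed, time')
        seq = x.transpose(1, 2)                     # -> (batch, time', d_model)
        attn_out, _ = self.attn(seq, seq, seq)      # global self-attention
        x = self.tcn(attn_out.transpose(1, 2))      # higher-level temporal features
        return self.head(x.mean(dim=-1))            # temporal average pooling


if __name__ == "__main__":
    model = EEGDecoderSketch()
    dummy = torch.randn(8, 1, 22, 1000)             # 8 trials, 22 electrodes, 1000 samples
    print(model(dummy).shape)                       # torch.Size([8, 4])

The dummy input roughly mirrors a single BCI Competition IV 2a trial window (22 electrodes); in the paper, per-subject training and the fusion stage would follow this backbone.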

Suggested Citation

  • DianGuo Cao & ZhenYuan Yu & Jinqiang Wang & Yuqiang Wu, 2025. "SMMTM: Motor imagery EEG decoding algorithm using a hybrid multi-branch separable convolutional self-attention temporal convolutional network," PLOS ONE, Public Library of Science, vol. 20(10), pages 1-19, October.
  • Handle: RePEc:plo:pone00:0333805
    DOI: 10.1371/journal.pone.0333805

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0333805
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0333805&type=printable
    Download Restriction: no

