
Considerate motion imagination classification method using deep learning

Authors

Listed:
  • Zhaokun Yan
  • Xiangquan Yang
  • Yu Jin

Abstract

To improve the classification accuracy of motion imagination, a considerate motion imagination classification method using deep learning is proposed. Specifically, taking a graph structure suited to electroencephalography as input, the proposed model accurately represents the distribution of electroencephalography electrodes in non-Euclidean space and fully considers the spatial correlation between electrodes. In addition, spatial-spectral-temporal multi-dimensional feature information is extracted, via a dual-branch architecture, from the spatial-temporal and spatial-spectral graph representations transformed from the original electroencephalography signal. Finally, an attention mechanism and a global feature aggregation module are designed and combined with graph convolution to adaptively capture the dynamic correlation intensity and effective features of electroencephalography signals across dimensions. A series of contrast and ablation experiments on several public brain-computer interface datasets demonstrates the excellence of the proposed method. It is worth mentioning that the proposed model is a general framework for the classification of electroencephalography signals and is also suitable for emotion recognition, sleep staging, and other electroencephalography-based research fields. Moreover, the model has the potential to be applied to real-life medical use in motion imagination rehabilitation.
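For intuition only, the sketch below shows the kind of graph-convolution step such a model builds on: electroencephalography electrodes form the graph nodes, a symmetrically normalized adjacency matrix encodes their spatial correlation, and a learnable attention vector re-weights electrodes. This is a minimal illustrative sketch, not the authors' published architecture; the electrode count, the random adjacency construction, and all identifiers (EEGGraphConv, normalized_adjacency, and so on) are assumptions made for this example.

    # Minimal sketch of graph convolution over EEG electrodes (PyTorch).
    # NOT the paper's model: electrode count, adjacency, and attention
    # design here are illustrative assumptions only.
    import torch
    import torch.nn as nn

    class EEGGraphConv(nn.Module):
        """One graph-convolution layer with electrode attention:
        H' = softmax(a) * ReLU(A_hat @ H @ W)."""
        def __init__(self, in_features: int, out_features: int, n_electrodes: int):
            super().__init__()
            self.lin = nn.Linear(in_features, out_features)  # per-node transform H @ W
            # Learnable per-electrode scores, a simple stand-in for an
            # attention mechanism over electrodes (assumption).
            self.attn = nn.Parameter(torch.zeros(n_electrodes))

        def forward(self, x: torch.Tensor, a_hat: torch.Tensor) -> torch.Tensor:
            # x: (batch, n_electrodes, in_features); a_hat: (n_electrodes, n_electrodes)
            h = torch.relu(a_hat @ self.lin(x))              # aggregate neighboring electrodes
            w = torch.softmax(self.attn, dim=0).view(1, -1, 1)
            return h * w                                      # re-weight electrodes by attention

    def normalized_adjacency(a: torch.Tensor) -> torch.Tensor:
        """Symmetric normalization: A_hat = D^{-1/2} (A + I) D^{-1/2}."""
        a = a + torch.eye(a.size(0))                          # add self-loops
        d_inv_sqrt = torch.diag(a.sum(dim=1).pow(-0.5))
        return d_inv_sqrt @ a @ d_inv_sqrt

    # Toy usage: 22 electrodes (arbitrary), 64 spectral-temporal features each.
    adj = (torch.rand(22, 22) > 0.7).float()
    adj = ((adj + adj.t()) > 0).float()                       # symmetric electrode connectivity
    layer = EEGGraphConv(64, 32, 22)
    out = layer(torch.randn(8, 22, 64), normalized_adjacency(adj))
    print(out.shape)                                          # torch.Size([8, 22, 32])

Treating the electrodes as graph nodes rather than a flat channel vector is what lets a model of this kind respect the non-Euclidean electrode layout that the abstract emphasizes; the normalization step keeps repeated aggregation from inflating feature magnitudes.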

Suggested Citation

  • Zhaokun Yan & Xiangquan Yang & Yu Jin, 2022. "Considerate motion imagination classification method using deep learning," PLOS ONE, Public Library of Science, vol. 17(10), pages 1-16, October.
  • Handle: RePEc:plo:pone00:0276526
    DOI: 10.1371/journal.pone.0276526

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0276526
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0276526&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0276526?utm_source=ideas
LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0276526. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

We have no bibliographic references for this item. You can help add them by using this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.