
Hybrid Deep-Learning Framework Based on Gaussian Fusion of Multiple Spatiotemporal Networks for Walking Gait Phase Recognition

Authors

  • Tao Zhen
  • Jian-lei Kong
  • Lei Yan

Abstract

Human gait phase detection is a significant technology for robotic exoskeleton control and exercise rehabilitation therapy. Inertial Measurement Units (IMUs) containing accelerometers and gyroscopes offer a convenient and inexpensive way to collect gait data and are often used to analyze gait dynamics in personal daily applications. However, current deep-learning methods that extract spatial features and isolated temporal features can easily miss correlations that exist in the high-dimensional space, which limits the recognition performance of any single model. In this study, an effective hybrid deep-learning framework based on Gaussian probability fusion of multiple spatiotemporal networks (GFM-Net) is proposed to detect different gait phases from multisource IMU signals. The framework first employs a gait information acquisition system to collect data from IMUs fixed on the lower limb. After data preprocessing, the framework constructs a spatial feature extractor built from AutoEncoder and CNN modules, and a multistream temporal feature extractor with three parallel branches combining RNN, LSTM, and GRU modules. Finally, a novel Gaussian probability fusion module, optimized by the Expectation-Maximization (EM) algorithm, is developed to integrate the feature maps output by the three submodels and perform gait recognition. The proposed framework nests an inner loop containing the EM algorithm inside the outer backpropagation loop, optimizing the gradients of the entire network. Experiments show that this method achieves better performance in gait classification, with accuracy exceeding 96.7%.
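The EM-optimized Gaussian fusion step described in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: it assumes, as one plausible reading, that fusion amounts to learning mixture weights over the three submodels' predictive likelihoods via EM. The function and variable names (`em_fusion_weights`, `probs`) are hypothetical.

```python
import numpy as np

def em_fusion_weights(probs, n_iter=50):
    """Estimate mixture weights over K submodels by EM.

    probs: (K, N) array where probs[k, n] is the likelihood submodel k
    assigns to the observed label of training sample n.
    Returns pi, a (K,) vector of nonnegative weights summing to 1.
    """
    K, N = probs.shape
    pi = np.full(K, 1.0 / K)          # start from uniform weights
    for _ in range(n_iter):
        # E-step: responsibility of submodel k for each sample,
        # proportional to its weighted likelihood.
        r = pi[:, None] * probs        # (K, N)
        r /= r.sum(axis=0, keepdims=True)
        # M-step: new weight of each submodel is its mean responsibility.
        pi = r.mean(axis=1)
    return pi
```

At inference time, the fused class distribution would then be the `pi`-weighted average of the submodels' predicted probabilities, with the submodel that tracks the labels best receiving the largest weight.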

Suggested Citation

  • Tao Zhen & Jian-lei Kong & Lei Yan, 2020. "Hybrid Deep-Learning Framework Based on Gaussian Fusion of Multiple Spatiotemporal Networks for Walking Gait Phase Recognition," Complexity, Hindawi, vol. 2020, pages 1-17, October.
  • Handle: RePEc:hin:complx:8672431
    DOI: 10.1155/2020/8672431

    Download full text from publisher

    File URL: http://downloads.hindawi.com/journals/8503/2020/8672431.pdf
    Download Restriction: no

    File URL: http://downloads.hindawi.com/journals/8503/2020/8672431.xml
    Download Restriction: no

    File URL: https://libkey.io/10.1155/2020/8672431?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    Citations

    Citations are extracted by the CitEc Project.

    Cited by:

    1. Xue-Bo Jin & Wei-Zhen Zheng & Jian-Lei Kong & Xiao-Yi Wang & Min Zuo & Qing-Chuan Zhang & Seng Lin, 2021. "Deep-Learning Temporal Predictor via Bidirectional Self-Attentive Encoder–Decoder Framework for IOT-Based Environmental Sensing in Intelligent Greenhouse," Agriculture, MDPI, vol. 11(8), pages 1-25, August.
    2. Jianlei Kong & Hongxing Wang & Chengcai Yang & Xuebo Jin & Min Zuo & Xin Zhang, 2022. "A Spatial Feature-Enhanced Attention Neural Network with High-Order Pooling Representation for Application in Pest and Disease Recognition," Agriculture, MDPI, vol. 12(4), pages 1-30, March.

