
Deep locomotion prediction learning over biosensors, ambient sensors, and computer vision

Authors

Listed:
  • Madiha Javeed
  • Ahmad Jalal
  • Dina Abdulaziz AlHammadi
  • Bumshik Lee

Abstract

Innovative technologies for developing intelligent systems related to locomotion prediction learning are crucial in today’s world. Human locomotion involves various complex concepts that must be addressed to enable accurate prediction through learning mechanisms. Our proposed system focuses on locomotion learning through RGB vision devices, ambient sensor-based signals, and physiological motion signals from biosensing devices. First, the data is acquired from five different scenario-based datasets. Then, we pre-process the data to mitigate noise from the biosensors and extract body landmarks and key points from the computer vision signals. The data is then segmented using a data-windowing technique. Features are extracted through multiple combinations of feature-extraction methodologies, followed by feature reduction using optimization techniques. In contrast to existing systems, we employ both machine learning and deep learning classifiers for locomotion prediction, utilizing a modified body-specific sensor-based Hidden Markov Model and a deep Exponential Residual Neural Network, respectively. A system ontology is also presented to elucidate the relationships among the data, concepts, and objects within the system. Experimental results indicate that our proposed biosensor-based system exhibits significant potential for effective locomotion prediction learning.
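The abstract describes the pipeline only at a high level. As a minimal sketch of the data-windowing and feature-extraction steps it mentions, the Python fragment below segments a multichannel biosensor stream into overlapping windows and computes simple per-channel statistics; the window length of 128 samples, the 50% overlap, and the mean/standard-deviation features are illustrative assumptions, not details taken from the paper.

    # Minimal sketch (assumed parameters, not the authors' implementation):
    # segment a (samples, channels) biosensor stream into overlapping
    # fixed-length windows and extract simple per-channel statistics.
    import numpy as np

    def sliding_windows(signal, win_len=128, step=64):
        """Yield fixed-length windows from a (samples, channels) array."""
        for start in range(0, signal.shape[0] - win_len + 1, step):
            yield signal[start:start + win_len]

    def window_features(window):
        """Concatenate per-channel mean and standard deviation."""
        return np.concatenate([window.mean(axis=0), window.std(axis=0)])

    # Example: 10 s of a hypothetical 3-channel stream sampled at 100 Hz.
    rng = np.random.default_rng(0)
    stream = rng.normal(size=(1000, 3))
    features = np.array([window_features(w) for w in sliding_windows(stream)])
    print(features.shape)  # (14, 6): 14 windows x (3 means + 3 stds)

Feature vectors of this form would then feed the feature-reduction and classification stages described in the abstract.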

Suggested Citation

  • Madiha Javeed & Ahmad Jalal & Dina Abdulaziz AlHammadi & Bumshik Lee, 2026. "Deep locomotion prediction learning over biosensors, ambient sensors, and computer vision," PLOS ONE, Public Library of Science, vol. 21(2), pages 1-29, February.
  • Handle: RePEc:plo:pone00:0342793
    DOI: 10.1371/journal.pone.0342793

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0342793
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0342793&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0342793?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    More about this item

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0342793. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

We have no bibliographic references for this item. You can help add them by using this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.