
Human action recognition based on low- and high-level data from wearable inertial sensors

Authors

Listed:
  • Irvin Hussein Lopez-Nava
  • Angélica Muñoz-Meléndez

Abstract

Human action recognition supported by highly accurate specialized systems, ambulatory systems, or wireless sensor networks has tremendous potential in healthcare and wellbeing monitoring. Several recent studies have focused on recognizing actions from wearable inertial sensors, in which raw sensor data are used to build classification models; in a few of them, high-level representations directly related to anatomical characteristics of the human body are obtained. This research focuses on classifying a set of activities of daily living, such as functional mobility, and instrumental activities of daily living, such as preparing meals, performed by test subjects in their homes under naturalistic conditions. The joint angles of the upper and lower limbs are estimated using information from five wearable inertial sensors placed on the bodies of five test subjects. One set of features related to human limb motions is extracted from the orientation signals (high-level data) and another from the raw acceleration signals (low-level data), and both are used to build classifiers with four inference algorithms. The features proposed in this work are the number of movements and the average duration of consecutive movements. The classifiers successfully classify the set of actions with up to 77.8% accuracy from raw data and up to 93.3% from high-level data. This study enables a comparison of the two data levels for classifying a set of actions performed in daily environments using an inertial sensor network.
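The abstract's proposed features (the number of movements and the average duration of consecutive movements) can be illustrated with a short sketch. The following Python snippet is only a minimal, hypothetical interpretation: it assumes a 1-D joint-angle signal sampled at `fs` Hz, treats a "movement" as a run of samples whose approximate angular velocity exceeds a threshold, and merges runs separated by brief pauses. The function name, threshold, and gap values are illustrative assumptions, not the paper's actual procedure.

```python
import numpy as np

def movement_features(signal, fs, vel_threshold=0.1, min_gap=0.25):
    """Return (number_of_movements, average_movement_duration_seconds)
    for a 1-D joint-angle signal sampled at fs Hz.

    A 'movement' is a maximal run of samples whose absolute rate of
    change exceeds vel_threshold; runs separated by less than min_gap
    seconds are merged. All parameter values are illustrative.
    """
    # Approximate angular velocity by first differences scaled by fs.
    velocity = np.abs(np.diff(signal)) * fs
    moving = velocity > vel_threshold

    # Locate rising/falling edges of the boolean 'moving' mask.
    edges = np.diff(moving.astype(int))
    starts = np.flatnonzero(edges == 1) + 1
    ends = np.flatnonzero(edges == -1) + 1
    if moving.size and moving[0]:
        starts = np.insert(starts, 0, 0)
    if moving.size and moving[-1]:
        ends = np.append(ends, moving.size)

    # Merge movements separated by pauses shorter than min_gap.
    segments = []
    for s, e in zip(starts, ends):
        if segments and (s - segments[-1][1]) / fs < min_gap:
            segments[-1] = (segments[-1][0], e)
        else:
            segments.append((s, e))

    if not segments:
        return 0, 0.0
    durations = [(e - s) / fs for s, e in segments]
    return len(segments), float(np.mean(durations))
```

Under these assumptions, the two returned values would be computed per orientation signal and concatenated with the low-level (raw acceleration) features before training the classifiers; the paper itself does not specify this exact pipeline.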

Suggested Citation

  • Irvin Hussein Lopez-Nava & Angélica Muñoz-Meléndez, 2019. "Human action recognition based on low- and high-level data from wearable inertial sensors," International Journal of Distributed Sensor Networks, vol. 15(12), article 1550147719894532, December.
  • Handle: RePEc:sae:intdis:v:15:y:2019:i:12:p:1550147719894532
    DOI: 10.1177/1550147719894532

    Download full text from publisher

    File URL: https://journals.sagepub.com/doi/10.1177/1550147719894532
    Download Restriction: no


