
Recognizing hand use and hand role at home after stroke from egocentric video

Author

Listed:
  • Meng-Fen Tsai
  • Rosalie H Wang
  • José Zariffa

Abstract

Hand function is a central determinant of independence after stroke. Measuring hand use in the home environment is necessary to evaluate the impact of new interventions, and calls for novel wearable technologies. Egocentric video can capture hand-object interactions in context, as well as show how more-affected hands are used during bilateral tasks (for stabilization or manipulation). Automated methods are required to extract this information. The objective of this study was to use artificial intelligence-based computer vision to classify hand use and hand role from egocentric videos recorded at home after stroke. Twenty-one stroke survivors participated in the study. A random forest classifier, a SlowFast neural network, and the Hand Object Detector neural network were applied to identify hand use and hand role at home. Leave-One-Subject-Out Cross-Validation (LOSOCV) was used to evaluate the performance of the three models, and between-model differences were assessed with the Matthews correlation coefficient (MCC). For hand use detection, the Hand Object Detector had significantly higher performance than the other models. The macro average MCCs using this model in the LOSOCV were 0.50 ± 0.23 for the more-affected hands and 0.58 ± 0.18 for the less-affected hands. Hand role classification had macro average MCCs in the LOSOCV that were close to zero for all models. Using egocentric video to capture the hand use of stroke survivors at home is technically feasible. Pose estimation to track finger movements may be beneficial for classifying hand roles in the future.

Author summary: This study examines the technical feasibility of a home-based hand assessment that may be used in the future for stroke survivors. Wearable motion sensors have been applied for assessment purposes, but because these devices detect only dynamic movements and fail to capture static postures, they do not fully reflect the dexterity that is unique to the hands. A wearable (egocentric) camera can record both types of hand activity and provide contextual information, such as what task is being carried out and how the hands carry it out. This information is essential to understanding hand recovery after stroke, so demonstrating that egocentric video analysis can quantify hand movements is important. We analyzed self-recorded videos from community-dwelling stroke survivors at home using artificial intelligence methods and successfully detected the hand use of more-affected and less-affected hands during typical routines. The automated analysis of egocentric video therefore has the potential to be applied to long-term tracking of hand recovery in the community. As a future assessment tool, the camera could ultimately evaluate the impact of interventions and help to develop new ones.
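To make the evaluation protocol concrete, the sketch below (not the authors' code) shows one way to set up leave-one-subject-out cross-validation scored with per-participant Matthews correlation coefficients in scikit-learn, for the per-frame hand-use task. The names X, y, groups, and losocv_mcc are hypothetical placeholders, and a random forest stands in for any of the three compared models.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import matthews_corrcoef
    from sklearn.model_selection import LeaveOneGroupOut

    def losocv_mcc(X, y, groups):
        # Leave-one-subject-out cross-validation: each fold holds out every
        # frame from one participant, mirroring the LOSOCV protocol above.
        logo = LeaveOneGroupOut()
        scores = []
        for train_idx, test_idx in logo.split(X, y, groups):
            # A random forest stands in for any of the three models compared.
            clf = RandomForestClassifier(n_estimators=100, random_state=0)
            clf.fit(X[train_idx], y[train_idx])
            y_pred = clf.predict(X[test_idx])
            # MCC is computed per held-out participant, then summarized as
            # mean +/- SD across participants (one reading of "macro average").
            scores.append(matthews_corrcoef(y[test_idx], y_pred))
        scores = np.asarray(scores)
        return scores, scores.mean(), scores.std()

    # Usage (hypothetical data): X is an (n_frames, n_features) array of
    # per-frame features, y holds binary hand-use labels, and groups holds
    # the participant ID of each frame.
    # per_subject, mean_mcc, sd_mcc = losocv_mcc(X, y, groups)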

Suggested Citation

  • Meng-Fen Tsai & Rosalie H Wang & José Zariffa, 2023. "Recognizing hand use and hand role at home after stroke from egocentric video," PLOS Digital Health, Public Library of Science, vol. 2(10), pages 1-23, October.
  • Handle: RePEc:plo:pdig00:0000361
    DOI: 10.1371/journal.pdig.0000361

    Download full text from publisher

    File URL: https://journals.plos.org/digitalhealth/article?id=10.1371/journal.pdig.0000361

    File URL: https://journals.plos.org/digitalhealth/article/file?id=10.1371/journal.pdig.0000361&type=printable
