Author
Listed:
- Chao-Lung Yang
(National Taiwan University of Science and Technology)
- Shang-Che Hsu
(National Taiwan University of Science and Technology)
- Yu-Chung Kang
(National Taiwan University of Science and Technology)
- Jing-Feng Nian
(National Taiwan University of Science and Technology)
- Andi Cakravastia
(Bandung Institute of Technology)
Abstract
Applying Human Action Recognition (HAR) at manufacturing sites to recognize human assembling tasks, represented as repetitions of human actions, is an emerging research area. However, unintentional human movements or actions beyond the pre-determined tasks are inevitable and make it difficult to train a machine learning model. Exhaustively listing all exceptional actions for model training is clearly infeasible, so utilizing a pre-trained model, or re-training a model on the limited number of available exceptional actions, leads to low action-recognition accuracy. To overcome this challenge, this work proposes an unsupervised detection framework named Entropy Signal Clustering (ESC) to detect exceptional actions within a repetition of actions assumed to form the basis of assembling tasks. To enable frame-level detection, first, a Temporal Self-similarity Matrix (TSM) is constructed from the encoder of RepNet to represent the pairwise similarities of image frames extracted from a video. Second, the calculation is revised to compensate for the misleading similarity caused by the padding window at the edges of the TSM. Third, an entropy-based measure is proposed to produce an indicator statistic of action similarity. Finally, a clustering-based time-series anomaly detection method is proposed to detect discords, representing exceptional actions, in the entropy time series of the repeated actions. To validate the proposed framework, a dataset of simulated videos was created. The results show that the proposed ESC method achieves accuracies of 85–99%, with recall of 79–100% for exceptional actions and 92–100% for standard-operating-procedure actions across all demonstration videos, compared with the Histogram, Matrix Profile, Local Recurrence Rate based Discord Search (LRRDS), and LRRDS Sampling K-means (LSK) methods.
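The pipeline described in the abstract (TSM from frame embeddings → per-frame entropy signal → discord detection) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the row-wise softmax TSM follows RepNet's general formulation, but the edge-padding compensation and the clustering step of ESC are replaced here by a simple robust z-score discord test; all function names, the temperature value, and the synthetic embeddings are hypothetical stand-ins.

```python
import numpy as np

def temporal_self_similarity(embeddings, temperature=13.544):
    """TSM: row-wise softmax over negative pairwise squared distances
    between per-frame embeddings (shape: n_frames x dim)."""
    d2 = ((embeddings[:, None, :] - embeddings[None, :, :]) ** 2).sum(axis=-1)
    logits = -d2 / temperature
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    expl = np.exp(logits)
    return expl / expl.sum(axis=1, keepdims=True)

def row_entropy(tsm, eps=1e-12):
    """Shannon entropy of each frame's similarity distribution (one TSM row),
    yielding an entropy time series over the video."""
    return -(tsm * np.log(tsm + eps)).sum(axis=1)

def flag_discords(entropy, k=3.0):
    """Flag frames whose entropy deviates strongly from the bulk, using a
    robust (median/MAD) z-score as a stand-in for ESC's clustering step."""
    med = np.median(entropy)
    mad = np.median(np.abs(entropy - med)) + 1e-12
    z = 0.6745 * (entropy - med) / mad
    return np.abs(z) > k

if __name__ == "__main__":
    # Synthetic "repetitive action" embeddings with an injected exceptional segment.
    t = np.arange(120)
    emb = np.stack([np.sin(2 * np.pi * t / 20), np.cos(2 * np.pi * t / 20)], axis=1)
    rng = np.random.default_rng(0)
    emb[60:72] += rng.normal(scale=2.0, size=(12, 2))  # exceptional action
    tsm = temporal_self_similarity(emb)
    flags = flag_discords(row_entropy(tsm))
    print("flagged frames:", np.flatnonzero(flags))
```

Each TSM row sums to one, so its entropy is well defined; periodic frames resemble many other frames (flatter rows), while an exceptional segment concentrates similarity on itself, shifting the entropy signal.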
Suggested Citation
Chao-Lung Yang & Shang-Che Hsu & Yu-Chung Kang & Jing-Feng Nian & Andi Cakravastia, 2025.
"Unsupervised exceptional human action detection from repetition of human assembling tasks using entropy signal clustering,"
Journal of Intelligent Manufacturing, Springer, vol. 36(6), pages 3801-3815, August.
Handle:
RePEc:spr:joinma:v:36:y:2025:i:6:d:10.1007_s10845-024-02420-4
DOI: 10.1007/s10845-024-02420-4
Download full text from publisher
As access to this document is restricted, you may want to search for a different version of it.
Corrections
All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:joinma:v:36:y:2025:i:6:d:10.1007_s10845-024-02420-4. See general information about how to correct material in RePEc.
If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows linking your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.
We have no bibliographic references for this item. You can help add them by using this form.
If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.
For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com .
Please note that corrections may take a couple of weeks to filter through the various RePEc services.