Author
Listed:
- Jing Yang
(Qinhuangdao Vocational and Technical College, China)
- Xin Li
(Qinhuangdao Vocational and Technical College, China)
- Wenjing Liu
(Qinhuangdao Vocational and Technical College, China)
- Hongrun Shao
(Qinhuangdao Vocational and Technical College, China)
- Guoxin Li
(Qinhuangdao Vocational and Technical College, China)
Abstract
Intangible Cultural Heritage (ICH) dances, rich in historical and cultural value, reflect the diversity of human society. However, many traditional dances face challenges in transmission and protection. This paper proposes a multi-feature fusion-based motion recognition method to address insufficient feature extraction and inadequate model adaptability in ICH dance movement recognition. The method integrates skeletal, spatiotemporal, and deep features, enhances their representation through an optimised fusion strategy, and uses an improved 3D convolutional neural network for efficient recognition. Validation on a dataset of 60 typical movements drawn from several ICH dances, including the Dai peacock dance, Tibetan Guozhuang dance, Mongolian Andai dance, and Uyghur Sainaim dance, demonstrated that the method outperforms traditional approaches in accuracy, recall, and F1 score. This research provides a robust solution for ICH dance movement recognition and offers insights for broader technological applications in cultural preservation.
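To make the described pipeline concrete, the following is a minimal sketch of a multi-feature fusion classifier in PyTorch. It is not the authors' implementation: the branch designs, feature dimensions, learnable per-branch fusion weights, and the 60-class output size are assumptions chosen to mirror the abstract's description (skeletal, spatiotemporal, and deep features fused before classification by a 3D CNN-based model).

    # Illustrative sketch only; module names, dimensions, and the
    # weighted-concatenation fusion are assumptions, not the paper's method.
    import torch
    import torch.nn as nn

    class MultiFeatureFusionNet(nn.Module):
        def __init__(self, num_classes=60, skel_dim=75, st_dim=128):
            super().__init__()
            # Deep-feature branch: a small 3D CNN over RGB clips
            # shaped (batch, 3 channels, frames, height, width).
            self.cnn3d = nn.Sequential(
                nn.Conv3d(3, 16, kernel_size=3, padding=1),
                nn.ReLU(),
                nn.MaxPool3d(2),
                nn.Conv3d(16, 32, kernel_size=3, padding=1),
                nn.ReLU(),
                nn.AdaptiveAvgPool3d(1),  # -> (batch, 32, 1, 1, 1)
            )
            # Skeletal and spatiotemporal branches: simple MLP encoders
            # over precomputed per-clip feature vectors (assumed inputs).
            self.skel_fc = nn.Sequential(nn.Linear(skel_dim, 64), nn.ReLU())
            self.st_fc = nn.Sequential(nn.Linear(st_dim, 64), nn.ReLU())
            # Learnable per-branch weights stand in for the paper's
            # "optimised fusion strategy" (an assumption here).
            self.branch_weights = nn.Parameter(torch.ones(3))
            self.classifier = nn.Linear(32 + 64 + 64, num_classes)

        def forward(self, clip, skel, st):
            w = torch.softmax(self.branch_weights, dim=0)
            deep = self.cnn3d(clip).flatten(1) * w[0]
            skel = self.skel_fc(skel) * w[1]
            st = self.st_fc(st) * w[2]
            return self.classifier(torch.cat([deep, skel, st], dim=1))

    # Usage: one 16-frame 112x112 clip plus the two feature vectors.
    model = MultiFeatureFusionNet()
    logits = model(torch.randn(1, 3, 16, 112, 112),
                   torch.randn(1, 75), torch.randn(1, 128))
    print(logits.shape)  # torch.Size([1, 60])

Normalising the branch weights with a softmax keeps the fusion coefficients positive and summing to one, so the relative contribution of each feature type is learned during training rather than fixed by hand; this is one common way to realise a weighted fusion strategy, not necessarily the paper's.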
Suggested Citation
Jing Yang & Xin Li & Wenjing Liu & Hongrun Shao & Guoxin Li, 2025.
"Research on Recognition Method of Non-Legacy Dance Action Based on Multi-Feature Fusion,"
International Journal of Intelligent Information Technologies (IJIIT), IGI Global, vol. 21(1), pages 1-16, January.
Handle:
RePEc:igg:jiit00:v:21:y:2025:i:1:p:1-16
Corrections
All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:igg:jiit00:v:21:y:2025:i:1:p:1-16. See general information about how to correct material in RePEc.
If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.
We have no bibliographic references for this item. You can help add them by using this form.
If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.
For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Journal Editor (email available below). General contact details of provider: https://www.igi-global.com .
Please note that corrections may take a couple of weeks to filter through the various RePEc services.