Author
Qingbin Hou
Abstract
To digitally preserve intangible cultural heritage and to manage and analyze intangible cultural heritage video data effectively, this research applies target recognition algorithms and keyframe extraction to video extraction and analysis. A keyframe extraction and target detection model is constructed using shot boundary detection, a feature pyramid network, and an attention mechanism. The experimental results show that the proposed keyframe extraction model outperformed all compared methods, achieving an accuracy of 0.996, a recall of 0.984, and an F1 score of 0.936 on the dataset used in the study. The model's average keyframe redundancy was 0.02, and its missed detection and false detection rates were both below 0.25, indicating a strong ability to recognize key content in videos. The model's performance changed little when random noise perturbation was added during testing, demonstrating good robustness and generalization ability. The detection error converged to a minimum of 0.126, and the prediction box generation accuracy reached up to 0.834, an improvement of 41.57%. In processing intangible cultural heritage videos, the miss rate and false positive rate for target objects were the lowest among the compared methods, as low as 0.20. Through keyframe extraction and target detection, the study achieves effective protection and analysis of intangible cultural heritage videos and promotes the inheritance and dissemination of intangible cultural heritage.
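To make the shot-boundary stage of the described pipeline concrete, the sketch below shows a minimal histogram-difference approach to shot boundary detection and keyframe selection. It is not the paper's model (which further combines a feature pyramid network and an attention mechanism for target detection); it only illustrates the general idea of the keyframe-extraction stage. OpenCV is assumed, and the similarity threshold is a hypothetical value chosen for illustration.

```python
# Illustrative sketch only: histogram-difference shot boundary detection
# followed by keyframe selection. The threshold below is an assumed value,
# not taken from the paper.
import cv2

def frame_histogram(frame):
    """HSV colour histogram, normalised, used as a cheap frame descriptor."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0, 1], None, [50, 60], [0, 180, 0, 256])
    return cv2.normalize(hist, hist).flatten()

def extract_keyframes(video_path, boundary_threshold=0.4):
    """Return one representative frame per detected shot.

    A shot boundary is declared when the correlation between consecutive
    frame histograms drops below `boundary_threshold`; the first frame
    after each boundary is kept as that shot's keyframe.
    """
    cap = cv2.VideoCapture(video_path)
    keyframes, prev_hist = [], None
    ok, frame = cap.read()
    while ok:
        hist = frame_histogram(frame)
        if prev_hist is None:
            keyframes.append(frame)          # first frame opens the first shot
        else:
            similarity = cv2.compareHist(prev_hist, hist, cv2.HISTCMP_CORREL)
            if similarity < boundary_threshold:
                keyframes.append(frame)      # abrupt change -> new shot
        prev_hist = hist
        ok, frame = cap.read()
    cap.release()
    return keyframes
```

The selected keyframes would then be passed to a detection network; in the paper's setup that detector uses a feature pyramid network with an attention mechanism, which is outside the scope of this sketch.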
Suggested Citation
Qingbin Hou, 2025.
"Protection and Analysis of Intangible Cultural Heritage Videos Based on Keyframe Extraction and Adaptive Weight Assignment,"
PLOS ONE, Public Library of Science, vol. 20(8), pages 1-21, August.
Handle: RePEc:plo:pone00:0330176
DOI: 10.1371/journal.pone.0330176