IDEAS (RePEc): https://ideas.repec.org/a/plo/pone00/0332066.html

Dual-branch differential channel hypergraph convolutional network for human skeleton based action recognition

Authors
  • Dong Chen
  • Kaichen She
  • Peisong Wu
  • Mingdong Chen
  • Chuanqi Li

Abstract

Graph Convolutional Networks (GCNs) perform well in skeleton-based action recognition, but their pairwise node connections make it difficult to model high-order dependencies between non-adjacent joints. To address this issue, hypergraph methods have emerged that aim to capture complex associations among multiple joints. However, existing methods either rely on static hypergraph structures or fail to fully exploit feature interactions between channels, limiting their ability to adapt to complex action patterns. We therefore propose the Dual-Branch Differential Channel Hypergraph Convolutional Network (DBC-HCN), which leverages the ability of hypergraphs to represent a priori non-natural dependencies in skeletal structures. It extracts spatio-temporal topological information and higher-order correlations by integrating static and dynamic hypergraphs, exploiting channel optimization and inter-hypergraph feature interactions. The network comprises two parallel streams: a Spatio-Temporal Dynamic Hypergraph Convolutional Network (ST-HCN) and a Channel-Differential Hypergraph Convolutional Network (CD-HCN). The ST-HCN stream builds on the natural topology of the human skeleton and uses dynamic hypergraphs to model the spatio-temporal dependencies of skeletal joints, accurately capturing the spatio-temporal characteristics of movements. In contrast, the CD-HCN stream focuses on feature differences between channels, extracting the motion changes of individual joints during action execution to sharpen the portrayal of action details.
To enhance the network's representational capability, we fuse the two streams' complementary action feature representations, so that the ST-HCN and CD-HCN streams learn from each other and jointly enrich the action representation. We evaluate the model on three datasets, Kinetics-Skeleton 400, NTU RGB+D 60, and NTU RGB+D 120; the results show that the proposed network is competitive with existing methods. Accuracy reaches 96.9% and 92.7% on the X-View and X-Sub benchmarks of the NTU RGB+D 60 dataset, respectively. Our code is publicly available at: https://github.com/hhh1234hhh/DBC-HCN.
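The hypergraph convolution that both streams build on generalizes GCN message passing from pairwise edges to hyperedges that group several joints at once. As a minimal sketch (not the authors' released code), the standard normalized form is X' = Dv^(-1/2) H W De^(-1) H^T Dv^(-1/2) X Θ, where H is a joint-by-hyperedge incidence matrix; all shapes and the example hyperedges below are illustrative assumptions:

```python
import numpy as np

def hypergraph_conv(X, H, w_e, Theta):
    """One hypergraph convolution layer (standard normalized form).

    X     : (N, C) node (joint) features
    H     : (N, E) incidence matrix; H[v, e] = 1 if joint v is in hyperedge e
    w_e   : (E,)   hyperedge weights
    Theta : (C, F) learnable channel projection
    """
    Dv = (H * w_e).sum(axis=1)              # weighted node degrees, (N,)
    De = H.sum(axis=0)                      # hyperedge degrees, (E,)
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(Dv))
    De_inv = np.diag(1.0 / De)
    # Normalized hypergraph adjacency: aggregates each joint's features over
    # every hyperedge it belongs to, then redistributes them to members.
    A = Dv_inv_sqrt @ H @ np.diag(w_e) @ De_inv @ H.T @ Dv_inv_sqrt
    return A @ X @ Theta

# Toy example: 5 joints, 2 hypothetical hyperedges ("arm" = joints 0-2,
# "torso" = joints 2-4), 3 input channels projected to 4 output channels.
rng = np.random.default_rng(0)
H = np.array([[1, 0],
              [1, 0],
              [1, 1],
              [0, 1],
              [0, 1]], dtype=float)
X = rng.standard_normal((5, 3))
out = hypergraph_conv(X, H, np.ones(2), rng.standard_normal((3, 4)))
print(out.shape)  # (5, 4)
```

Because a hyperedge can span any subset of joints, a single layer already mixes features among non-adjacent joints such as the two hands, which a pairwise GCN would need several hops to connect; the dynamic variant in ST-HCN would additionally learn or update H per sample.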

Suggested Citation

  • Dong Chen & Kaichen She & Peisong Wu & Mingdong Chen & Chuanqi Li, 2025. "Dual-branch differential channel hypergraph convolutional network for human skeleton based action recognition," PLOS ONE, Public Library of Science, vol. 20(10), pages 1-16, October.
  • Handle: RePEc:plo:pone00:0332066
    DOI: 10.1371/journal.pone.0332066

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0332066
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0332066&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0332066?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    More about this item


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0332066. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.