
Lightweight and efficient skeleton-based sports activity recognition with ASTM-Net

Author

Listed:
  • Bin Wu
  • Mei Xue
  • Ying Jia
  • Ning Zhang
  • GuoJin Zhao
  • XiuPing Wang
  • Chunlei Zhang

Abstract

Human Activity Recognition (HAR) plays a pivotal role in video understanding, with applications ranging from surveillance to virtual reality. Skeletal data has emerged as a robust modality for HAR, overcoming challenges such as noisy backgrounds and lighting variations. However, current Graph Convolutional Network (GCNN)-based methods for skeletal activity recognition face two key limitations: (1) they fail to capture dynamic changes in node affinities induced by movements, and (2) they overlook the interplay between spatial and temporal information that is critical for recognizing complex actions. To address these challenges, we propose ASTM-Net, an Activity-aware SpatioTemporal Multi-branch graph convolutional network comprising two novel modules. First, the Activity-aware Spatial Graph convolution Module (ASGM) dynamically models Activity-Aware Adjacency Graphs (3A-Graphs) by fusing a manually initialized physical graph, a learnable graph optimized end-to-end, and a dynamically inferred, activity-related graph, thereby capturing evolving spatial affinities. Second, we introduce the Temporal Multi-branch Graph convolution Module (TMGM), which employs parallel branches of channel-reduction, dilated temporal convolutions with varied dilation rates, pooling, and pointwise convolutions to effectively model both fine-grained and long-range temporal dependencies. This multi-branch design not only accommodates diverse action speeds and durations but also maintains parameter efficiency. By integrating ASGM and TMGM, ASTM-Net jointly captures mutual spatial-temporal dependencies at significantly reduced computational cost. Extensive experiments on NTU RGB+D, NTU RGB+D 120, and Toyota Smarthome demonstrate ASTM-Net's superiority: it outperforms DualHead-Net-ALLs by 0.31% on NTU RGB+D X-Sub and surpasses SkateFormer by 2.22% on Toyota Smarthome Cross-Subject; it reduces parameters by 51.9% and FLOPs by 49.7% compared to MST-GCNN-ALLs while improving accuracy by 0.82%; and under 30% random node occlusion, it achieves 86.94% accuracy, 3.49% higher than CBAM-STGCN.
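
The abstract describes the two modules only at a high level and this record carries no source code. The minimal PyTorch sketch below illustrates one way the two ideas could be wired together: an ASGM-style spatial graph convolution over a fused adjacency (fixed physical graph + learnable graph + activity-dependent graph inferred from the input), and a TMGM-style temporal block with parallel dilated-convolution, pooling, and pointwise branches. All class names, tensor shapes, and hyperparameters here are illustrative assumptions and do not reproduce the authors' implementation.

    # Illustrative sketch only -- not the authors' code. Assumes skeleton tensors
    # shaped (batch, channels, frames, joints), as is common for GCN-based HAR.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class ASGMSketch(nn.Module):
        # Spatial graph convolution over a fused 3A-Graph: a fixed physical adjacency,
        # a freely learnable adjacency, and an activity-dependent adjacency inferred
        # from the input sequence.
        def __init__(self, in_ch, out_ch, physical_adj):                 # physical_adj: (V, V)
            super().__init__()
            self.register_buffer("A_phys", physical_adj)                  # fixed skeleton graph
            self.A_learn = nn.Parameter(torch.zeros_like(physical_adj))   # learned end to end
            self.theta = nn.Conv2d(in_ch, in_ch // 4, 1)                  # query embedding
            self.phi = nn.Conv2d(in_ch, in_ch // 4, 1)                    # key embedding
            self.out = nn.Conv2d(in_ch, out_ch, 1)

        def forward(self, x):                                             # x: (N, C, T, V)
            q = self.theta(x).mean(dim=2)                                 # pool over time -> (N, C', V)
            k = self.phi(x).mean(dim=2)
            A_act = torch.softmax(torch.einsum("ncv,ncw->nvw", q, k), dim=-1)  # (N, V, V)
            A = self.A_phys + self.A_learn + A_act                        # fused adjacency per sample
            y = torch.einsum("nctv,nvw->nctw", x, A)                      # propagate along the graph
            return F.relu(self.out(y))

    class TMGMSketch(nn.Module):
        # Temporal multi-branch block: channel-reduced dilated temporal convolutions
        # with different dilation rates, a pooling branch, and a pointwise branch,
        # concatenated back to the input width and added residually.
        def __init__(self, ch, dilations=(1, 2, 3, 4)):
            super().__init__()
            b = ch // (len(dilations) + 2)                                # per-branch channel budget
            self.dilated = nn.ModuleList([
                nn.Sequential(nn.Conv2d(ch, b, 1),
                              nn.Conv2d(b, b, (5, 1), padding=(2 * d, 0), dilation=(d, 1)))
                for d in dilations])
            self.pool = nn.Sequential(nn.Conv2d(ch, b, 1),
                                      nn.MaxPool2d((3, 1), stride=1, padding=(1, 0)))
            self.point = nn.Conv2d(ch, ch - b * (len(dilations) + 1), 1)
            self.bn = nn.BatchNorm2d(ch)

        def forward(self, x):
            outs = [branch(x) for branch in self.dilated] + [self.pool(x), self.point(x)]
            return F.relu(self.bn(torch.cat(outs, dim=1)) + x)            # residual connection

    # Example with arbitrary shapes: 25 joints as in NTU RGB+D, 100 frames, 64 channels.
    # A = torch.eye(25); x = torch.randn(8, 64, 100, 25)
    # y = TMGMSketch(64)(ASGMSketch(64, 64, A)(x))                        # -> (8, 64, 100, 25)

The separate branches keep the block lightweight: each dilated path works on a reduced channel budget, and the pointwise path fills the remaining channels so the concatenated output matches the input width for the residual connection.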

Suggested Citation

  • Bin Wu & Mei Xue & Ying Jia & Ning Zhang & GuoJin Zhao & XiuPing Wang & Chunlei Zhang, 2025. "Lightweight and efficient skeleton-based sports activity recognition with ASTM-Net," PLOS ONE, Public Library of Science, vol. 20(7), pages 1-30, July.
  • Handle: RePEc:plo:pone00:0324605
    DOI: 10.1371/journal.pone.0324605

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0324605
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0324605&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0324605?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a source you can access through your library subscription


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0324605. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.