Author
Listed:
- Yajian Zhou
(School of Cyberspace Security, Beijing University of Posts and Telecommunications, 1 Nanfeng Rd., Changping District, Beijing 102206, China)
- Zongqian Yue
(School of Cyberspace Security, Beijing University of Posts and Telecommunications, 1 Nanfeng Rd., Changping District, Beijing 102206, China)
- Zhe Chen
(School of Cyberspace Security, Beijing University of Posts and Telecommunications, 1 Nanfeng Rd., Changping District, Beijing 102206, China)
Abstract
With the rapid growth of streaming data, traditional tensor decomposition methods struggle to handle the massive volumes of real-time, high-dimensional data that arise in this scenario. In this paper, a two-level parallel incremental tensor Tucker decomposition method with multi-mode growth (TPITTD-MG) is proposed to address the low parallelism of existing Tucker decomposition methods on large-scale, high-dimensional, dynamically growing data. TPITTD-MG involves two mechanisms, i.e., a parallel sub-tensor partitioning algorithm based on dynamic programming (PSTPA-DP) and a two-level parallel update method for the projection matrices and core tensors. The former counts the non-zero elements in parallel and uses dynamic programming to partition the sub-tensors, which ensures more uniform task allocation. The latter updates the projection matrices and core tensors with a first level of parallel updates based on a parallel MTTKRP calculation strategy, followed by a second level in which different projection matrices or core tensors are updated independently according to the classification of the sub-tensors. The experimental results show that, compared with existing algorithms, execution efficiency is improved by nearly 400% and the uniformity of the partition results by more than 20% when the data scale reaches the order of tens of millions with a parallelism degree of 4. For third-order tensors, execution efficiency is improved by nearly 300% compared with the single-layer update algorithm.
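The partitioning step described in the abstract, i.e., counting non-zero elements and then using dynamic programming so that each parallel worker receives a roughly equal load, can be illustrated with the classic linear-partition dynamic program. The sketch below is an assumption-laden illustration, not the paper's PSTPA-DP implementation: it assumes the balancing criterion is the per-slice non-zero count along the growing mode and that each sub-tensor is a contiguous block of slices; the function name `balanced_partition` and the sample counts are hypothetical.

```python
import numpy as np

def balanced_partition(nnz_per_slice, num_parts):
    """Split per-slice non-zero counts into `num_parts` contiguous blocks
    so that the heaviest block is as light as possible (linear partition DP)."""
    n = len(nnz_per_slice)
    prefix = np.concatenate(([0], np.cumsum(nnz_per_slice)))  # prefix sums of nnz

    INF = float("inf")
    # dp[k][i]: minimal achievable maximum load when the first i slices
    # are split into k contiguous blocks; cut[k][i] records the last cut point.
    dp = [[INF] * (n + 1) for _ in range(num_parts + 1)]
    cut = [[0] * (n + 1) for _ in range(num_parts + 1)]
    dp[0][0] = 0
    for k in range(1, num_parts + 1):
        for i in range(1, n + 1):
            for j in range(k - 1, i):
                load = prefix[i] - prefix[j]          # nnz in slices j .. i-1
                cand = max(dp[k - 1][j], load)
                if cand < dp[k][i]:
                    dp[k][i] = cand
                    cut[k][i] = j
    # Recover the block boundaries from the stored cut points.
    bounds, i = [n], n
    for k in range(num_parts, 0, -1):
        i = cut[k][i]
        bounds.append(i)
    return list(reversed(bounds))  # boundaries [0, b1, ..., n]

# Hypothetical per-slice non-zero counts along the growing mode,
# partitioned for a parallelism degree of 4.
nnz = [120, 30, 400, 80, 60, 310, 90, 150]
print(balanced_partition(nnz, 4))
```

Each resulting block would then correspond to one sub-tensor processed by a worker in the first parallel level, which is why a more uniform non-zero distribution across blocks translates into the improved execution efficiency reported in the abstract.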
Suggested Citation
Yajian Zhou & Zongqian Yue & Zhe Chen, 2025.
"A Two-Level Parallel Incremental Tensor Tucker Decomposition Method with Multi-Mode Growth (TPITTD-MG),"
Mathematics, MDPI, vol. 13(7), pages 1-28, April.
Handle:
RePEc:gam:jmathe:v:13:y:2025:i:7:p:1211-:d:1629719