
A Survey for Sparse Regularization Based Compression Methods

Authors

  • Anda Tang (University of Chinese Academy of Sciences)
  • Pei Quan (University of Chinese Academy of Sciences)
  • Lingfeng Niu (University of Chinese Academy of Sciences)
  • Yong Shi (Chinese Academy of Sciences)

Abstract

In recent years, deep neural networks (DNNs) have attracted extensive attention due to their excellent performance in fields such as computer vision and speech recognition. As the tasks to be solved grow in scale, the networks used become wider and deeper, requiring millions or even billions of parameters. Such deep and wide networks bring problems of memory requirements, computational overhead and overfitting, which seriously hinder the application of DNNs in practice. A natural idea, therefore, is to train sparse networks that maintain comparable performance with fewer parameters and floating-point operations. In the past few years, considerable research has been devoted to neural network compression, including sparsity-inducing methods, quantization, knowledge distillation and so on. Sparsity-inducing methods can be roughly divided into pruning, dropout and sparse regularization based optimization. In this paper, we briefly review and analyze sparse regularization based optimization methods. For the models and optimization methods of sparse regularization based compression, we discuss their respective advantages and disadvantages. Finally, we provide some insights and discussion on how sparse regularization fits within the compression framework.
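The core idea the abstract describes can be illustrated with a minimal sketch: adding an L1 penalty to a training objective and minimizing it with proximal gradient descent (ISTA), whose soft-thresholding step drives many weights exactly to zero. The linear model, data and hyperparameters below are illustrative assumptions standing in for a regularized network layer, not details from the paper.

```python
import numpy as np

def soft_threshold(w, t):
    """Proximal operator of the L1 norm: shrink toward zero,
    setting entries with |w| <= t exactly to zero."""
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def train_sparse_linear(X, y, lam=0.5, lr=0.01, steps=500):
    """Proximal gradient descent (ISTA) for L1-regularized least squares,
    a linear stand-in for one sparsely regularized network layer."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)            # gradient of the smooth loss
        w = soft_threshold(w - lr * grad, lr * lam)  # proximal (sparsifying) step
    return w

# Toy data: only the first two of ten features carry signal.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 10))
true_w = np.array([3.0, -2.0] + [0.0] * 8)
y = X @ true_w + 0.01 * rng.normal(size=1000)

w = train_sparse_linear(X, y)
print("nonzero weights:", np.count_nonzero(w))  # irrelevant weights end up exactly 0
```

The soft-thresholding step is what distinguishes sparse regularization based compression from post-hoc pruning: weights are zeroed out during optimization rather than cut away afterwards.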

Suggested Citation

  • Anda Tang & Pei Quan & Lingfeng Niu & Yong Shi, 2022. "A Survey for Sparse Regularization Based Compression Methods," Annals of Data Science, Springer, vol. 9(4), pages 695-722, August.
  • Handle: RePEc:spr:aodasc:v:9:y:2022:i:4:d:10.1007_s40745-022-00389-6
    DOI: 10.1007/s40745-022-00389-6

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s40745-022-00389-6
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.


As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Feng Liu & Yong Shi, 2020. "Investigating Laws of Intelligence Based on AI IQ Research," Annals of Data Science, Springer, vol. 7(3), pages 399-416, September.
    2. Mazumder, Rahul & Friedman, Jerome H. & Hastie, Trevor, 2011. "SparseNet: Coordinate Descent With Nonconvex Penalties," Journal of the American Statistical Association, American Statistical Association, vol. 106(495), pages 1125-1138.
3. Jianqing Fan & Runze Li, 2001. "Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 96, pages 1348-1360, December.
4. Patrick L. Combettes & Jean-Christophe Pesquet, 2011. "Proximal Splitting Methods in Signal Processing," Springer Optimization and Its Applications, in: Heinz H. Bauschke & Regina S. Burachik & Patrick L. Combettes & Veit Elser & D. Russell Luke & Henry Wolkowicz (ed.), Fixed-Point Algorithms for Inverse Problems in Science and Engineering, chapter 0, pages 185-212, Springer.
    5. Hui Zou & Trevor Hastie, 2005. "Addendum: Regularization and variable selection via the elastic net," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 67(5), pages 768-768, November.
    6. James M. Tien, 2017. "Internet of Things, Real-Time Decision Making, and Artificial Intelligence," Annals of Data Science, Springer, vol. 4(2), pages 149-178, June.
    7. Peizhuang Wang & He Ouyang & Yixin Zhong & Huacan He, 2016. "Cognition Math Based on Factor Space," Annals of Data Science, Springer, vol. 3(3), pages 281-303, September.
    8. Hui Zou & Trevor Hastie, 2005. "Regularization and variable selection via the elastic net," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 67(2), pages 301-320, April.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Minh Pham & Xiaodong Lin & Andrzej Ruszczyński & Yu Du, 2021. "An outer–inner linearization method for non-convex and nondifferentiable composite regularization problems," Journal of Global Optimization, Springer, vol. 81(1), pages 179-202, September.
    2. Margherita Giuzio, 2017. "Genetic algorithm versus classical methods in sparse index tracking," Decisions in Economics and Finance, Springer;Associazione per la Matematica, vol. 40(1), pages 243-256, November.
3. Alan Vázquez-Alcocer & Eric D. Schoen & Peter Goos, 2018. "A mixed integer optimization approach for model selection in screening experiments," Working Papers 2018007, University of Antwerp, Faculty of Business and Economics.
    4. Siwei Xia & Yuehan Yang & Hu Yang, 2022. "Sparse Laplacian Shrinkage with the Graphical Lasso Estimator for Regression Problems," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 31(1), pages 255-277, March.
    5. Hirose, Kei & Tateishi, Shohei & Konishi, Sadanori, 2013. "Tuning parameter selection in sparse regression modeling," Computational Statistics & Data Analysis, Elsevier, vol. 59(C), pages 28-40.
    6. Yen, Yu-Min & Yen, Tso-Jung, 2014. "Solving norm constrained portfolio optimization via coordinate-wise descent algorithms," Computational Statistics & Data Analysis, Elsevier, vol. 76(C), pages 737-759.
    7. Guangrui Tang & Neng Fan, 2022. "A Survey of Solution Path Algorithms for Regression and Classification Models," Annals of Data Science, Springer, vol. 9(4), pages 749-789, August.
    8. Liqun Yu & Nan Lin, 2017. "ADMM for Penalized Quantile Regression in Big Data," International Statistical Review, International Statistical Institute, vol. 85(3), pages 494-518, December.
    9. Friedman, Jerome H., 2012. "Fast sparse regression and classification," International Journal of Forecasting, Elsevier, vol. 28(3), pages 722-738.
    10. Guan Yu & Yufeng Liu, 2016. "Sparse Regression Incorporating Graphical Structure Among Predictors," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 111(514), pages 707-720, April.
    11. Tutz, Gerhard & Pößnecker, Wolfgang & Uhlmann, Lorenz, 2015. "Variable selection in general multinomial logit models," Computational Statistics & Data Analysis, Elsevier, vol. 82(C), pages 207-222.
    12. Shuichi Kawano, 2014. "Selection of tuning parameters in bridge regression models via Bayesian information criterion," Statistical Papers, Springer, vol. 55(4), pages 1207-1223, November.
    13. Yize Zhao & Matthias Chung & Brent A. Johnson & Carlos S. Moreno & Qi Long, 2016. "Hierarchical Feature Selection Incorporating Known and Novel Biological Information: Identifying Genomic Features Related to Prostate Cancer Recurrence," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 111(516), pages 1427-1439, October.
    14. Qianyun Li & Runmin Shi & Faming Liang, 2019. "Drug sensitivity prediction with high-dimensional mixture regression," PLOS ONE, Public Library of Science, vol. 14(2), pages 1-18, February.
    15. Changrong Yan & Dixin Zhang, 2013. "Sparse dimension reduction for survival data," Computational Statistics, Springer, vol. 28(4), pages 1835-1852, August.
    16. Gareth M. James & Peter Radchenko & Jinchi Lv, 2009. "DASSO: connections between the Dantzig selector and lasso," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 71(1), pages 127-142, January.
    17. Soave, David & Lawless, Jerald F., 2023. "Regularized regression for two phase failure time studies," Computational Statistics & Data Analysis, Elsevier, vol. 182(C).
    18. Zemin Zheng & Jie Zhang & Yang Li, 2022. "L 0 -Regularized Learning for High-Dimensional Additive Hazards Regression," INFORMS Journal on Computing, INFORMS, vol. 34(5), pages 2762-2775, September.
    19. Alexander Chudik & George Kapetanios & M. Hashem Pesaran, 2016. "Big Data Analytics: A New Perspective," CESifo Working Paper Series 5824, CESifo.
    20. Umberto Amato & Anestis Antoniadis & Italia De Feis & Irene Gijbels, 2021. "Penalised robust estimators for sparse and high-dimensional linear models," Statistical Methods & Applications, Springer;Società Italiana di Statistica, vol. 30(1), pages 1-48, March.
