Printed from https://ideas.repec.org/a/inm/orijoc/v34y2022i2p1126-1140.html

A First-Order Optimization Algorithm for Statistical Learning with Hierarchical Sparsity Structure

Author

Listed:
  • Dewei Zhang

    (The Ohio State University, Columbus, Ohio 43210)

  • Yin Liu

    (The Ohio State University, Columbus, Ohio 43210)

  • Sam Davanloo Tajbakhsh

    (The Ohio State University, Columbus, Ohio 43210)

Abstract

In many statistical learning problems, it is desired that the optimal solution conform to an a priori known sparsity structure represented by a directed acyclic graph. Inducing such structures by means of convex regularizers requires nonsmooth penalty functions that exploit group overlapping. Our study focuses on evaluating the proximal operator of the latent overlapping group lasso developed by Jacob et al. (2009). We implemented an alternating direction method of multipliers with a sharing scheme to solve large-scale instances of the underlying optimization problem efficiently. In the absence of strong convexity, global linear convergence of the algorithm is established using the error bound theory. More specifically, the paper contributes to establishing primal and dual error bounds when the nonsmooth component in the objective function does not have a polyhedral epigraph. We also investigate the effect of the graph structure on the speed of convergence of the algorithm. Detailed numerical simulation studies over different graph structures, along with two learning applications, support the proposed algorithm. Summary of Contribution: The paper proposes a computationally efficient optimization algorithm to evaluate the proximal operator of a nonsmooth hierarchical sparsity-inducing regularizer and establishes its convergence properties. The computationally intensive subproblem of the proposed algorithm can be fully parallelized, which allows solving large-scale instances of the underlying problem. Comprehensive numerical simulation studies benchmarking the proposed algorithm against five other methods on the speed of convergence to optimality are provided. Furthermore, performance of the algorithm is demonstrated on two statistical learning applications related to topic modeling and breast cancer classification.
The code, along with the simulation studies and benchmarks, is available on the corresponding author’s GitHub website for evaluation and future use.
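To make the abstract's approach concrete, the sketch below evaluates the proximal operator of the latent overlapping group lasso with an ADMM sharing scheme. This is a hedged illustration, not the authors' implementation: it follows the generic sharing-scheme ADMM template for minimizing a sum of group penalties plus a coupling quadratic, and the function names, penalty parameter `rho`, and fixed iteration count are assumptions introduced here. Note that the per-group updates are mutually independent, which is the fully parallelizable subproblem highlighted in the Summary of Contribution.

```python
import numpy as np

def block_soft_threshold(c, t):
    """Prox of t * ||.||_2: shrink the whole vector c toward zero as one block."""
    nrm = np.linalg.norm(c)
    if nrm <= t:
        return np.zeros_like(c)
    return (1.0 - t / nrm) * c

def prox_latent_group_lasso(z, groups, lam, weights=None, rho=1.0, n_iter=2000):
    """Sketch of the prox of the latent overlapping group lasso penalty
        Omega(x) = min { sum_g w_g ||v_g||_2 : x = sum_g v_g, supp(v_g) in g }
    via ADMM with a sharing scheme: one latent vector v_g per group, coupled
    only through their sum.  The v_g updates are independent across groups
    and could run in parallel."""
    n, N = z.size, len(groups)
    if weights is None:
        weights = np.ones(N)
    V = np.zeros((N, n))     # latent vectors v_g, embedded in R^n
    zbar = np.zeros(n)       # averaged copy of the shared variable
    u = np.zeros(n)          # scaled dual variable
    for _ in range(n_iter):
        vbar = V.mean(axis=0)
        # Parallelizable subproblem: block soft-threshold on each group's
        # coordinates; coordinates outside the group are forced to zero.
        for i, g in enumerate(groups):
            c = V[i] - vbar + zbar - u
            V[i] = 0.0
            V[i, g] = block_soft_threshold(c[g], lam * weights[i] / rho)
        vbar = V.mean(axis=0)
        # Closed-form update of the shared variable for the quadratic term
        # 0.5 * || sum_g v_g - z ||^2.
        zbar = (z + rho * (vbar + u)) / (N + rho)
        u = u + vbar - zbar
    return V.sum(axis=0)
```

With a single group covering all coordinates the penalty reduces to a plain Euclidean norm, and with disjoint groups it reduces to the ordinary group lasso, so both cases give analytic prox values to sanity-check against. A practical implementation would replace the fixed iteration count with a stopping rule on the primal and dual residuals.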

Suggested Citation

  • Dewei Zhang & Yin Liu & Sam Davanloo Tajbakhsh, 2022. "A First-Order Optimization Algorithm for Statistical Learning with Hierarchical Sparsity Structure," INFORMS Journal on Computing, INFORMS, vol. 34(2), pages 1126-1140, March.
  • Handle: RePEc:inm:orijoc:v:34:y:2022:i:2:p:1126-1140
    DOI: 10.1287/ijoc.2021.1069

    Download full text from publisher

    File URL: http://dx.doi.org/10.1287/ijoc.2021.1069
    Download Restriction: no

    File URL: https://libkey.io/10.1287/ijoc.2021.1069?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.
    ---><---

    References listed on IDEAS

    1. Yiyuan She & Zhifeng Wang & He Jiang, 2018. "Group Regularized Estimation Under Structural Hierarchy," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 113(521), pages 445-454, January.
    2. P. Tseng, 2001. "Convergence of a Block Coordinate Descent Method for Nondifferentiable Minimization," Journal of Optimization Theory and Applications, Springer, vol. 109(3), pages 475-494, June.
    3. Zhi-Quan Luo & Paul Tseng, 1993. "On the Convergence Rate of Dual Ascent Methods for Linearly Constrained Convex Minimization," Mathematics of Operations Research, INFORMS, vol. 18(4), pages 846-867, November.
    4. P. Tseng & S. Yun, 2009. "Block-Coordinate Gradient Descent Method for Linearly Constrained Nonsmooth Separable Optimization," Journal of Optimization Theory and Applications, Springer, vol. 140(3), pages 513-535, March.
    5. Yurii Nesterov, 2013. "Gradient methods for minimizing composite functions," LIDAM Reprints CORE 2510, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    6. Silvia Villa & Lorenzo Rosasco & Sofia Mosci & Alessandro Verri, 2014. "Proximal methods for the latent group lasso penalty," Computational Optimization and Applications, Springer, vol. 58(2), pages 381-407, June.
    7. Radchenko, Peter & James, Gareth M., 2010. "Variable Selection Using Adaptive Nonlinear Interaction Structures in High Dimensions," Journal of the American Statistical Association, American Statistical Association, vol. 105(492), pages 1541-1553.
    8. Ming Yuan & Yi Lin, 2006. "Model selection and estimation in regression with grouped variables," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 68(1), pages 49-67, February.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Christian Kanzow & Theresa Lechner, 2021. "Globalized inexact proximal Newton-type methods for nonconvex composite functions," Computational Optimization and Applications, Springer, vol. 78(2), pages 377-410, March.
    2. Jacob Bien & Florentina Bunea & Luo Xiao, 2016. "Convex Banding of the Covariance Matrix," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 111(514), pages 834-845, April.
    3. Wang, Cheng & Chen, Haozhe & Jiang, Binyan, 2024. "HiQR: An efficient algorithm for high-dimensional quadratic regression with penalties," Computational Statistics & Data Analysis, Elsevier, vol. 192(C).
    4. Masoud Ahookhosh & Le Thi Khanh Hien & Nicolas Gillis & Panagiotis Patrinos, 2021. "Multi-block Bregman proximal alternating linearized minimization and its application to orthogonal nonnegative matrix factorization," Computational Optimization and Applications, Springer, vol. 79(3), pages 681-715, July.
    5. Mingrui Zhong & Zanhua Yin & Zhichao Wang, 2023. "Variable Selection for Sparse Logistic Regression with Grouped Variables," Mathematics, MDPI, vol. 11(24), pages 1-21, December.
    6. Feng Li & Yajie Li & Sanying Feng, 2021. "Estimation for Varying Coefficient Models with Hierarchical Structure," Mathematics, MDPI, vol. 9(2), pages 1-18, January.
    7. Mingyi Hong & Tsung-Hui Chang & Xiangfeng Wang & Meisam Razaviyayn & Shiqian Ma & Zhi-Quan Luo, 2020. "A Block Successive Upper-Bound Minimization Method of Multipliers for Linearly Constrained Convex Optimization," Mathematics of Operations Research, INFORMS, vol. 45(3), pages 833-861, August.
    8. Jun Yan & Jian Huang, 2012. "Model Selection for Cox Models with Time-Varying Coefficients," Biometrics, The International Biometric Society, vol. 68(2), pages 419-428, June.
    9. Le Thi Khanh Hien & Duy Nhat Phan & Nicolas Gillis, 2022. "Inertial alternating direction method of multipliers for non-convex non-smooth optimization," Computational Optimization and Applications, Springer, vol. 83(1), pages 247-285, September.
    10. Loann David Denis Desboulets, 2018. "A Review on Variable Selection in Regression Analysis," Econometrics, MDPI, vol. 6(4), pages 1-27, November.
    11. Liu, Yulan & Bi, Shujun, 2019. "Error bounds for non-polyhedral convex optimization and applications to linear convergence of FDM and PGM," Applied Mathematics and Computation, Elsevier, vol. 358(C), pages 418-435.
    12. Ion Necoara & Andrei Patrascu, 2014. "A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints," Computational Optimization and Applications, Springer, vol. 57(2), pages 307-337, March.
    13. Nicholson, William B. & Matteson, David S. & Bien, Jacob, 2017. "VARX-L: Structured regularization for large vector autoregressions with exogenous variables," International Journal of Forecasting, Elsevier, vol. 33(3), pages 627-651.
    14. Pei Wang & Shunjie Chen & Sijia Yang, 2022. "Recent Advances on Penalized Regression Models for Biological Data," Mathematics, MDPI, vol. 10(19), pages 1-24, October.
    15. David Degras, 2021. "Sparse group fused lasso for model segmentation: a hybrid approach," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 15(3), pages 625-671, September.
    16. Masoud Ahookhosh & Le Thi Khanh Hien & Nicolas Gillis & Panagiotis Patrinos, 2021. "A Block Inertial Bregman Proximal Algorithm for Nonsmooth Nonconvex Problems with Application to Symmetric Nonnegative Matrix Tri-Factorization," Journal of Optimization Theory and Applications, Springer, vol. 190(1), pages 234-258, July.
    17. Li Yun & O’Connor George T. & Dupuis Josée & Kolaczyk Eric, 2015. "Modeling gene-covariate interactions in sparse regression with group structure for genome-wide association studies," Statistical Applications in Genetics and Molecular Biology, De Gruyter, vol. 14(3), pages 265-277, June.
    18. Ching-pei Lee & Stephen J. Wright, 2019. "Inexact Successive quadratic approximation for regularized optimization," Computational Optimization and Applications, Springer, vol. 72(3), pages 641-674, April.
    19. Jonathan Boss & Alexander Rix & Yin‐Hsiu Chen & Naveen N. Narisetty & Zhenke Wu & Kelly K. Ferguson & Thomas F. McElrath & John D. Meeker & Bhramar Mukherjee, 2021. "A hierarchical integrative group least absolute shrinkage and selection operator for analyzing environmental mixtures," Environmetrics, John Wiley & Sons, Ltd., vol. 32(8), December.
    20. Kimon Fountoulakis & Jacek Gondzio, 2016. "Performance of first- and second-order methods for $$\ell _1$$ ℓ 1 -regularized least squares problems," Computational Optimization and Applications, Springer, vol. 65(3), pages 605-635, December.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:inm:orijoc:v:34:y:2022:i:2:p:1126-1140. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Chris Asher (email available below). General contact details of provider: https://edirc.repec.org/data/inforea.html.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.