Printed from https://ideas.repec.org/a/spr/coopap/v58y2014i2p381-407.html

Proximal methods for the latent group lasso penalty

Author

Listed:
  • Silvia Villa
  • Lorenzo Rosasco
  • Sofia Mosci
  • Alessandro Verri

Abstract

We consider a regularized least squares problem, with regularization by structured sparsity-inducing norms that extend the usual ℓ1 and group lasso penalties by allowing the subsets of variables to overlap. Such regularizations lead to nonsmooth problems that are difficult to optimize, and in this paper we propose a suitable version of an accelerated proximal method to solve them. We prove convergence of a nested procedure, obtained by composing an accelerated proximal method with an inner algorithm for computing the proximity operator. By exploiting the geometrical properties of the penalty, we devise a new active set strategy, thanks to which the inner iteration is relatively fast, thus guaranteeing good computational performance of the overall algorithm. Our approach allows us to deal with high-dimensional problems without pre-processing for dimensionality reduction, leading to better computational and prediction performance than state-of-the-art methods, as shown empirically on both toy and real data. Copyright Springer Science+Business Media New York 2014
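The latent (duplicated-variable) formulation behind the penalty can be sketched in a few lines. The sketch below is illustrative, not the authors' algorithm: after duplicating each variable once per group it belongs to, the overlapping penalty becomes an ordinary non-overlapping group lasso in the latent space, where the proximity operator has a closed form (block soft-thresholding), so a standard FISTA-style accelerated proximal loop suffices. The paper's contribution, the inner prox iteration with an active set strategy, is replaced here by this closed-form prox; all function names are our own.

```python
import numpy as np

def group_soft_threshold(v, tau):
    """Prox of tau * ||.||_2 on one group: block soft-thresholding."""
    norm = np.linalg.norm(v)
    if norm <= tau:
        return np.zeros_like(v)
    return (1.0 - tau / norm) * v

def latent_group_lasso_fista(X, y, groups, lam, n_iter=500):
    """Accelerated proximal (FISTA-style) solver for least squares with a
    latent group lasso penalty, via variable duplication.

    groups: list of index lists; groups may overlap.
    Returns coefficients in the original variable space (each variable is
    the sum of its latent copies across groups).
    """
    # Expanded design: one copy of each column per group it belongs to.
    X_lat = np.hstack([X[:, g] for g in groups])
    sizes = [len(g) for g in groups]
    offsets = np.cumsum([0] + sizes)

    L = np.linalg.norm(X_lat, 2) ** 2   # Lipschitz constant of the gradient
    w = np.zeros(X_lat.shape[1])
    z, t = w.copy(), 1.0
    for _ in range(n_iter):
        grad = X_lat.T @ (X_lat @ z - y)
        v = z - grad / L
        # Prox is separable over the (now non-overlapping) latent groups.
        w_new = np.concatenate([
            group_soft_threshold(v[a:b], lam / L)
            for a, b in zip(offsets[:-1], offsets[1:])
        ])
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        z = w_new + ((t - 1.0) / t_new) * (w_new - w)   # momentum step
        w, t = w_new, t_new

    # Map latent coefficients back to the original variables.
    beta = np.zeros(X.shape[1])
    for (a, b), g in zip(zip(offsets[:-1], offsets[1:]), groups):
        beta[np.asarray(g)] += w[a:b]
    return beta
```

For a large regularization parameter the block soft-thresholding zeroes out every latent group, so the solution is identically zero; for a small one the solution approaches the least squares fit supported on a union of groups, which is the defining property of the latent group lasso.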

Suggested Citation

  • Silvia Villa & Lorenzo Rosasco & Sofia Mosci & Alessandro Verri, 2014. "Proximal methods for the latent group lasso penalty," Computational Optimization and Applications, Springer, vol. 58(2), pages 381-407, June.
  • Handle: RePEc:spr:coopap:v:58:y:2014:i:2:p:381-407
    DOI: 10.1007/s10589-013-9628-6

    Download full text from publisher

    File URL: http://hdl.handle.net/10.1007/s10589-013-9628-6
    Download Restriction: Access to full text is restricted to subscribers.

    File URL: https://libkey.io/10.1007/s10589-013-9628-6?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Lukas Meier & Sara Van De Geer & Peter Bühlmann, 2008. "The group lasso for logistic regression," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 70(1), pages 53-71, February.
    2. NESTEROV, Yu., 2007. "Gradient methods for minimizing composite objective function," LIDAM Discussion Papers CORE 2007076, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    3. NESTEROV, Yu., 2005. "Smooth minimization of non-smooth functions," LIDAM Reprints CORE 1819, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    4. Ming Yuan & Yi Lin, 2006. "Model selection and estimation in regression with grouped variables," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 68(1), pages 49-67, February.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Anastasis Kratsios & Cody B. Hyndman, 2017. "Non-Euclidean Conditional Expectation and Filtering," Papers 1710.05829, arXiv.org, revised Sep 2018.
    2. Dewei Zhang & Yin Liu & Sam Davanloo Tajbakhsh, 2022. "A First-Order Optimization Algorithm for Statistical Learning with Hierarchical Sparsity Structure," INFORMS Journal on Computing, INFORMS, vol. 34(2), pages 1126-1140, March.
    3. Christian Grussler & Pontus Giselsson, 2022. "Efficient Proximal Mapping Computation for Low-Rank Inducing Norms," Journal of Optimization Theory and Applications, Springer, vol. 192(1), pages 168-194, January.
    4. Du, Yu & Lin, Xiaodong & Pham, Minh & Ruszczyński, Andrzej, 2021. "Selective linearization for multi-block statistical learning," European Journal of Operational Research, Elsevier, vol. 293(1), pages 219-228.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Christian Kanzow & Theresa Lechner, 2021. "Globalized inexact proximal Newton-type methods for nonconvex composite functions," Computational Optimization and Applications, Springer, vol. 78(2), pages 377-410, March.
    2. Haibin Zhang & Juan Wei & Meixia Li & Jie Zhou & Miantao Chao, 2014. "On proximal gradient method for the convex problems regularized with the group reproducing kernel norm," Journal of Global Optimization, Springer, vol. 58(1), pages 169-188, January.
    3. Qihang Lin & Xi Chen & Javier Peña, 2014. "A sparsity preserving stochastic gradient methods for sparse regression," Computational Optimization and Applications, Springer, vol. 58(2), pages 455-482, June.
    4. Tutz, Gerhard & Pößnecker, Wolfgang & Uhlmann, Lorenz, 2015. "Variable selection in general multinomial logit models," Computational Statistics & Data Analysis, Elsevier, vol. 82(C), pages 207-222.
    5. Ye, Ya-Fen & Shao, Yuan-Hai & Deng, Nai-Yang & Li, Chun-Na & Hua, Xiang-Yu, 2017. "Robust Lp-norm least squares support vector regression with feature selection," Applied Mathematics and Computation, Elsevier, vol. 305(C), pages 32-52.
    6. Caner, Mehmet, 2023. "Generalized linear models with structured sparsity estimators," Journal of Econometrics, Elsevier, vol. 236(2).
    7. Bilin Zeng & Xuerong Meggie Wen & Lixing Zhu, 2017. "A link-free sparse group variable selection method for single-index model," Journal of Applied Statistics, Taylor & Francis Journals, vol. 44(13), pages 2388-2400, October.
    8. Olga Klopp & Marianna Pensky, 2013. "Sparse High-dimensional Varying Coefficient Model : Non-asymptotic Minimax Study," Working Papers 2013-30, Center for Research in Economics and Statistics.
    9. Jiang, Cuixia & Xiong, Wei & Xu, Qifa & Liu, Yezheng, 2021. "Predicting default of listed companies in mainland China via U-MIDAS Logit model with group lasso penalty," Finance Research Letters, Elsevier, vol. 38(C).
    10. Osamu Komori & Shinto Eguchi & John B. Copas, 2015. "Generalized t-statistic for two-group classification," Biometrics, The International Biometric Society, vol. 71(2), pages 404-416, June.
    11. Wei, Fengrong & Zhu, Hongxiao, 2012. "Group coordinate descent algorithms for nonconvex penalized regression," Computational Statistics & Data Analysis, Elsevier, vol. 56(2), pages 316-326.
    12. Luu, Tung Duy & Fadili, Jalal & Chesneau, Christophe, 2019. "PAC-Bayesian risk bounds for group-analysis sparse regression by exponential weighting," Journal of Multivariate Analysis, Elsevier, vol. 171(C), pages 209-233.
    13. Zeng, Yaohui & Yang, Tianbao & Breheny, Patrick, 2021. "Hybrid safe–strong rules for efficient optimization in lasso-type problems," Computational Statistics & Data Analysis, Elsevier, vol. 153(C).
    14. Lee, In Gyu & Yoon, Sang Won & Won, Daehan, 2022. "A Mixed Integer Linear Programming Support Vector Machine for Cost-Effective Group Feature Selection: Branch-Cut-and-Price Approach," European Journal of Operational Research, Elsevier, vol. 299(3), pages 1055-1068.
    15. Ruidi Chen & Ioannis Ch. Paschalidis, 2022. "Robust Grouped Variable Selection Using Distributionally Robust Optimization," Journal of Optimization Theory and Applications, Springer, vol. 194(3), pages 1042-1071, September.
    16. Zanhua Yin, 2020. "Variable selection for sparse logistic regression," Metrika: International Journal for Theoretical and Applied Statistics, Springer, vol. 83(7), pages 821-836, October.
    17. A. Karagrigoriou & C. Koukouvinos & K. Mylona, 2010. "On the advantages of the non-concave penalized likelihood model selection method with minimum prediction errors in large-scale medical studies," Journal of Applied Statistics, Taylor & Francis Journals, vol. 37(1), pages 13-24.
    18. Liu, Jianyu & Yu, Guan & Liu, Yufeng, 2019. "Graph-based sparse linear discriminant analysis for high-dimensional classification," Journal of Multivariate Analysis, Elsevier, vol. 171(C), pages 250-269.
    19. Shota Yamanaka & Nobuo Yamashita, 2018. "Duality of nonconvex optimization with positively homogeneous functions," Computational Optimization and Applications, Springer, vol. 71(2), pages 435-456, November.
    20. Lingxue Zhang & Seyoung Kim, 2014. "Learning Gene Networks under SNP Perturbations Using eQTL Datasets," PLOS Computational Biology, Public Library of Science, vol. 10(2), pages 1-20, February.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:coopap:v:58:y:2014:i:2:p:381-407. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.