
Sparse group lasso and high dimensional multinomial classification

Author

Listed:
  • Vincent, Martin
  • Hansen, Niels Richard

Abstract

The sparse group lasso optimization problem is solved using a coordinate gradient descent algorithm. The algorithm is applicable to a broad class of convex loss functions. Convergence of the algorithm is established, and the algorithm is used to investigate the performance of the multinomial sparse group lasso classifier. On three different real data examples, the multinomial sparse group lasso clearly outperforms the multinomial lasso in terms of achieved classification error rate and in terms of including fewer features in the classification. An implementation of the multinomial sparse group lasso algorithm is available in the R package msgl. Its performance scales well with the problem size, as illustrated by one of the examples considered: a 50-class classification problem with 10,000 features, which amounts to estimating 500,000 parameters.
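The sparse group lasso penalty combines a lasso (L1) term with a group-wise L2 term, lambda * (alpha * ||beta||_1 + (1 - alpha) * sum_G ||beta_G||_2). The Python/NumPy sketch below shows the group-wise proximal (thresholding) step that block coordinate methods for this penalty typically apply: soft-thresholding within a group followed by group-wise shrinkage. This is an illustrative sketch only, not the authors' coordinate gradient descent algorithm and not the msgl implementation; all function and variable names are hypothetical.

# Illustrative sketch of the sparse group lasso proximal step (hypothetical names,
# not the msgl implementation).
import numpy as np

def soft_threshold(v, t):
    # Elementwise soft-thresholding: proximal operator of t * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_sparse_group_lasso(beta, groups, lam, alpha, step):
    # Proximal operator of lam * (alpha * ||beta||_1 + (1 - alpha) * sum_G ||beta_G||_2),
    # applied with step size `step`, one group of coefficients at a time.
    out = np.empty_like(beta)
    for g in np.unique(groups):
        idx = groups == g
        # Lasso (L1) part within the group ...
        b = soft_threshold(beta[idx], step * lam * alpha)
        # ... then group-wise shrinkage based on the group's L2 norm;
        # a whole group can be set exactly to zero.
        norm = np.linalg.norm(b)
        shrink = max(0.0, 1.0 - step * lam * (1.0 - alpha) / norm) if norm > 0 else 0.0
        out[idx] = shrink * b
    return out

# Tiny usage example: 6 coefficients split into 2 groups.
beta = np.array([0.9, -0.2, 0.05, 1.5, -1.2, 0.3])
groups = np.array([0, 0, 0, 1, 1, 1])
print(prox_sparse_group_lasso(beta, groups, lam=0.5, alpha=0.5, step=1.0))

In this toy call, small coefficients are zeroed by the L1 part and each group is shrunk (or dropped entirely) by the group-wise part, which is the mechanism that lets the classifier select both individual features and whole groups of features.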

Suggested Citation

  • Vincent, Martin & Hansen, Niels Richard, 2014. "Sparse group lasso and high dimensional multinomial classification," Computational Statistics & Data Analysis, Elsevier, vol. 71(C), pages 771-786.
  • Handle: RePEc:eee:csdana:v:71:y:2014:i:c:p:771-786
    DOI: 10.1016/j.csda.2013.06.004

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0167947313002168
    Download Restriction: Full text for ScienceDirect subscribers only.

    File URL: https://libkey.io/10.1016/j.csda.2013.06.004?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a version you can access through your library subscription.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Simila, Timo & Tikka, Jarkko, 2007. "Input selection and shrinkage in multiresponse linear regression," Computational Statistics & Data Analysis, Elsevier, vol. 52(1), pages 406-422, September.
    2. P. Tseng, 2001. "Convergence of a Block Coordinate Descent Method for Nondifferentiable Minimization," Journal of Optimization Theory and Applications, Springer, vol. 109(3), pages 475-494, June.
    3. Kim, Yongdai & Kwon, Sunghoon & Heun Song, Seuck, 2006. "Multiclass sparse logistic regression for classification of multiple cancer types using gene expression data," Computational Statistics & Data Analysis, Elsevier, vol. 51(3), pages 1643-1655, December.
    4. Lukas Meier & Sara Van De Geer & Peter Bühlmann, 2008. "The group lasso for logistic regression," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 70(1), pages 53-71, February.
    5. Friedman, Jerome H. & Hastie, Trevor & Tibshirani, Rob, 2010. "Regularization Paths for Generalized Linear Models via Coordinate Descent," Journal of Statistical Software, Foundation for Open Access Statistics, vol. 33(1), pages 1-22.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Tutz, Gerhard & Pößnecker, Wolfgang & Uhlmann, Lorenz, 2015. "Variable selection in general multinomial logit models," Computational Statistics & Data Analysis, Elsevier, vol. 82(C), pages 207-222.
    2. Qing Wang & Dan Zhao, 2019. "Penalization methods with group-wise sparsity: econometric applications to eBay Motors online auctions," Empirical Economics, Springer, vol. 57(2), pages 683-704, August.
    3. Saptarshi Chakraborty & Colin B. Begg & Ronglai Shen, 2021. "Using the “Hidden” genome to improve classification of cancer types," Biometrics, The International Biometric Society, vol. 77(4), pages 1445-1455, December.
    4. Nibbering, Didier & Hastie, Trevor J., 2022. "Multiclass-penalized logistic regression," Computational Statistics & Data Analysis, Elsevier, vol. 169(C).
    5. Rui Wang & Naihua Xiu & Kim-Chuan Toh, 2021. "Subspace quadratic regularization method for group sparse multinomial logistic regression," Computational Optimization and Applications, Springer, vol. 79(3), pages 531-559, July.
    6. Max H. Farrell, 2013. "Robust Inference on Average Treatment Effects with Possibly More Covariates than Observations," Papers 1309.4686, arXiv.org, revised Feb 2018.
    7. Didier Nibbering, 2023. "A High-dimensional Multinomial Logit Model," Monash Econometrics and Business Statistics Working Papers 19/23, Monash University, Department of Econometrics and Business Statistics.
    8. Canhong Wen & Zhenduo Li & Ruipeng Dong & Yijin Ni & Wenliang Pan, 2023. "Simultaneous Dimension Reduction and Variable Selection for Multinomial Logistic Regression," INFORMS Journal on Computing, INFORMS, vol. 35(5), pages 1044-1060, September.
    9. Stolbov, Mikhail & Shchepeleva, Maria, 2020. "What predicts the legal status of cryptocurrencies?," Economic Analysis and Policy, Elsevier, vol. 67(C), pages 273-291.
    10. Park, Beomjin & Park, Changyi, 2023. "Multiclass Laplacian support vector machine with functional analysis of variance decomposition," Computational Statistics & Data Analysis, Elsevier, vol. 187(C).
    11. Farrell, Max H., 2015. "Robust inference on average treatment effects with possibly more covariates than observations," Journal of Econometrics, Elsevier, vol. 189(1), pages 1-23.
    12. Hai-Bin Zhang & Jiao-Jiao Jiang & Yun-Bin Zhao, 2015. "On the proximal Landweber Newton method for a class of nonsmooth convex problems," Computational Optimization and Applications, Springer, vol. 61(1), pages 79-99, May.
    13. Piotr Swierkowski & Adrian Barnett, 2018. "Identification of hospital cost drivers using sparse group lasso," PLOS ONE, Public Library of Science, vol. 13(10), pages 1-19, October.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Tutz, Gerhard & Pößnecker, Wolfgang & Uhlmann, Lorenz, 2015. "Variable selection in general multinomial logit models," Computational Statistics & Data Analysis, Elsevier, vol. 82(C), pages 207-222.
    2. Li, Peili & Jiao, Yuling & Lu, Xiliang & Kang, Lican, 2022. "A data-driven line search rule for support recovery in high-dimensional data analysis," Computational Statistics & Data Analysis, Elsevier, vol. 174(C).
    3. Wei, Fengrong & Zhu, Hongxiao, 2012. "Group coordinate descent algorithms for nonconvex penalized regression," Computational Statistics & Data Analysis, Elsevier, vol. 56(2), pages 316-326.
    4. Zeng, Yaohui & Yang, Tianbao & Breheny, Patrick, 2021. "Hybrid safe–strong rules for efficient optimization in lasso-type problems," Computational Statistics & Data Analysis, Elsevier, vol. 153(C).
    5. Zanhua Yin, 2020. "Variable selection for sparse logistic regression," Metrika: International Journal for Theoretical and Applied Statistics, Springer, vol. 83(7), pages 821-836, October.
    6. Ciarleglio, Adam & Todd Ogden, R., 2016. "Wavelet-based scalar-on-function finite mixture regression models," Computational Statistics & Data Analysis, Elsevier, vol. 93(C), pages 86-96.
    7. Nicholson, William B. & Matteson, David S. & Bien, Jacob, 2017. "VARX-L: Structured regularization for large vector autoregressions with exogenous variables," International Journal of Forecasting, Elsevier, vol. 33(3), pages 627-651.
    8. Pei Wang & Shunjie Chen & Sijia Yang, 2022. "Recent Advances on Penalized Regression Models for Biological Data," Mathematics, MDPI, vol. 10(19), pages 1-24, October.
    9. Simon Hirsch & Florian Ziel, 2022. "Simulation-based Forecasting for Intraday Power Markets: Modelling Fundamental Drivers for Location, Shape and Scale of the Price Distribution," Papers 2211.13002, arXiv.org.
    10. Faisal Zahid & Gerhard Tutz, 2013. "Multinomial logit models with implicit variable selection," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 7(4), pages 393-416, December.
    11. Gerhard Tutz & Gunther Schauberger, 2015. "A Penalty Approach to Differential Item Functioning in Rasch Models," Psychometrika, Springer;The Psychometric Society, vol. 80(1), pages 21-43, March.
    12. Yang, Yuehan & Xia, Siwei & Yang, Hu, 2023. "Multivariate sparse Laplacian shrinkage for joint estimation of two graphical structures," Computational Statistics & Data Analysis, Elsevier, vol. 178(C).
    13. Griveau-Billion, Théophile & Richard, Jean-Charles & Roncalli, Thierry, 2013. "A Fast Algorithm for Computing High-dimensional Risk Parity Portfolios," MPRA Paper 49822, University Library of Munich, Germany.
    14. P. Tseng & S. Yun, 2009. "Block-Coordinate Gradient Descent Method for Linearly Constrained Nonsmooth Separable Optimization," Journal of Optimization Theory and Applications, Springer, vol. 140(3), pages 513-535, March.
    15. Murat Genç, 2022. "A new double-regularized regression using Liu and lasso regularization," Computational Statistics, Springer, vol. 37(1), pages 159-227, March.
    16. Runmin Shi & Faming Liang & Qifan Song & Ye Luo & Malay Ghosh, 2018. "A Blockwise Consistency Method for Parameter Estimation of Complex Models," Sankhya B: The Indian Journal of Statistics, Springer;Indian Statistical Institute, vol. 80(1), pages 179-223, December.
    17. Hendrik van der Wurp & Andreas Groll, 2023. "Introducing LASSO-type penalisation to generalised joint regression modelling for count data," AStA Advances in Statistical Analysis, Springer;German Statistical Society, vol. 107(1), pages 127-151, March.
    18. Yen, Tso-Jung & Yen, Yu-Min, 2016. "Structured variable selection via prior-induced hierarchical penalty functions," Computational Statistics & Data Analysis, Elsevier, vol. 96(C), pages 87-103.
    19. Michoel, Tom, 2016. "Natural coordinate descent algorithm for L1-penalised regression in generalised linear models," Computational Statistics & Data Analysis, Elsevier, vol. 97(C), pages 60-70.
    20. Fellinghauer, Bernd & Bühlmann, Peter & Ryffel, Martin & von Rhein, Michael & Reinhardt, Jan D., 2013. "Stable graphical model estimation with Random Forests for discrete, continuous, and mixed variables," Computational Statistics & Data Analysis, Elsevier, vol. 64(C), pages 132-152.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:csdana:v:71:y:2014:i:c:p:771-786. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/csda.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.