
Moderately clipped LASSO

Author

Listed:
  • Kwon, Sunghoon
  • Lee, Sangin
  • Kim, Yongdai

Abstract

The least absolute shrinkage and selection operator (LASSO) has been widely used in high-dimensional linear regression models. However, it is known that the LASSO selects too many noisy variables. In this paper, we propose a new estimator, the moderately clipped LASSO (MCL), that deletes noisy variables successively without sacrificing much prediction accuracy. Various numerical studies illustrate the superiority of the MCL over its competitors.
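
Since access to the full text is restricted (see below), the following Python sketch is only a rough illustration of the idea the abstract describes, not the authors' algorithm. It assumes the commonly cited form of the MCL penalty: a minimax concave penalty (MCP) whose derivative is clipped from below at a smaller LASSO level, so that small coefficients receive MCP-style shrinkage (and are deleted) while large coefficients receive only a mild LASSO shrinkage. The function names, the parameters lam1, lam2, a, and the thresholding rule are my reconstruction under that assumption.

```python
import numpy as np

def mcl_threshold(z, lam1, lam2, a=3.0):
    """One-dimensional thresholding rule for an assumed 'moderately clipped'
    penalty: an MCP at level lam1 whose derivative is floored at a LASSO level
    lam2 (lam1 >= lam2 >= 0, a > 1). Small signals are shrunk MCP-style and may
    be set to zero; large signals get only the mild LASSO shrinkage lam2."""
    z = np.asarray(z, dtype=float)
    cut = a * (lam1 - lam2) + lam2            # boundary between the two regimes
    out = np.zeros_like(z)
    mid = (np.abs(z) > lam1) & (np.abs(z) <= cut)
    big = np.abs(z) > cut
    # MCP-type firm threshold on the middle range
    out[mid] = np.sign(z[mid]) * (np.abs(z[mid]) - lam1) / (1.0 - 1.0 / a)
    # plain soft threshold (LASSO with lam2) on large signals
    out[big] = np.sign(z[big]) * (np.abs(z[big]) - lam2)
    return out

def mcl_coordinate_descent(X, y, lam1, lam2, a=3.0, n_iter=200):
    """Naive coordinate descent for (1/2n)||y - X b||^2 plus the penalty above,
    assuming the columns of X are standardized so X[:, j] @ X[:, j] / n == 1."""
    n, p = X.shape
    beta = np.zeros(p)
    r = y - X @ beta                          # current residual
    for _ in range(n_iter):
        for j in range(p):
            zj = beta[j] + X[:, j] @ r / n    # univariate least-squares update
            bj = float(mcl_threshold(zj, lam1, lam2, a))
            r += X[:, j] * (beta[j] - bj)     # keep the residual in sync
            beta[j] = bj
    return beta
```

Under this assumed form, setting lam2 = 0 reduces the rule to an MCP-style firm threshold, while lam2 = lam1 reduces it to the ordinary LASSO soft threshold, which is consistent with the abstract's description of deleting noisy variables while retaining LASSO-like prediction accuracy.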

Suggested Citation

  • Kwon, Sunghoon & Lee, Sangin & Kim, Yongdai, 2015. "Moderately clipped LASSO," Computational Statistics & Data Analysis, Elsevier, vol. 92(C), pages 53-67.
  • Handle: RePEc:eee:csdana:v:92:y:2015:i:c:p:53-67
    DOI: 10.1016/j.csda.2015.07.001

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0167947315001589
    Download Restriction: Full text for ScienceDirect subscribers only.

    File URL: https://libkey.io/10.1016/j.csda.2015.07.001?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Zou, Hui, 2006. "The Adaptive Lasso and Its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 101, pages 1418-1429, December.
    2. Yuhong Yang, 2005. "Can the strengths of AIC and BIC be shared? A conflict between model identification and regression estimation," Biometrika, Biometrika Trust, vol. 92(4), pages 937-950, December.
    3. Leeb, Hannes & Pötscher, Benedikt M., 2008. "Sparse estimators and the oracle property, or the return of Hodges' estimator," Journal of Econometrics, Elsevier, vol. 142(1), pages 201-211, January.
    4. Yongdai Kim & Sunghoon Kwon, 2012. "Global optimality of nonconvex penalized estimators," Biometrika, Biometrika Trust, vol. 99(2), pages 315-325.
    5. Kim, Yongdai & Choi, Hosik & Oh, Hee-Seok, 2008. "Smoothly Clipped Absolute Deviation on High Dimensions," Journal of the American Statistical Association, American Statistical Association, vol. 103(484), pages 1665-1673.
    6. Pötscher, Benedikt M. & Leeb, Hannes, 2009. "On the distribution of penalized maximum likelihood estimators: The LASSO, SCAD, and thresholding," Journal of Multivariate Analysis, Elsevier, vol. 100(9), pages 2065-2082, October.
    7. Fan, Jianqing & Li, Runze, 2001. "Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 96, pages 1348-1360, December.
    8. Wang, Hansheng & Leng, Chenlei, 2007. "Unified LASSO Estimation by Least Squares Approximation," Journal of the American Statistical Association, American Statistical Association, vol. 102, pages 1039-1048, September.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Yang Peng & Bin Luo & Xiaoli Gao, 2022. "Robust Moderately Clipped LASSO for Simultaneous Outlier Detection and Variable Selection," Sankhya B: The Indian Journal of Statistics, Springer;Indian Statistical Institute, vol. 84(2), pages 694-707, November.
    2. Sunghoon Kwon & Jeongyoun Ahn & Woncheol Jang & Sangin Lee & Yongdai Kim, 2017. "A doubly sparse approach for group variable selection," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 69(5), pages 997-1025, October.
    3. Joaquim Fernando Pinto da Costa & Manuel Cabral, 2022. "Statistical Methods with Applications in Data Mining: A Review of the Most Recent Works," Mathematics, MDPI, vol. 10(6), pages 1-22, March.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Pötscher, Benedikt M., 2007. "Confidence Sets Based on Sparse Estimators Are Necessarily Large," MPRA Paper 5677, University Library of Munich, Germany.
    2. Xianyi Wu & Xian Zhou, 2019. "On Hodges’ superefficiency and merits of oracle property in model selection," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 71(5), pages 1093-1119, October.
    3. Lee, Sangin & Kim, Yongdai & Kwon, Sunghoon, 2012. "Quadratic approximation for nonconvex penalized estimations with a diverging number of parameters," Statistics & Probability Letters, Elsevier, vol. 82(9), pages 1710-1717.
    4. Anders Bredahl Kock, 2012. "On the Oracle Property of the Adaptive Lasso in Stationary and Nonstationary Autoregressions," CREATES Research Papers 2012-05, Department of Economics and Business Economics, Aarhus University.
    5. Li, Xinjue & Zboňáková, Lenka & Wang, Weining & Härdle, Wolfgang Karl, 2019. "Combining Penalization and Adaption in High Dimension with Application in Bond Risk Premia Forecasting," IRTG 1792 Discussion Papers 2019-030, Humboldt University of Berlin, International Research Training Group 1792 "High Dimensional Nonstationary Time Series".
    6. Marcelo C. Medeiros & Eduardo F. Mendes, 2015. "l1-Regularization of High-Dimensional Time-Series Models with Flexible Innovations," Textos para discussão 636, Department of Economics PUC-Rio (Brazil).
    7. Ricardo P. Masini & Marcelo C. Medeiros & Eduardo F. Mendes, 2023. "Machine learning advances for time series forecasting," Journal of Economic Surveys, Wiley Blackwell, vol. 37(1), pages 76-111, February.
    8. Schneider, Ulrike & Wagner, Martin, 2012. "Catching Growth Determinants with the Adaptive Lasso," German Economic Review, De Gruyter, vol. 13(1), pages 71-85, February.
    9. Jeon, Jong-June & Kwon, Sunghoon & Choi, Hosik, 2017. "Homogeneity detection for the high-dimensional generalized linear model," Computational Statistics & Data Analysis, Elsevier, vol. 114(C), pages 61-74.
    10. Xiang Zhang & Yichao Wu & Lan Wang & Runze Li, 2016. "Variable selection for support vector machines in moderately high dimensions," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 78(1), pages 53-76, January.
    11. Hansheng Wang & Bo Li & Chenlei Leng, 2009. "Shrinkage tuning parameter selection with a diverging number of parameters," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 71(3), pages 671-683, June.
    12. Kwon, Sunghoon & Choi, Hosik & Kim, Yongdai, 2011. "Quadratic approximation on SCAD penalized estimation," Computational Statistics & Data Analysis, Elsevier, vol. 55(1), pages 421-428, January.
    13. Kwon, Sunghoon & Oh, Seungyoung & Lee, Youngjo, 2016. "The use of random-effect models for high-dimensional variable selection problems," Computational Statistics & Data Analysis, Elsevier, vol. 103(C), pages 401-412.
    14. Lu, Xun & Su, Liangjun, 2016. "Shrinkage estimation of dynamic panel data models with interactive fixed effects," Journal of Econometrics, Elsevier, vol. 190(1), pages 148-175.
    15. Pötscher, Benedikt M. & Schneider, Ulrike, 2008. "Confidence sets based on penalized maximum likelihood estimators," MPRA Paper 9062, University Library of Munich, Germany.
    16. Lee, Eun Ryung & Park, Byeong U., 2012. "Sparse estimation in functional linear regression," Journal of Multivariate Analysis, Elsevier, vol. 105(1), pages 1-17.
    17. Yanxin Wang & Qibin Fan & Li Zhu, 2018. "Variable selection and estimation using a continuous approximation to the $$L_0$$ penalty," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 70(1), pages 191-214, February.
    18. Pötscher, Benedikt M. & Schneider, Ulrike, 2007. "On the distribution of the adaptive LASSO estimator," MPRA Paper 6913, University Library of Munich, Germany.
    19. Bruce E. Hansen, 2016. "The Risk of James--Stein and Lasso Shrinkage," Econometric Reviews, Taylor & Francis Journals, vol. 35(8-10), pages 1456-1470, December.
    20. Jie Ding & Vahid Tarokh & Yuhong Yang, 2018. "Model Selection Techniques -- An Overview," Papers 1810.09583, arXiv.org.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:csdana:v:92:y:2015:i:c:p:53-67. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/csda.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.