Printed from https://ideas.repec.org/p/crs/wpaper/2014-05.html

On the Prediction Performance of the Lasso

Authors
  • Arnak S. Dalalyan

    (CREST-ENSAE)

  • Mohamed Hebiri

    (Université Paris Est)

  • Johannes Lederer

    (Cornell University)

Abstract

Although the Lasso has been extensively studied, the relationship between its prediction performance and the correlations of the covariates is not fully understood. In this paper, we give new insights into this relationship in the context of multiple linear regression. We show, in particular, that incorporating a simple correlation measure into the tuning parameter leads to nearly optimal prediction performance of the Lasso even for highly correlated covariates. However, we also reveal that for moderately correlated covariates, the prediction performance of the Lasso can be mediocre irrespective of the choice of the tuning parameter. To illustrate our approach with an important application, we deduce nearly optimal rates for the least-squares estimator with total variation penalty.
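The abstract's central idea, adjusting the Lasso tuning parameter according to how correlated the covariates are, can be sketched numerically. The following is a minimal illustration, not the paper's exact rule: it fits the Lasso by cyclic coordinate descent on equicorrelated Gaussian covariates and compares the classical universal tuning lam ≈ sigma * sqrt(2 log(p) / n) with a hypothetical correlation-adjusted variant that shrinks lam by the mean absolute off-diagonal entry of the Gram matrix.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=100):
    """Lasso via cyclic coordinate descent on the objective
    (1/(2n)) * ||y - X b||^2 + lam * ||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n          # (1/n) * X_j' X_j
    for _ in range(n_iter):
        for j in range(p):
            r_j = y - X @ b + X[:, j] * b[j]   # partial residual excluding j
            rho = X[:, j] @ r_j / n
            # soft-thresholding update for coordinate j
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return b

rng = np.random.default_rng(0)
n, p, s, sigma = 200, 50, 5, 0.5
corr_level = 0.9                               # high pairwise correlation
Sigma = corr_level * np.ones((p, p)) + (1 - corr_level) * np.eye(p)
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)
beta = np.zeros(p)
beta[:s] = 1.0                                 # s-sparse target
y = X @ beta + rng.normal(scale=sigma, size=n)

# Classical universal tuning parameter.
lam0 = sigma * np.sqrt(2 * np.log(p) / n)
# Hypothetical correlation-adjusted tuning: scale lam0 down by a simple
# correlation measure (mean absolute off-diagonal of the Gram matrix);
# this stands in for the paper's correlation measure, which it does not
# reproduce.
G = X.T @ X / n
corr = np.abs(G - np.diag(np.diag(G))).mean()
lam1 = lam0 / (1 + corr)

for lam in (lam0, lam1):
    b = lasso_cd(X, y, lam)
    err = np.mean((X @ (b - beta)) ** 2)       # in-sample prediction error
    print(f"lam={lam:.4f}  prediction MSE={err:.4f}")
```

With highly correlated designs, the adjusted (smaller) tuning parameter typically yields a lower prediction error, which is the qualitative phenomenon the paper quantifies.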

Suggested Citation

  • Arnak S. Dalalyan & Mohamed Hebiri & Johannes Lederer, 2014. "On the Prediction Performance of the Lasso," Working Papers 2014-05, Center for Research in Economics and Statistics.
  • Handle: RePEc:crs:wpaper:2014-05

    Download full text from publisher

    File URL: http://crest.science/RePEc/wpstorage/2014-05.pdf
    File Function: Crest working paper version
    Download Restriction: no


    Citations

Citations are extracted by the CitEc Project.


    Cited by:

    1. Pawan Gupta & Marianna Pensky, 2018. "Solution of Linear Ill-Posed Problems Using Random Dictionaries," Sankhya B: The Indian Journal of Statistics, Springer;Indian Statistical Institute, vol. 80(1), pages 178-193, May.
    2. Sheng Xu & Zhou Fan, 2021. "Iterative Alpha Expansion for estimating gradient‐sparse signals from linear measurements," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 83(2), pages 271-292, April.
    3. Alexandre Belloni & Mingli Chen & Oscar Hernan Madrid Padilla & Zixuan Wang, 2019. "High Dimensional Latent Panel Quantile Regression with an Application to Asset Pricing," Papers 1912.02151, arXiv.org, revised Aug 2022.
    4. Pierre Bellec & Alexandre Tsybakov, 2015. "Sharp oracle bounds for monotone and convex regression through aggregation," Working Papers 2015-04, Center for Research in Economics and Statistics.
    5. Wanling Xie & Hu Yang, 2023. "Group sparse recovery via group square-root elastic net and the iterative multivariate thresholding-based algorithm," AStA Advances in Statistical Analysis, Springer;German Statistical Society, vol. 107(3), pages 469-507, September.
    6. Tanin Sirimongkolkasem & Reza Drikvandi, 2019. "On Regularisation Methods for Analysis of High Dimensional Data," Annals of Data Science, Springer, vol. 6(4), pages 737-763, December.
    7. Jacob Bien & Irina Gaynanova & Johannes Lederer & Christian L. Müller, 2019. "Prediction error bounds for linear regression with the TREX," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 28(2), pages 451-474, June.
    8. Gold, David & Lederer, Johannes & Tao, Jing, 2020. "Inference for high-dimensional instrumental variables regression," Journal of Econometrics, Elsevier, vol. 217(1), pages 79-111.
    9. Laura Freijeiro‐González & Manuel Febrero‐Bande & Wenceslao González‐Manteiga, 2022. "A Critical Review of LASSO and Its Derivatives for Variable Selection Under Dependence Among Covariates," International Statistical Review, International Statistical Institute, vol. 90(1), pages 118-145, April.
    10. Tung Duy Luu & Jalal Fadili & Christophe Chesneau, 2020. "Sharp oracle inequalities for low-complexity priors," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 72(2), pages 353-397, April.

    More about this item

    Keywords

    multiple linear regression; sparse recovery; total variation penalty; oracle inequalities;

    NEP fields

    This paper has been announced in the following NEP Reports:

    Statistics

