IDEAS home Printed from https://ideas.repec.org/a/eee/csdana/v56y2012i6p1952-1965.html

Weighted LAD-LASSO method for robust parameter estimation and variable selection in regression

Author

Listed:
  • Arslan, Olcay

Abstract

The weighted least absolute deviation (WLAD) regression estimation method and the adaptive least absolute shrinkage and selection operator (LASSO) are combined to achieve robust parameter estimation and variable selection in regression simultaneously. Compared with the LAD-LASSO method, the weighted LAD-LASSO (WLAD-LASSO) method is resistant to heavy-tailed errors and to outliers in the explanatory variables. Properties of the WLAD-LASSO estimators are investigated. A small simulation study and an example demonstrate the superiority of the WLAD-LASSO method over the LAD-LASSO method in the presence of outliers in the explanatory variables and heavy-tailed error distributions.
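The objective described in the abstract, a weighted LAD loss plus an adaptive L1 penalty, can be cast as a linear program by splitting each absolute value into nonnegative parts. The sketch below is a minimal illustration of that formulation, not the paper's implementation: the observation weights here use a simple leverage-based down-weighting chosen for the example, whereas the paper derives weights from robust distances, and the penalty level `lam` is an arbitrary illustrative value.

```python
import numpy as np
from scipy.optimize import linprog

def wlad_lasso(X, y, weights, lam):
    """Minimize  sum_i w_i |y_i - x_i'b| + sum_j lam_j |b_j|
    via an LP: write b = b+ - b- and the residual as u+ - u-,
    all parts nonnegative, so both absolute values become linear."""
    n, p = X.shape
    lam = np.broadcast_to(np.asarray(lam, dtype=float), p)
    # Decision vector: [b+, b-, u+, u-], all >= 0.
    c = np.concatenate([lam, lam, weights, weights])
    # Constraint: X(b+ - b-) + (u+ - u-) = y, so u+ - u- is the residual.
    A_eq = np.hstack([X, -X, np.eye(n), -np.eye(n)])
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")
    return res.x[:p] - res.x[p:2 * p]

rng = np.random.default_rng(0)
n, p = 200, 6
X = rng.normal(size=(n, p))
beta_true = np.array([2.0, 0.0, -1.5, 0.0, 0.0, 1.0])
y = X @ beta_true + rng.standard_t(df=2, size=n)  # heavy-tailed errors

# Illustrative down-weighting of high-leverage rows (a simplified
# choice for this sketch, not the paper's robust-distance weights).
d = np.linalg.norm(X - np.median(X, axis=0), axis=1)
w = np.minimum(1.0, np.sqrt(p) / d)

beta_hat = wlad_lasso(X, y, w, lam=0.5)
```

Because the problem is a plain LP, an off-the-shelf solver suffices; the adaptive-LASSO idea enters through coefficient-specific values of `lam`, which can be set inversely proportional to an initial robust estimate.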

Suggested Citation

  • Arslan, Olcay, 2012. "Weighted LAD-LASSO method for robust parameter estimation and variable selection in regression," Computational Statistics & Data Analysis, Elsevier, vol. 56(6), pages 1952-1965.
  • Handle: RePEc:eee:csdana:v:56:y:2012:i:6:p:1952-1965
    DOI: 10.1016/j.csda.2011.11.022

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0167947311004208
    Download Restriction: Full text for ScienceDirect subscribers only.

    File URL: https://libkey.io/10.1016/j.csda.2011.11.022?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item
    ---><---

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Zou, Hui, 2006. "The Adaptive Lasso and Its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 101, pages 1418-1429, December.
    2. Hurvich, Clifford M. & Tsai, Chih-Ling, 1990. "Model selection for least absolute deviations regression in small samples," Statistics & Probability Letters, Elsevier, vol. 9(3), pages 259-265, March.
    3. Peide Shi & Chih‐Ling Tsai, 2004. "A Joint Regression Variable and Autoregressive Order Selection Criterion," Journal of Time Series Analysis, Wiley Blackwell, vol. 25(6), pages 923-941, November.
    4. Jinfeng Xu & Zhiliang Ying, 2010. "Simultaneous estimation and variable selection in median regression using Lasso-type penalty," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 62(3), pages 487-514, June.
    5. Fan J. & Li R., 2001. "Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 96, pages 1348-1360, December.
    6. Peide Shi & Chih‐Ling Tsai, 2002. "Regression model selection—a residual likelihood approach," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 64(2), pages 237-252, May.
    7. Wang, Hansheng & Li, Guodong & Jiang, Guohua, 2007. "Robust Regression Shrinkage and Consistent Variable Selection Through the LAD-Lasso," Journal of Business & Economic Statistics, American Statistical Association, vol. 25, pages 347-355, July.
    8. Giloni, Avi & Simonoff, Jeffrey S. & Sengupta, Bhaskar, 2006. "Robust weighted LAD regression," Computational Statistics & Data Analysis, Elsevier, vol. 50(11), pages 3124-3140, July.
    9. Wang, Hansheng & Leng, Chenlei, 2007. "Unified LASSO Estimation by Least Squares Approximation," Journal of the American Statistical Association, American Statistical Association, vol. 102, pages 1039-1048, September.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Umberto Amato & Anestis Antoniadis & Italia De Feis & Irene Gijbels, 2021. "Penalised robust estimators for sparse and high-dimensional linear models," Statistical Methods & Applications, Springer;Società Italiana di Statistica, vol. 30(1), pages 1-48, March.
    2. Mingqiu Wang & Guo-Liang Tian, 2016. "Robust group non-convex estimations for high-dimensional partially linear models," Journal of Nonparametric Statistics, Taylor & Francis Journals, vol. 28(1), pages 49-67, March.
    3. Vinciotti, Veronica & Hashem, Hussein, 2013. "Robust methods for inferring sparse network structures," Computational Statistics & Data Analysis, Elsevier, vol. 67(C), pages 84-94.
    4. Casado Yusta, Silvia & Núñez Letamendía, Laura & Pacheco Bonrostro, Joaquín Antonio, 2018. "Predicting Corporate Failure: The GRASP-LOGIT Model || Predicción de la quiebra empresarial: el modelo GRASP-LOGIT," Revista de Métodos Cuantitativos para la Economía y la Empresa = Journal of Quantitative Methods for Economics and Business Administration, Universidad Pablo de Olavide, Department of Quantitative Methods for Economics and Business Administration, vol. 26(1), pages 294-314, Diciembre.
    5. Yeşim Güney & Yetkin Tuaç & Şenay Özdemir & Olcay Arslan, 2021. "Robust estimation and variable selection in heteroscedastic regression model using least favorable distribution," Computational Statistics, Springer, vol. 36(2), pages 805-827, June.
    6. Gijbels, I. & Vrinssen, I., 2015. "Robust nonnegative garrote variable selection in linear regression," Computational Statistics & Data Analysis, Elsevier, vol. 85(C), pages 1-22.
    7. Qiang Li & Liming Wang, 2020. "Robust change point detection method via adaptive LAD-LASSO," Statistical Papers, Springer, vol. 61(1), pages 109-121, February.
    8. Yafen Ye & Renyong Chi & Yuan-Hai Shao & Chun-Na Li & Xiangyu Hua, 2022. "Indicator Selection of Index Construction by Adaptive Lasso with a Generic ε-Insensitive Loss," Computational Economics, Springer;Society for Computational Economics, vol. 60(3), pages 971-990, October.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Hao, Meiling & Lin, Yunyuan & Zhao, Xingqiu, 2016. "A relative error-based approach for variable selection," Computational Statistics & Data Analysis, Elsevier, vol. 103(C), pages 250-262.
    2. Umberto Amato & Anestis Antoniadis & Italia De Feis & Irene Gijbels, 2021. "Penalised robust estimators for sparse and high-dimensional linear models," Statistical Methods & Applications, Springer;Società Italiana di Statistica, vol. 30(1), pages 1-48, March.
    3. Z. John Daye & Jinbo Chen & Hongzhe Li, 2012. "High-Dimensional Heteroscedastic Regression with an Application to eQTL Data Analysis," Biometrics, The International Biometric Society, vol. 68(1), pages 316-326, March.
    4. Hansheng Wang & Bo Li & Chenlei Leng, 2009. "Shrinkage tuning parameter selection with a diverging number of parameters," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 71(3), pages 671-683, June.
    5. Bang, Sungwan & Jhun, Myoungshic, 2012. "Simultaneous estimation and factor selection in quantile regression via adaptive sup-norm regularization," Computational Statistics & Data Analysis, Elsevier, vol. 56(4), pages 813-826.
    6. Xia, Xiaochao & Liu, Zhi & Yang, Hu, 2016. "Regularized estimation for the least absolute relative error models with a diverging number of covariates," Computational Statistics & Data Analysis, Elsevier, vol. 96(C), pages 104-119.
    7. Qiang Li & Liming Wang, 2020. "Robust change point detection method via adaptive LAD-LASSO," Statistical Papers, Springer, vol. 61(1), pages 109-121, February.
    8. Diego Vidaurre & Concha Bielza & Pedro Larrañaga, 2013. "A Survey of L1 Regression," International Statistical Review, International Statistical Institute, vol. 81(3), pages 361-387, December.
    9. Fan, Rui & Lee, Ji Hyung & Shin, Youngki, 2023. "Predictive quantile regression with mixed roots and increasing dimensions: The ALQR approach," Journal of Econometrics, Elsevier, vol. 237(2).
    10. Hui Xiao & Yiguo Sun, 2020. "Forecasting the Returns of Cryptocurrency: A Model Averaging Approach," JRFM, MDPI, vol. 13(11), pages 1-15, November.
    11. Lenka Zbonakova & Wolfgang Karl Härdle & Weining Wang, 2016. "Time Varying Quantile Lasso," SFB 649 Discussion Papers SFB649DP2016-047, Sonderforschungsbereich 649, Humboldt University, Berlin, Germany.
    12. Weichi Wu & Zhou Zhou, 2017. "Nonparametric Inference for Time-Varying Coefficient Quantile Regression," Journal of Business & Economic Statistics, Taylor & Francis Journals, vol. 35(1), pages 98-109, January.
    13. Caner, Mehmet & Fan, Qingliang, 2015. "Hybrid generalized empirical likelihood estimators: Instrument selection with adaptive lasso," Journal of Econometrics, Elsevier, vol. 187(1), pages 256-274.
    14. Weihua Zhao & Riquan Zhang & Yazhao Lv & Jicai Liu, 2017. "Quantile regression and variable selection of single-index coefficient model," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 69(4), pages 761-789, August.
    15. Li, Xinjue & Zboňáková, Lenka & Wang, Weining & Härdle, Wolfgang Karl, 2019. "Combining Penalization and Adaption in High Dimension with Application in Bond Risk Premia Forecasting," IRTG 1792 Discussion Papers 2019-030, Humboldt University of Berlin, International Research Training Group 1792 "High Dimensional Nonstationary Time Series".
    16. Fei Jin & Lung-fei Lee, 2018. "Lasso Maximum Likelihood Estimation of Parametric Models with Singular Information Matrices," Econometrics, MDPI, vol. 6(1), pages 1-24, February.
    17. Yongjin Li & Qingzhao Zhang & Qihua Wang, 2017. "Penalized estimation equation for an extended single-index model," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 69(1), pages 169-187, February.
    18. Jiang, Rong & Qian, Weimin & Zhou, Zhangong, 2012. "Variable selection and coefficient estimation via composite quantile regression with randomly censored data," Statistics & Probability Letters, Elsevier, vol. 82(2), pages 308-317.
    19. Guang Cheng & Hao Zhang & Zuofeng Shang, 2015. "Sparse and efficient estimation for partial spline models with increasing dimension," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 67(1), pages 93-127, February.
    20. Sophie Lambert-Lacroix & Laurent Zwald, 2016. "The adaptive BerHu penalty in robust regression," Journal of Nonparametric Statistics, Taylor & Francis Journals, vol. 28(3), pages 487-514, September.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:csdana:v:56:y:2012:i:6:p:1952-1965. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/csda.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.