
Model selection via standard error adjusted adaptive lasso

Author

Listed:
  • Wei Qian
  • Yuhong Yang

Abstract

The adaptive lasso is a model selection method shown to be both consistent in variable selection and asymptotically normal in coefficient estimation. The actual variable selection performance of the adaptive lasso depends on the weight used. It turns out that the weight assignment using the OLS estimate (OLS-adaptive lasso) can result in very poor performance when collinearity of the model matrix is a concern. To achieve better variable selection results, we take into account the standard errors of the OLS estimate for weight calculation, and propose two different versions of the adaptive lasso denoted by SEA-lasso and NSEA-lasso. We show through numerical studies that when the predictors are highly correlated, SEA-lasso and NSEA-lasso can outperform OLS-adaptive lasso under a variety of linear regression settings while maintaining the same theoretical properties of the adaptive lasso. Copyright The Institute of Statistical Mathematics, Tokyo 2013
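
As a rough illustration of the weighting idea (not the authors' code), the sketch below assumes the SEA-lasso weight for predictor j is the OLS-adaptive-lasso weight 1/|beta_ols_j| multiplied by that coefficient's OLS standard error, so that weakly estimated coefficients are penalized more heavily. The function name sea_lasso_sketch, the exponent gamma, and the use of scikit-learn's LassoCV for tuning are illustrative choices rather than details taken from the paper.

    import numpy as np
    from sklearn.linear_model import LinearRegression, LassoCV

    def sea_lasso_sketch(X, y, gamma=1.0):
        """Weighted lasso with standard-error-adjusted adaptive weights (illustrative)."""
        n, p = X.shape
        # Pilot OLS fit: coefficient estimates and their standard errors
        ols = LinearRegression().fit(X, y)
        beta_ols = ols.coef_
        resid = y - ols.predict(X)
        sigma2 = resid @ resid / (n - p - 1)        # residual variance (intercept included)
        Xc = X - X.mean(axis=0)                     # centered design for slope covariances
        se = np.sqrt(sigma2 * np.diag(np.linalg.inv(Xc.T @ Xc)))

        # OLS-adaptive-lasso weight is 1/|beta_ols|**gamma; the standard-error
        # adjustment assumed here multiplies it by se, penalizing noisy estimates more.
        w = (se / np.abs(beta_ols)) ** gamma

        # A weighted lasso is equivalent to an ordinary lasso on the rescaled design X/w,
        # with the fitted coefficients rescaled back by 1/w.
        lasso = LassoCV(cv=5).fit(X / w, y)
        return lasso.coef_ / w

The column-rescaling step is the standard way to obtain adaptive-lasso solutions from any plain lasso solver; the pilot OLS fit and its standard errors require n > p and a nonsingular design.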

Suggested Citation

  • Wei Qian & Yuhong Yang, 2013. "Model selection via standard error adjusted adaptive lasso," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 65(2), pages 295-318, April.
  • Handle: RePEc:spr:aistmt:v:65:y:2013:i:2:p:295-318
    DOI: 10.1007/s10463-012-0370-0

    Download full text from publisher

    File URL: http://hdl.handle.net/10.1007/s10463-012-0370-0
    Download Restriction: Access to full text is restricted to subscribers.

    File URL: https://libkey.io/10.1007/s10463-012-0370-0?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Fan, Jianqing & Li, Runze, 2001. "Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 96, pages 1348-1360, December.
    2. Hansheng Wang & Runze Li & Chih-Ling Tsai, 2007. "Tuning parameter selectors for the smoothly clipped absolute deviation method," Biometrika, Biometrika Trust, vol. 94(3), pages 553-568.
    3. Zou, Hui, 2006. "The Adaptive Lasso and Its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 101, pages 1418-1429, December.
    4. Friedman, Jerome H. & Hastie, Trevor & Tibshirani, Rob, 2010. "Regularization Paths for Generalized Linear Models via Coordinate Descent," Journal of Statistical Software, Foundation for Open Access Statistics, vol. 33(i01).
    5. Harrison, David Jr. & Rubinfeld, Daniel L., 1978. "Hedonic housing prices and the demand for clean air," Journal of Environmental Economics and Management, Elsevier, vol. 5(1), pages 81-102, March.
    6. Wang, Hansheng & Leng, Chenlei, 2007. "Unified LASSO Estimation by Least Squares Approximation," Journal of the American Statistical Association, American Statistical Association, vol. 102, pages 1039-1048, September.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Krüger, Jens J. & Rhiel, Mathias, 2016. "Determinants of ICT infrastructure: A cross-country statistical analysis," Darmstadt Discussion Papers in Economics 228, Darmstadt University of Technology, Department of Law and Economics.
    2. Zakariya Algamal & Muhammad Lee, 2015. "Adjusted Adaptive LASSO in High-dimensional Poisson Regression Model," Modern Applied Science, Canadian Center of Science and Education, vol. 9(4), pages 170-170, April.
    3. Zakariya Yahya Algamal & Muhammad Hisyam Lee, 2019. "A two-stage sparse logistic regression for optimal gene selection in high-dimensional microarray data classification," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 13(3), pages 753-771, September.
    4. De La Maza, Cristóbal & Davis, Alex & Azevedo, Inês, 2021. "Welfare analysis of the ecological impacts of electricity production in Chile using the sparse multinomial logit model," Ecological Economics, Elsevier, vol. 184(C).
    5. Auer, Benjamin R. & Schuhmacher, Frank & Niemann, Sebastian, 2023. "Cloning mutual fund returns," The Quarterly Review of Economics and Finance, Elsevier, vol. 90(C), pages 31-37.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Wei Wang & Shou‐En Lu & Jerry Q. Cheng & Minge Xie & John B. Kostis, 2022. "Multivariate survival analysis in big data: A divide‐and‐combine approach," Biometrics, The International Biometric Society, vol. 78(3), pages 852-866, September.
    2. Yanxin Wang & Qibin Fan & Li Zhu, 2018. "Variable selection and estimation using a continuous approximation to the $$L_0$$ penalty," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 70(1), pages 191-214, February.
    3. Umberto Amato & Anestis Antoniadis & Italia De Feis & Irene Gijbels, 2021. "Penalised robust estimators for sparse and high-dimensional linear models," Statistical Methods & Applications, Springer;Società Italiana di Statistica, vol. 30(1), pages 1-48, March.
    4. Tizheng Li & Xiaojuan Kang, 2022. "Variable selection of higher-order partially linear spatial autoregressive model with a diverging number of parameters," Statistical Papers, Springer, vol. 63(1), pages 243-285, February.
    5. Li, Xinjue & Zboňáková, Lenka & Wang, Weining & Härdle, Wolfgang Karl, 2019. "Combining Penalization and Adaption in High Dimension with Application in Bond Risk Premia Forecasting," IRTG 1792 Discussion Papers 2019-030, Humboldt University of Berlin, International Research Training Group 1792 "High Dimensional Nonstationary Time Series".
    6. Fei Jin & Lung-fei Lee, 2018. "Lasso Maximum Likelihood Estimation of Parametric Models with Singular Information Matrices," Econometrics, MDPI, vol. 6(1), pages 1-24, February.
    7. Zhixuan Fu & Shuangge Ma & Haiqun Lin & Chirag R. Parikh & Bingqing Zhou, 2017. "Penalized Variable Selection for Multi-center Competing Risks Data," Statistics in Biosciences, Springer;International Chinese Statistical Association, vol. 9(2), pages 379-405, December.
    8. Ramon I. Garcia & Joseph G. Ibrahim & Hongtu Zhu, 2010. "Variable Selection in the Cox Regression Model with Covariates Missing at Random," Biometrics, The International Biometric Society, vol. 66(1), pages 97-104, March.
    9. Xingwei Tong & Xin He & Liuquan Sun & Jianguo Sun, 2009. "Variable Selection for Panel Count Data via Non‐Concave Penalized Estimating Function," Scandinavian Journal of Statistics, Danish Society for Theoretical Statistics;Finnish Statistical Society;Norwegian Statistical Association;Swedish Statistical Association, vol. 36(4), pages 620-635, December.
    10. Jin, Fei & Lee, Lung-fei, 2018. "Irregular N2SLS and LASSO estimation of the matrix exponential spatial specification model," Journal of Econometrics, Elsevier, vol. 206(2), pages 336-358.
    11. Ping Zeng & Yongyue Wei & Yang Zhao & Jin Liu & Liya Liu & Ruyang Zhang & Jianwei Gou & Shuiping Huang & Feng Chen, 2014. "Variable selection approach for zero-inflated count data via adaptive lasso," Journal of Applied Statistics, Taylor & Francis Journals, vol. 41(4), pages 879-894, April.
    12. Xia, Xiaochao & Liu, Zhi & Yang, Hu, 2016. "Regularized estimation for the least absolute relative error models with a diverging number of covariates," Computational Statistics & Data Analysis, Elsevier, vol. 96(C), pages 104-119.
    13. Xuan Liu & Jianbao Chen, 2021. "Variable Selection for the Spatial Autoregressive Model with Autoregressive Disturbances," Mathematics, MDPI, vol. 9(12), pages 1-20, June.
    14. Kwon, Sunghoon & Choi, Hosik & Kim, Yongdai, 2011. "Quadratic approximation on SCAD penalized estimation," Computational Statistics & Data Analysis, Elsevier, vol. 55(1), pages 421-428, January.
    15. Zhang Haixiang & Zheng Yinan & Zhang Zhou & Gao Tao & Joyce Brian & Zhang Wei & Hou Lifang & Liu Lei & Yoon Grace & Schwartz Joel & Vokonas Pantel & Colicino Elena & Baccarelli Andrea, 2017. "Regularized estimation in sparse high-dimensional multivariate regression, with application to a DNA methylation study," Statistical Applications in Genetics and Molecular Biology, De Gruyter, vol. 16(3), pages 159-171, August.
    16. Hirose, Kei & Tateishi, Shohei & Konishi, Sadanori, 2013. "Tuning parameter selection in sparse regression modeling," Computational Statistics & Data Analysis, Elsevier, vol. 59(C), pages 28-40.
    17. Li-Ping Zhu & Lin-Yi Qian & Jin-Guan Lin, 2011. "Variable selection in a class of single-index models," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 63(6), pages 1277-1293, December.
    18. Matsui, Hidetoshi, 2014. "Variable and boundary selection for functional data via multiclass logistic regression modeling," Computational Statistics & Data Analysis, Elsevier, vol. 78(C), pages 176-185.
    19. Lee, Eun Ryung & Park, Byeong U., 2012. "Sparse estimation in functional linear regression," Journal of Multivariate Analysis, Elsevier, vol. 105(1), pages 1-17.
    20. Hao, Meiling & Lin, Yunyuan & Zhao, Xingqiu, 2016. "A relative error-based approach for variable selection," Computational Statistics & Data Analysis, Elsevier, vol. 103(C), pages 250-262.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:aistmt:v:65:y:2013:i:2:p:295-318. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.