
Penalized and constrained LAD estimation in fixed and high dimension

Author

Listed:
  • Xiaofei Wu (Chongqing University)
  • Rongmei Liang (Chongqing University)
  • Hu Yang (Chongqing University)

Abstract

Recently, many studies have shown that prior information and structure in a variety of application fields can be formulated as constraints on regression coefficients. Following this line of work, we propose an $$L_1$$-penalized LAD estimator subject to linear constraints. Unlike the constrained lasso, our estimator performs well when the response contains heavy-tailed errors or outliers. In theory, we show that the proposed estimator enjoys the oracle property with an adjusted normal variance when the dimension p of the estimated coefficients is fixed. When p is much greater than the sample size n, the error bound of the proposed estimator is sharper than $$\sqrt{k\log (p)/n}$$. Notably, this result holds for a wide range of noise distributions, even the Cauchy distribution. Algorithmically, we not only formulate the proposed estimator as a standard linear program in the fixed-dimensional case, but also present a nested alternating direction method of multipliers (ADMM) algorithm for the high-dimensional case. Simulations and an application to real data confirm that the proposed estimator is an effective alternative when the constrained lasso is unreliable.
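For readers who want to see the fixed-dimension formulation concretely, the following is a minimal sketch, assuming standard numpy/scipy tooling, of how the $$L_1$$-penalized LAD objective with linear inequality constraints can be cast as a linear program and solved with scipy.optimize.linprog. It is not the authors' implementation: the function name penalized_constrained_lad, the penalty level lam, and the toy data X, y, A, b are illustrative placeholders, and the nested ADMM used in the high-dimensional case is not shown.

    # Minimal sketch (not the authors' code): cast
    #   min_beta (1/n) * sum_i |y_i - x_i' beta| + lam * ||beta||_1   s.t.  A beta <= b
    # as a linear program and solve it with scipy.optimize.linprog.
    import numpy as np
    from scipy.optimize import linprog

    def penalized_constrained_lad(X, y, A, b, lam):
        n, p = X.shape
        m = A.shape[0]
        # Decision vector z = [beta_plus (p), beta_minus (p), u (n)], all >= 0,
        # with beta = beta_plus - beta_minus and u_i >= |y_i - x_i' beta|.
        c = np.concatenate([lam * np.ones(p), lam * np.ones(p), np.ones(n) / n])
        A_ub = np.vstack([
            np.hstack([ X, -X, -np.eye(n)]),        #  X beta - u <=  y
            np.hstack([-X,  X, -np.eye(n)]),        # -X beta - u <= -y
            np.hstack([ A, -A, np.zeros((m, n))]),  # the linear constraints A beta <= b
        ])
        b_ub = np.concatenate([y, -y, b])
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None), method="highs")
        z = res.x
        return z[:p] - z[p:2 * p]  # recover beta = beta_plus - beta_minus

    # Toy usage: nonnegativity constraints (beta >= 0) and heavy-tailed (Cauchy) noise.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 5))
    beta_true = np.array([1.0, 0.0, 2.0, 0.0, 0.0])
    y = X @ beta_true + rng.standard_t(df=1, size=100)
    beta_hat = penalized_constrained_lad(X, y, A=-np.eye(5), b=np.zeros(5), lam=0.1)
    print(np.round(beta_hat, 3))

The splitting beta = beta_plus - beta_minus together with the slack variables u is the standard linear-programming reformulation of absolute values; any LP solver could be substituted for the HiGHS backend used here.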

Suggested Citation

  • Xiaofei Wu & Rongmei Liang & Hu Yang, 2022. "Penalized and constrained LAD estimation in fixed and high dimension," Statistical Papers, Springer, vol. 63(1), pages 53-95, February.
  • Handle: RePEc:spr:stpapr:v:63:y:2022:i:1:d:10.1007_s00362-021-01229-0
    DOI: 10.1007/s00362-021-01229-0

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s00362-021-01229-0
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s00362-021-01229-0?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Liqun Yu & Nan Lin, 2017. "ADMM for Penalized Quantile Regression in Big Data," International Statistical Review, International Statistical Institute, vol. 85(3), pages 494-518, December.
    2. Fan J. & Li R., 2001. "Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 96, pages 1348-1360, December.
    3. A. Belloni & V. Chernozhukov & L. Wang, 2011. "Square-root lasso: pivotal recovery of sparse signals via conic programming," Biometrika, Biometrika Trust, vol. 98(4), pages 791-806.
    4. Wang, J. D., 1995. "Asymptotic Normality of L1-Estimators in Nonlinear Regression," Journal of Multivariate Analysis, Elsevier, vol. 54(2), pages 227-238, August.
    5. Parker, Thomas, 2019. "Asymptotic inference for the constrained quantile regression process," Journal of Econometrics, Elsevier, vol. 213(1), pages 174-189.
    6. Wang, Hansheng & Li, Guodong & Jiang, Guohua, 2007. "Robust Regression Shrinkage and Consistent Variable Selection Through the LAD-Lasso," Journal of Business & Economic Statistics, American Statistical Association, vol. 25, pages 347-355, July.
    7. Gareth M. James & Courtney Paulson & Paat Rusmevichientong, 2020. "Penalized and Constrained Optimization: An Application to High-Dimensional Website Advertising," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 115(529), pages 107-122, January.
    8. Wu, Lan & Yang, Yuehan & Liu, Hanzhong, 2014. "Nonnegative-lasso and application in index tracking," Computational Statistics & Data Analysis, Elsevier, vol. 70(C), pages 116-126.
    9. Liu, Yongxin & Zeng, Peng & Lin, Lu, 2020. "Generalized ℓ1-penalized quantile regression with linear constraints," Computational Statistics & Data Analysis, Elsevier, vol. 142(C).
    10. Hu, Qinqin & Zeng, Peng & Lin, Lu, 2015. "The dual and degrees of freedom of linearly constrained generalized lasso," Computational Statistics & Data Analysis, Elsevier, vol. 86(C), pages 13-26.
    11. Pollard, David, 1991. "Asymptotics for Least Absolute Deviation Regression Estimators," Econometric Theory, Cambridge University Press, vol. 7(2), pages 186-199, June.
    12. Zou, Hui, 2006. "The Adaptive Lasso and Its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 101, pages 1418-1429, December.
    13. Yen, Yu-Min & Yen, Tso-Jung, 2014. "Solving norm constrained portfolio optimization via coordinate-wise descent algorithms," Computational Statistics & Data Analysis, Elsevier, vol. 76(C), pages 737-759.
    14. Wang, Lie, 2013. "The L1 penalized LAD estimator for high dimensional linear regression," Journal of Multivariate Analysis, Elsevier, vol. 120(C), pages 135-151.
    15. Robert Tibshirani & Michael Saunders & Saharon Rosset & Ji Zhu & Keith Knight, 2005. "Sparsity and smoothness via the fused lasso," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 67(1), pages 91-108, February.
    16. Wei Lin & Pixu Shi & Rui Feng & Hongzhe Li, 2014. "Variable selection in regression with compositional covariates," Biometrika, Biometrika Trust, vol. 101(4), pages 785-797.
    17. Mandal, B.N. & Ma, Jun, 2016. "l1 regularized multiplicative iterative path algorithm for non-negative generalized linear models," Computational Statistics & Data Analysis, Elsevier, vol. 101(C), pages 289-299.
    18. Hui Zou & Trevor Hastie, 2005. "Addendum: Regularization and variable selection via the elastic net," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 67(5), pages 768-768, November.
    19. Hui Zou & Trevor Hastie, 2005. "Regularization and variable selection via the elastic net," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 67(2), pages 301-320, April.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Umberto Amato & Anestis Antoniadis & Italia De Feis & Irene Gijbels, 2021. "Penalised robust estimators for sparse and high-dimensional linear models," Statistical Methods & Applications, Springer;Società Italiana di Statistica, vol. 30(1), pages 1-48, March.
    2. Shanshan Qin & Hao Ding & Yuehua Wu & Feng Liu, 2021. "High-dimensional sign-constrained feature selection and grouping," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 73(4), pages 787-819, August.
    3. Jiang, Liewen & Bondell, Howard D. & Wang, Huixia Judy, 2014. "Interquantile shrinkage and variable selection in quantile regression," Computational Statistics & Data Analysis, Elsevier, vol. 69(C), pages 208-219.
    4. Ismail Shah & Hina Naz & Sajid Ali & Amani Almohaimeed & Showkat Ahmad Lone, 2023. "A New Quantile-Based Approach for LASSO Estimation," Mathematics, MDPI, vol. 11(6), pages 1-13, March.
    5. Peng Zeng & Qinqin Hu & Xiaoyu Li, 2017. "Geometry and Degrees of Freedom of Linearly Constrained Generalized Lasso," Scandinavian Journal of Statistics, Danish Society for Theoretical Statistics;Finnish Statistical Society;Norwegian Statistical Association;Swedish Statistical Association, vol. 44(4), pages 989-1008, December.
    6. Diego Vidaurre & Concha Bielza & Pedro Larrañaga, 2013. "A Survey of L1 Regression," International Statistical Review, International Statistical Institute, vol. 81(3), pages 361-387, December.
    7. Margherita Giuzio & Sandra Paterlini, 2019. "Un-diversifying during crises: Is it a good idea?," Computational Management Science, Springer, vol. 16(3), pages 401-432, July.
    8. Rafael Blanquero & Emilio Carrizosa & Pepa Ramírez-Cobo & M. Remedios Sillero-Denamiel, 2021. "A cost-sensitive constrained Lasso," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 15(1), pages 121-158, March.
    9. Tutz, Gerhard & Pößnecker, Wolfgang & Uhlmann, Lorenz, 2015. "Variable selection in general multinomial logit models," Computational Statistics & Data Analysis, Elsevier, vol. 82(C), pages 207-222.
    10. Margherita Giuzio, 2017. "Genetic algorithm versus classical methods in sparse index tracking," Decisions in Economics and Finance, Springer;Associazione per la Matematica, vol. 40(1), pages 243-256, November.
    11. Yize Zhao & Matthias Chung & Brent A. Johnson & Carlos S. Moreno & Qi Long, 2016. "Hierarchical Feature Selection Incorporating Known and Novel Biological Information: Identifying Genomic Features Related to Prostate Cancer Recurrence," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 111(516), pages 1427-1439, October.
    12. Loann David Denis Desboulets, 2018. "A Review on Variable Selection in Regression Analysis," Econometrics, MDPI, vol. 6(4), pages 1-27, November.
    13. Lee, Ji Hyung & Shi, Zhentao & Gao, Zhan, 2022. "On LASSO for predictive regression," Journal of Econometrics, Elsevier, vol. 229(2), pages 322-349.
    14. Victor Chernozhukov & Christian Hansen & Yuan Liao, 2015. "A lava attack on the recovery of sums of dense and sparse signals," CeMMAP working papers CWP56/15, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
    15. Takumi Saegusa & Tianzhou Ma & Gang Li & Ying Qing Chen & Mei-Ling Ting Lee, 2020. "Variable Selection in Threshold Regression Model with Applications to HIV Drug Adherence Data," Statistics in Biosciences, Springer;International Chinese Statistical Association, vol. 12(3), pages 376-398, December.
    16. Zanhua Yin, 2020. "Variable selection for sparse logistic regression," Metrika: International Journal for Theoretical and Applied Statistics, Springer, vol. 83(7), pages 821-836, October.
    17. Benjamin Poignard, 2020. "Asymptotic theory of the adaptive Sparse Group Lasso," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 72(1), pages 297-328, February.
    18. Florian Ziel, 2015. "Iteratively reweighted adaptive lasso for conditional heteroscedastic time series with applications to AR-ARCH type processes," Papers 1502.06557, arXiv.org, revised Dec 2015.
    19. Massimiliano Caporin & Francesco Poli, 2017. "Building News Measures from Textual Data and an Application to Volatility Forecasting," Econometrics, MDPI, vol. 5(3), pages 1-46, August.
    20. Pei Wang & Shunjie Chen & Sijia Yang, 2022. "Recent Advances on Penalized Regression Models for Biological Data," Mathematics, MDPI, vol. 10(19), pages 1-24, October.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:stpapr:v:63:y:2022:i:1:d:10.1007_s00362-021-01229-0. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.