
Generalized Information Criteria for Structured Sparse Models

Author

Listed:
  • Eduardo F. Mendes
  • Gabriel J. P. Pinto

Abstract

Regularized $m$-estimators are widely used because of their ability to recover a low-dimensional model in high-dimensional scenarios. Recent work on this subject has focused on creating a unified framework for establishing oracle bounds and deriving conditions for support recovery. Within this framework, we propose a new Generalized Information Criterion (GIC) that takes into account the sparsity pattern one wishes to recover. We obtain non-asymptotic model selection bounds and sufficient conditions for model selection consistency of the GIC. Furthermore, we show that the GIC can also be used to select the regularization parameter within a regularized $m$-estimation framework, which allows practical use of the GIC for model selection in high-dimensional settings. We provide examples based on group LASSO in the context of generalized linear regression and on low-rank matrix regression.
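
To fix ideas, the tuning-parameter use described above follows the generic information-criterion template (as in Zhang, Li & Tsai, 2010, listed in the references below): score each candidate penalty level by a goodness-of-fit term plus a weight a_n times a measure of model complexity, and keep the minimizer. The Python sketch below is only an illustration of that generic recipe under simplifying assumptions, not the authors' estimator: the Gaussian group-lasso solver, the complexity proxy (a count of active coefficients), and the weight a_n = log(log n) * log(p) are choices made for the example, and the names group_lasso_fit and gic_path are hypothetical.

    # Minimal sketch: GIC-style selection of the group-lasso penalty (illustrative only).
    import numpy as np

    def group_lasso_fit(X, y, groups, lam, n_iter=500):
        """Proximal gradient for 0.5/n * ||y - X b||^2 + lam * sum_g ||b_g||_2."""
        n, p = X.shape
        beta = np.zeros(p)
        step = 1.0 / (np.linalg.norm(X, 2) ** 2 / n)      # inverse Lipschitz constant
        for _ in range(n_iter):
            grad = -X.T @ (y - X @ beta) / n
            z = beta - step * grad
            for g in groups:                               # block soft-thresholding
                norm_g = np.linalg.norm(z[g])
                beta[g] = 0.0 if norm_g == 0 else max(0.0, 1.0 - step * lam / norm_g) * z[g]
        return beta

    def gic_path(X, y, groups, lambdas):
        """Pick lambda minimizing a GIC-type score: n*log(RSS/n) + a_n * (# active coefficients)."""
        n, p = X.shape
        a_n = np.log(np.log(n)) * np.log(p)                # assumed complexity weight
        scores, fits = [], []
        for lam in lambdas:
            beta = group_lasso_fit(X, y, groups, lam)
            rss = np.sum((y - X @ beta) ** 2)
            df = np.count_nonzero(beta)                    # crude degrees-of-freedom proxy
            scores.append(n * np.log(rss / n + 1e-12) + a_n * df)
            fits.append(beta)
        best = int(np.argmin(scores))
        return lambdas[best], fits[best]

    # Toy usage: three groups of five covariates, only the first group is active.
    rng = np.random.default_rng(0)
    n, p = 200, 15
    groups = [np.arange(0, 5), np.arange(5, 10), np.arange(10, 15)]
    X = rng.standard_normal((n, p))
    beta_true = np.concatenate([np.ones(5), np.zeros(10)])
    y = X @ beta_true + rng.standard_normal(n)
    lam_hat, beta_hat = gic_path(X, y, groups, np.geomspace(0.01, 1.0, 20))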

Suggested Citation

  • Eduardo F. Mendes & Gabriel J. P. Pinto, 2023. "Generalized Information Criteria for Structured Sparse Models," Papers 2309.01764, arXiv.org.
  • Handle: RePEc:arx:papers:2309.01764

    Download full text from publisher

    File URL: http://arxiv.org/pdf/2309.01764
    File Function: Latest version
    Download Restriction: no

    References listed on IDEAS

    1. Ai Ni & Jianwen Cai, 2018. "Tuning Parameter Selection in Cox Proportional Hazards Model with a Diverging Number of Parameters," Scandinavian Journal of Statistics, Danish Society for Theoretical Statistics;Finnish Statistical Society;Norwegian Statistical Association;Swedish Statistical Association, vol. 45(3), pages 557-570, September.
    2. Jianqing Fan & Runze Li, 2001. "Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 96, pages 1348-1360, December.
    3. Hansheng Wang & Bo Li & Chenlei Leng, 2009. "Shrinkage tuning parameter selection with a diverging number of parameters," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 71(3), pages 671-683, June.
    4. Hansheng Wang & Guodong Li & Chih‐Ling Tsai, 2007. "Regression coefficient and autoregressive order shrinkage and selection via the lasso," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 69(1), pages 63-78, February.
    5. Xin Gao & Peter X.-K. Song, 2010. "Composite Likelihood Bayesian Information Criteria for Model Selection in High-Dimensional Data," Journal of the American Statistical Association, American Statistical Association, vol. 105(492), pages 1531-1540.
    6. Karl W. Broman & Terence P. Speed, 2002. "A model selection approach for the identification of quantitative trait loci in experimental crosses," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 64(4), pages 641-656, October.
    7. Hui Zou, 2006. "The Adaptive Lasso and Its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 101, pages 1418-1429, December.
    8. Yiyun Zhang & Runze Li & Chih-Ling Tsai, 2010. "Regularization Parameter Selections via Generalized Information Criterion," Journal of the American Statistical Association, American Statistical Association, vol. 105(489), pages 312-323.
    9. Ming Yuan & Yi Lin, 2007. "Model selection and estimation in the Gaussian graphical model," Biometrika, Biometrika Trust, vol. 94(1), pages 19-35.
    10. Eun Ryung Lee & Hohsuk Noh & Byeong U. Park, 2014. "Model Selection via Bayesian Information Criterion for Quantile Regression Models," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 109(505), pages 216-229, March.
    11. Ming Yuan & Yi Lin, 2006. "Model selection and estimation in regression with grouped variables," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 68(1), pages 49-67, February.
    12. Yiyuan She & Hoang Tran, 2019. "On cross‐validation for sparse reduced rank regression," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 81(1), pages 145-161, February.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Lenka Zbonakova & Wolfgang Karl Härdle & Weining Wang, 2016. "Time Varying Quantile Lasso," SFB 649 Discussion Papers SFB649DP2016-047, Sonderforschungsbereich 649, Humboldt University, Berlin, Germany.
    2. Fei Jin & Lung-fei Lee, 2018. "Lasso Maximum Likelihood Estimation of Parametric Models with Singular Information Matrices," Econometrics, MDPI, vol. 6(1), pages 1-24, February.
    3. Jin, Fei & Lee, Lung-fei, 2018. "Irregular N2SLS and LASSO estimation of the matrix exponential spatial specification model," Journal of Econometrics, Elsevier, vol. 206(2), pages 336-358.
    4. Daniel, Jeffrey & Horrocks, Julie & Umphrey, Gary J., 2018. "Penalized composite likelihoods for inhomogeneous Gibbs point process models," Computational Statistics & Data Analysis, Elsevier, vol. 124(C), pages 104-116.
    5. Zbonakova, L. & Härdle, W.K. & Wang, W., 2016. "Time Varying Quantile Lasso," Working Papers 16/07, Department of Economics, City University London.
    6. Kaixu Yang & Tapabrata Maiti, 2022. "Ultrahigh‐dimensional generalized additive model: Unified theory and methods," Scandinavian Journal of Statistics, Danish Society for Theoretical Statistics;Finnish Statistical Society;Norwegian Statistical Association;Swedish Statistical Association, vol. 49(3), pages 917-942, September.
    7. Cai, Zongwu & Juhl, Ted & Yang, Bingduo, 2015. "Functional index coefficient models with variable selection," Journal of Econometrics, Elsevier, vol. 189(2), pages 272-284.
    8. Lam, Clifford, 2008. "Estimation of large precision matrices through block penalization," LSE Research Online Documents on Economics 31543, London School of Economics and Political Science, LSE Library.
    9. Camila Epprecht & Dominique Guegan & Álvaro Veiga & Joel Correa da Rosa, 2017. "Variable selection and forecasting via automated methods for linear models: LASSO/adaLASSO and Autometrics," Post-Print halshs-00917797, HAL.
    10. Medeiros, Marcelo C. & Mendes, Eduardo F., 2016. "ℓ1-regularization of high-dimensional time-series models with non-Gaussian and heteroskedastic errors," Journal of Econometrics, Elsevier, vol. 191(1), pages 255-271.
    11. Camila Epprecht & Dominique Guegan & Álvaro Veiga & Joel Correa da Rosa, 2017. "Variable selection and forecasting via automated methods for linear models: LASSO/adaLASSO and Autometrics," Université Paris1 Panthéon-Sorbonne (Post-Print and Working Papers) halshs-00917797, HAL.
    12. Marcelo C. Medeiros & Eduardo F. Mendes, 2017. "Adaptive LASSO estimation for ARDL models with GARCH innovations," Econometric Reviews, Taylor & Francis Journals, vol. 36(6-9), pages 622-637, October.
    13. Ricardo P. Masini & Marcelo C. Medeiros & Eduardo F. Mendes, 2023. "Machine learning advances for time series forecasting," Journal of Economic Surveys, Wiley Blackwell, vol. 37(1), pages 76-111, February.
    14. Lichun Wang & Yuan You & Heng Lian, 2015. "Convergence and sparsity of Lasso and group Lasso in high-dimensional generalized linear models," Statistical Papers, Springer, vol. 56(3), pages 819-828, August.
    15. Pei Wang & Shunjie Chen & Sijia Yang, 2022. "Recent Advances on Penalized Regression Models for Biological Data," Mathematics, MDPI, vol. 10(19), pages 1-24, October.
    16. Zhang, Shucong & Zhou, Yong, 2018. "Variable screening for ultrahigh dimensional heterogeneous data via conditional quantile correlations," Journal of Multivariate Analysis, Elsevier, vol. 165(C), pages 1-13.
    17. Ping Zeng & Yongyue Wei & Yang Zhao & Jin Liu & Liya Liu & Ruyang Zhang & Jianwei Gou & Shuiping Huang & Feng Chen, 2014. "Variable selection approach for zero-inflated count data via adaptive lasso," Journal of Applied Statistics, Taylor & Francis Journals, vol. 41(4), pages 879-894, April.
    18. Wang, Hansheng & Leng, Chenlei, 2008. "A note on adaptive group lasso," Computational Statistics & Data Analysis, Elsevier, vol. 52(12), pages 5277-5286, August.
    19. Heng Lian, 2012. "Variable selection in high-dimensional partly linear additive models," Journal of Nonparametric Statistics, Taylor & Francis Journals, vol. 24(4), pages 825-839, December.
    20. Mehrabani, Ali, 2023. "Estimation and identification of latent group structures in panel data," Journal of Econometrics, Elsevier, vol. 235(2), pages 1464-1482.

    More about this item

    NEP fields

    This paper has been announced in the following NEP Reports:


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:arx:papers:2309.01764. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: arXiv administrators (email available below). General contact details of provider: http://arxiv.org/.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.