IDEAS home Printed from https://ideas.repec.org/a/bla/jorssb/v82y2020i3p685-718.html

A flexible framework for hypothesis testing in high dimensions

Author

Listed:
  • Adel Javanmard
  • Jason D. Lee

Abstract

Hypothesis testing in the linear regression model is a fundamental statistical problem. We consider linear regression in the high dimensional regime where the number of parameters exceeds the number of samples (p>n). To make inference informative, we assume that the model is approximately sparse, i.e. the effect of covariates on the response can be well approximated by conditioning on a relatively small number of covariates whose identities are unknown. We develop a framework for testing very general hypotheses regarding the model parameters. Our framework encompasses testing whether the parameter lies in a convex cone, testing the signal strength, and testing arbitrary functionals of the parameter. We show that the proposed procedure controls the type I error, and we also analyse its power. Our numerical experiments confirm our theoretical findings and demonstrate that we control the false positive rate (type I error) near the nominal level while achieving high power. By the duality between hypothesis testing and confidence intervals, the proposed framework can be used to obtain valid confidence intervals for various functionals of the model parameters. For linear functionals, the length of these confidence intervals is shown to be minimax rate optimal.
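The mechanics behind this line of work can be illustrated with a minimal sketch of the debiased (desparsified) Lasso, a standard building block for high-dimensional inference in the p>n regime. This is a simplified illustration, not the authors' full framework: the choice of decorrelating matrix M = I is an assumption that is only reasonable here because the simulated design covariance is the identity, and the penalty level and noise estimate are rough textbook choices.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Lasso by coordinate descent: minimizes (1/2n)||y - X theta||^2 + lam ||theta||_1."""
    n, p = X.shape
    theta = np.zeros(p)
    col_norm2 = (X ** 2).sum(axis=0)
    r = y - X @ theta                       # running residual
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * theta[j]         # add coordinate j back into the residual
            rho = X[:, j] @ r
            theta[j] = np.sign(rho) * max(abs(rho) - n * lam, 0.0) / col_norm2[j]
            r -= X[:, j] * theta[j]
    return theta

rng = np.random.default_rng(0)
n, p, s = 200, 400, 3                       # p > n: high-dimensional regime
X = rng.standard_normal((n, p))             # i.i.d. Gaussian design, so Sigma = I
theta_true = np.zeros(p)
theta_true[:s] = 2.0                        # s-sparse signal
y = X @ theta_true + rng.standard_normal(n)

lam = 2.0 * np.sqrt(np.log(p) / n)          # standard order for the Lasso penalty
theta_hat = lasso_cd(X, y, lam)

# Debiasing step: theta_d = theta_hat + (1/n) M X^T (y - X theta_hat).
# Taking M = I relies on Sigma = I above; in general M must be estimated
# (e.g. by node-wise Lasso).
resid = y - X @ theta_hat
theta_d = theta_hat + X.T @ resid / n

sigma_hat = np.sqrt(resid @ resid / n)      # crude noise-level estimate
z = np.sqrt(n) * theta_d / sigma_hat        # approx. N(0,1) under H0: theta_j = 0
```

Coordinates with |z[j]| above the normal quantile are rejected at the corresponding level, and inverting the test gives the interval theta_d[j] ± z_{1-alpha/2} * sigma_hat / sqrt(n), the duality between tests and confidence intervals mentioned in the abstract.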

Suggested Citation

  • Adel Javanmard & Jason D. Lee, 2020. "A flexible framework for hypothesis testing in high dimensions," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 82(3), pages 685-718, July.
  • Handle: RePEc:bla:jorssb:v:82:y:2020:i:3:p:685-718
    DOI: 10.1111/rssb.12373

    Download full text from publisher

    File URL: https://doi.org/10.1111/rssb.12373
    Download Restriction: no

    File URL: https://libkey.io/10.1111/rssb.12373?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a version you can access through your library subscription.

    References listed on IDEAS

    1. Jianqing Fan & Shaojun Guo & Ning Hao, 2012. "Variance estimation using refitted cross‐validation in ultrahigh dimensional regression," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 74(1), pages 37-65, January.
    2. Friedman, Jerome H. & Hastie, Trevor & Tibshirani, Rob, 2010. "Regularization Paths for Generalized Linear Models via Coordinate Descent," Journal of Statistical Software, Foundation for Open Access Statistics, vol. 33(i01).
    3. A. Belloni & D. Chen & V. Chernozhukov & C. Hansen, 2012. "Sparse Models and Methods for Optimal Instruments With an Application to Eminent Domain," Econometrica, Econometric Society, vol. 80(6), pages 2369-2429, November.
    4. Alexandre Belloni & Victor Chernozhukov & Christian Hansen, 2010. "LASSO Methods for Gaussian Instrumental Variables Models," Papers 1012.1297, arXiv.org, revised Feb 2011.
    5. Alexandre Belloni & Victor Chernozhukov & Christian Hansen, 2014. "Inference on Treatment Effects after Selection among High-Dimensional Controls," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 81(2), pages 608-650.
    6. Mengjie Chen & Zhao Ren & Hongyu Zhao & Harrison Zhou, 2016. "Asymptotically Normal and Efficient Estimation of Covariate-Adjusted Gaussian Graphical Model," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 111(513), pages 394-406, March.
    7. Emmanuel Candès & Yingying Fan & Lucas Janson & Jinchi Lv, 2018. "Panning for gold: ‘model‐X’ knockoffs for high dimensional controlled variable selection," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 80(3), pages 551-577, June.
    8. Fan J. & Li R., 2001. "Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 96, pages 1348-1360, December.
    9. Wang, Yining & Wang, Jialei & Balakrishnan, Sivaraman & Singh, Aarti, 2019. "Rate optimal estimation and confidence intervals for high-dimensional regression with missing covariates," Journal of Multivariate Analysis, Elsevier, vol. 174(C).
    10. Tingni Sun & Cun-Hui Zhang, 2012. "Scaled sparse linear regression," Biometrika, Biometrika Trust, vol. 99(4), pages 879-898.
    11. A. Belloni & V. Chernozhukov & I. Fernández‐Val & C. Hansen, 2017. "Program Evaluation and Causal Inference With High‐Dimensional Data," Econometrica, Econometric Society, vol. 85, pages 233-298, January.
    12. Jianqing Fan & Jinchi Lv, 2008. "Sure independence screening for ultrahigh dimensional feature space," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 70(5), pages 849-911, November.
    13. Zijian Guo & Wanjie Wang & T. Tony Cai & Hongzhe Li, 2019. "Optimal Estimation of Genetic Relatedness in High-Dimensional Linear Models," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 114(525), pages 358-369, January.
    14. Lucas Janson & Rina Foygel Barber & Emmanuel Candès, 2017. "EigenPrism: inference for high dimensional signal-to-noise ratios," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 79(4), pages 1037-1065, September.
    15. Cun-Hui Zhang & Stephanie S. Zhang, 2014. "Confidence intervals for low dimensional parameters in high dimensional linear models," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 76(1), pages 217-242, January.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Shengfei Tang & Yanmei Shi & Qi Zhang, 2023. "Bias-Corrected Inference of High-Dimensional Generalized Linear Models," Mathematics, MDPI, vol. 11(4), pages 1-14, February.
    2. Tianxi Cai & T. Tony Cai & Zijian Guo, 2021. "Optimal statistical inference for individualized treatment effects in high‐dimensional models," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 83(4), pages 669-719, September.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Zemin Zheng & Jinchi Lv & Wei Lin, 2021. "Nonsparse Learning with Latent Variables," Operations Research, INFORMS, vol. 69(1), pages 346-359, January.
    2. Alexandre Belloni & Victor Chernozhukov & Denis Chetverikov & Christian Hansen & Kengo Kato, 2018. "High-dimensional econometrics and regularized GMM," CeMMAP working papers CWP35/18, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
    3. Zemin Zheng & Jie Zhang & Yang Li, 2022. "L0-Regularized Learning for High-Dimensional Additive Hazards Regression," INFORMS Journal on Computing, INFORMS, vol. 34(5), pages 2762-2775, September.
    4. Philipp Bach & Victor Chernozhukov & Malte S. Kurz & Martin Spindler & Sven Klaassen, 2021. "DoubleML -- An Object-Oriented Implementation of Double Machine Learning in R," Papers 2103.09603, arXiv.org, revised Feb 2024.
    5. Agboola, Oluwagbenga David & Yu, Han, 2023. "Neighborhood-based cross fitting approach to treatment effects with high-dimensional data," Computational Statistics & Data Analysis, Elsevier, vol. 186(C).
    6. Adamek, Robert & Smeekes, Stephan & Wilms, Ines, 2023. "Lasso inference for high-dimensional time series," Journal of Econometrics, Elsevier, vol. 235(2), pages 1114-1143.
    7. Lan, Wei & Zhong, Ping-Shou & Li, Runze & Wang, Hansheng & Tsai, Chih-Ling, 2016. "Testing a single regression coefficient in high dimensional linear models," Journal of Econometrics, Elsevier, vol. 195(1), pages 154-168.
    8. Breunig, Christoph & Mammen, Enno & Simoni, Anna, 2020. "Ill-posed estimation in high-dimensional models with instrumental variables," Journal of Econometrics, Elsevier, vol. 219(1), pages 171-200.
    9. Tianxi Cai & T. Tony Cai & Zijian Guo, 2021. "Optimal statistical inference for individualized treatment effects in high‐dimensional models," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 83(4), pages 669-719, September.
    10. Caner, Mehmet & Kock, Anders Bredahl, 2018. "Asymptotically honest confidence regions for high dimensional parameters by the desparsified conservative Lasso," Journal of Econometrics, Elsevier, vol. 203(1), pages 143-168.
    11. Xin Wang & Lingchen Kong & Liqun Wang, 2022. "Estimation of Error Variance in Regularized Regression Models via Adaptive Lasso," Mathematics, MDPI, vol. 10(11), pages 1-19, June.
    12. Guo, Xu & Li, Runze & Liu, Jingyuan & Zeng, Mudong, 2023. "Statistical inference for linear mediation models with high-dimensional mediators and application to studying stock reaction to COVID-19 pandemic," Journal of Econometrics, Elsevier, vol. 235(1), pages 166-179.
    13. Bryan T. Kelly & Asaf Manela & Alan Moreira, 2019. "Text Selection," NBER Working Papers 26517, National Bureau of Economic Research, Inc.
    14. Peter Bühlmann & Jacopo Mandozzi, 2014. "High-dimensional variable screening and bias in subsequent inference, with an empirical comparison," Computational Statistics, Springer, vol. 29(3), pages 407-430, June.
    15. Hansen, Christian & Liao, Yuan, 2019. "The Factor-Lasso And K-Step Bootstrap Approach For Inference In High-Dimensional Economic Applications," Econometric Theory, Cambridge University Press, vol. 35(3), pages 465-509, June.
    16. Lee, Ji Hyung & Shi, Zhentao & Gao, Zhan, 2022. "On LASSO for predictive regression," Journal of Econometrics, Elsevier, vol. 229(2), pages 322-349.
    17. Jingxuan Luo & Lili Yue & Gaorong Li, 2023. "Overview of High-Dimensional Measurement Error Regression Models," Mathematics, MDPI, vol. 11(14), pages 1-22, July.
    18. The Tien Mai, 2023. "Reliable Genetic Correlation Estimation via Multiple Sample Splitting and Smoothing," Mathematics, MDPI, vol. 11(9), pages 1-13, May.
    19. Victor Chernozhukov & Denis Chetverikov & Mert Demirer & Esther Duflo & Christian Hansen & Whitney Newey & James Robins, 2018. "Double/debiased machine learning for treatment and structural parameters," Econometrics Journal, Royal Economic Society, vol. 21(1), pages 1-68, February.
    20. Achim Ahrens & Arnab Bhattacharjee, 2015. "Two-Step Lasso Estimation of the Spatial Weights Matrix," Econometrics, MDPI, vol. 3(1), pages 1-28, March.

    More about this item

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:bla:jorssb:v:82:y:2020:i:3:p:685-718. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Wiley Content Delivery (email available below). General contact details of provider: https://edirc.repec.org/data/rssssea.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.