
Covariance‐regularized regression and classification for high dimensional problems

Author

Listed:
  • Daniela M. Witten
  • Robert Tibshirani

Abstract

Summary. We propose covariance‐regularized regression, a family of methods for prediction in high dimensional settings that uses a shrunken estimate of the inverse covariance matrix of the features to achieve superior prediction. An estimate of the inverse covariance matrix is obtained by maximizing the log‐likelihood of the data, under a multivariate normal model, subject to a penalty; it is then used to estimate coefficients for the regression of the response onto the features. We show that ridge regression, the lasso and the elastic net are special cases of covariance‐regularized regression, and we demonstrate that certain previously unexplored forms of covariance‐regularized regression can outperform existing methods in a range of situations. The covariance‐regularized regression framework is extended to generalized linear models and linear discriminant analysis, and is used to analyse gene expression data sets with multiple class and survival outcomes.
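
A minimal sketch of the idea described above, assuming synthetic data and scikit-learn's GraphicalLasso as the penalized inverse-covariance estimator (both are illustrative choices, not taken from the paper): the sample inverse covariance in the least-squares formula beta = Sigma_xx^{-1} sigma_xy is replaced by a shrunken estimate obtained from an L1-penalized Gaussian log-likelihood. This is only a sketch of the general recipe, not the authors' exact estimator.

    # Sketch: covariance-regularized regression with a shrunken inverse covariance.
    # Assumptions: synthetic data; GraphicalLasso stands in for the penalized
    # inverse-covariance estimate described in the abstract.
    import numpy as np
    from sklearn.covariance import GraphicalLasso

    rng = np.random.default_rng(0)
    n, p = 50, 20
    X = rng.standard_normal((n, p))
    beta_true = np.zeros(p)
    beta_true[:3] = [2.0, -1.5, 1.0]
    y = X @ beta_true + 0.5 * rng.standard_normal(n)

    # Center so covariances can replace raw cross-products.
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()

    # Step 1: shrunken inverse covariance of the features, obtained by
    # maximizing an L1-penalized multivariate normal log-likelihood.
    theta_hat = GraphicalLasso(alpha=0.1).fit(Xc).precision_

    # Step 2: plug it into the least-squares formula beta_hat = Theta_hat @ sigma_xy
    # (ordinary least squares would invert the sample covariance instead).
    sigma_xy = Xc.T @ yc / n
    beta_hat = theta_hat @ sigma_xy
    print(np.round(beta_hat[:5], 2))

Varying the form and strength of the penalty yields different members of the family; as the abstract notes, ridge regression, the lasso and the elastic net arise as special cases.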

Suggested Citation

  • Daniela M. Witten & Robert Tibshirani, 2009. "Covariance‐regularized regression and classification for high dimensional problems," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 71(3), pages 615-636, June.
  • Handle: RePEc:bla:jorssb:v:71:y:2009:i:3:p:615-636
    DOI: 10.1111/j.1467-9868.2009.00699.x

    Download full text from publisher

    File URL: https://doi.org/10.1111/j.1467-9868.2009.00699.x
    Download Restriction: no

    File URL: https://libkey.io/10.1111/j.1467-9868.2009.00699.x?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a version of this item that you can access through your library subscription.

    References listed on IDEAS

    1. Hui Zou & Trevor Hastie, 2005. "Addendum: Regularization and variable selection via the elastic net," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 67(5), pages 768-768, November.
    2. Hui Zou & Trevor Hastie, 2005. "Regularization and variable selection via the elastic net," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 67(2), pages 301-320, April.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Matthias Weber & Martin Schumacher & Harald Binder, 2014. "Regularized Regression Incorporating Network Information: Simultaneous Estimation of Covariate Coefficients and Connection Signs," Tinbergen Institute Discussion Papers 14-089/I, Tinbergen Institute.
    2. Le, Khuyen T. & Chaux, Caroline & Richard, Frédéric J.P. & Guedj, Eric, 2020. "An adapted linear discriminant analysis with variable selection for the classification in high-dimension, and an application to medical data," Computational Statistics & Data Analysis, Elsevier, vol. 152(C).
    3. Guan Yu & Yufeng Liu, 2016. "Sparse Regression Incorporating Graphical Structure Among Predictors," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 111(514), pages 707-720, April.
    4. José Vinícius de Miranda Cardoso & Jiaxi Ying & Daniel Perez Palomar, 2020. "Algorithms for Learning Graphs in Financial Markets," Papers 2012.15410, arXiv.org.
    5. Ollier, Edouard & Samson, Adeline & Delavenne, Xavier & Viallon, Vivian, 2016. "A SAEM algorithm for fused lasso penalized NonLinear Mixed Effect Models: Application to group comparison in pharmacokinetics," Computational Statistics & Data Analysis, Elsevier, vol. 95(C), pages 207-221.
    6. Aaron J Molstad & Adam J Rothman, 2018. "Shrinking characteristics of precision matrix estimators," Biometrika, Biometrika Trust, vol. 105(3), pages 563-574.
    7. van Wieringen, Wessel N. & Peeters, Carel F.W., 2016. "Ridge estimation of inverse covariance matrices from high-dimensional data," Computational Statistics & Data Analysis, Elsevier, vol. 103(C), pages 284-303.
    8. Luo, Shan & Chen, Zehua, 2020. "A procedure of linear discrimination analysis with detected sparsity structure for high-dimensional multi-class classification," Journal of Multivariate Analysis, Elsevier, vol. 179(C).
    9. Vincent Guillemot & Andreas Bender & Anne-Laure Boulesteix, 2013. "Iterative Reconstruction of High-Dimensional Gaussian Graphical Models Based on a New Method to Estimate Partial Correlations under Constraints," PLOS ONE, Public Library of Science, vol. 8(4), pages 1-10, April.
    10. David Hallac & Peter Nystrup & Stephen Boyd, 2019. "Greedy Gaussian segmentation of multivariate time series," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 13(3), pages 727-751, September.
    11. Tomaso Aste & T. Di Matteo, 2017. "Sparse Causality Network Retrieval from Short Time Series," Complexity, Hindawi, vol. 2017, pages 1-13, November.
    12. Matteo Barigozzi & Matteo Luciani, 2019. "Quasi Maximum Likelihood Estimation and Inference of Large Approximate Dynamic Factor Models via the EM algorithm," Papers 1910.03821, arXiv.org, revised Feb 2022.
    13. L. A. Stefanski & Yichao Wu & Kyle White, 2014. "Variable Selection in Nonparametric Classification Via Measurement Error Model Selection Likelihoods," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 109(506), pages 574-589, June.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Tutz, Gerhard & Pößnecker, Wolfgang & Uhlmann, Lorenz, 2015. "Variable selection in general multinomial logit models," Computational Statistics & Data Analysis, Elsevier, vol. 82(C), pages 207-222.
    2. Oxana Babecka Kucharcukova & Jan Bruha, 2016. "Nowcasting the Czech Trade Balance," Working Papers 2016/11, Czech National Bank.
    3. Carstensen, Kai & Heinrich, Markus & Reif, Magnus & Wolters, Maik H., 2020. "Predicting ordinary and severe recessions with a three-state Markov-switching dynamic factor model," International Journal of Forecasting, Elsevier, vol. 36(3), pages 829-850.
    4. Hou-Tai Chang & Ping-Huai Wang & Wei-Fang Chen & Chen-Ju Lin, 2022. "Risk Assessment of Early Lung Cancer with LDCT and Health Examinations," IJERPH, MDPI, vol. 19(8), pages 1-12, April.
    5. Margherita Giuzio, 2017. "Genetic algorithm versus classical methods in sparse index tracking," Decisions in Economics and Finance, Springer;Associazione per la Matematica, vol. 40(1), pages 243-256, November.
    6. Nicolaj N. Mühlbach, 2020. "Tree-based Synthetic Control Methods: Consequences of moving the US Embassy," CREATES Research Papers 2020-04, Department of Economics and Business Economics, Aarhus University.
    7. Wang, Qiao & Zhou, Wei & Cheng, Yonggang & Ma, Gang & Chang, Xiaolin & Miao, Yu & Chen, E, 2018. "Regularized moving least-square method and regularized improved interpolating moving least-square method with nonsingular moment matrices," Applied Mathematics and Computation, Elsevier, vol. 325(C), pages 120-145.
    8. Dmitriy Drusvyatskiy & Adrian S. Lewis, 2018. "Error Bounds, Quadratic Growth, and Linear Convergence of Proximal Methods," Mathematics of Operations Research, INFORMS, vol. 43(3), pages 919-948, August.
    9. Mkhadri, Abdallah & Ouhourane, Mohamed, 2013. "An extended variable inclusion and shrinkage algorithm for correlated variables," Computational Statistics & Data Analysis, Elsevier, vol. 57(1), pages 631-644.
    10. Lucian Belascu & Alexandra Horobet & Georgiana Vrinceanu & Consuela Popescu, 2021. "Performance Dissimilarities in European Union Manufacturing: The Effect of Ownership and Technological Intensity," Sustainability, MDPI, vol. 13(18), pages 1-19, September.
    11. Candelon, B. & Hurlin, C. & Tokpavi, S., 2012. "Sampling error and double shrinkage estimation of minimum variance portfolios," Journal of Empirical Finance, Elsevier, vol. 19(4), pages 511-527.
    12. Susan Athey & Guido W. Imbens & Stefan Wager, 2018. "Approximate residual balancing: debiased inference of average treatment effects in high dimensions," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 80(4), pages 597-623, September.
    13. Andrea Carriero & Todd E. Clark & Massimiliano Marcellino, 2022. "Specification Choices in Quantile Regression for Empirical Macroeconomics," Working Papers 22-25, Federal Reserve Bank of Cleveland.
    14. Kim, Hyun Hak & Swanson, Norman R., 2018. "Mining big data using parsimonious factor, machine learning, variable selection and shrinkage methods," International Journal of Forecasting, Elsevier, vol. 34(2), pages 339-354.
    15. Shuichi Kawano, 2014. "Selection of tuning parameters in bridge regression models via Bayesian information criterion," Statistical Papers, Springer, vol. 55(4), pages 1207-1223, November.
    16. Yize Zhao & Matthias Chung & Brent A. Johnson & Carlos S. Moreno & Qi Long, 2016. "Hierarchical Feature Selection Incorporating Known and Novel Biological Information: Identifying Genomic Features Related to Prostate Cancer Recurrence," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 111(516), pages 1427-1439, October.
    17. Enrico Bergamini & Georg Zachmann, 2020. "Exploring EU’s Regional Potential in Low-Carbon Technologies," Sustainability, MDPI, vol. 13(1), pages 1-28, December.
    18. Qianyun Li & Runmin Shi & Faming Liang, 2019. "Drug sensitivity prediction with high-dimensional mixture regression," PLOS ONE, Public Library of Science, vol. 14(2), pages 1-18, February.
    19. Jung, Yoon Mo & Whang, Joyce Jiyoung & Yun, Sangwoon, 2020. "Sparse probabilistic K-means," Applied Mathematics and Computation, Elsevier, vol. 382(C).
    20. Changrong Yan & Dixin Zhang, 2013. "Sparse dimension reduction for survival data," Computational Statistics, Springer, vol. 28(4), pages 1835-1852, August.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:bla:jorssb:v:71:y:2009:i:3:p:615-636. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Wiley Content Delivery (email available below). General contact details of provider: https://edirc.repec.org/data/rssssea.html.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.