
Estimation of predictive performance in high-dimensional data settings using learning curves

Author

Listed:
  • Goedhart, Jeroen M.
  • Klausch, Thomas
  • van de Wiel, Mark A.

Abstract

In high-dimensional prediction settings, reliably estimating test performance remains challenging. To address this challenge, a novel performance estimation framework, called Learn2Evaluate, is presented. It is based on learning curves: a smooth monotone curve depicting test performance as a function of the sample size is fitted. Learn2Evaluate has several advantages over commonly applied performance estimation methodologies. Firstly, a learning curve offers a graphical overview of a learner; this overview assists in assessing the potential benefit of adding training samples and provides a more complete comparison between learners than performance estimates at a fixed subsample size. Secondly, a learning curve facilitates estimating the performance at the total sample size rather than at a subsample size. Thirdly, Learn2Evaluate allows the computation of a theoretically justified and useful lower confidence bound, which may be tightened further by a bias correction. The benefits of Learn2Evaluate are illustrated by a simulation study and applications to omics data.
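To make the learning-curve idea concrete, the minimal sketch below illustrates it under stated assumptions: repeated subsampling yields performance estimates (here, hypothetical AUC values) at several subsample sizes, a smooth monotone curve is fitted to them, and the fit is read off at the total sample size. The inverse-power-law form, the numbers, and the use of SciPy's curve_fit are illustrative assumptions, not the authors' implementation, which additionally provides a lower confidence bound and a bias correction.

```python
# Minimal sketch of the learning-curve idea behind Learn2Evaluate (not the
# authors' implementation): estimate performance on nested subsamples, fit a
# smooth monotone curve, and evaluate the fit at the total sample size.
import numpy as np
from scipy.optimize import curve_fit

def power_law(n, a, b, c):
    """Monotone learning curve: performance approaches the asymptote a as n grows."""
    return a - b * n ** (-c)

# Hypothetical repeated-subsampling estimates of test AUC at each subsample size.
subsample_sizes = np.array([25, 50, 75, 100, 125, 150])
mean_auc = np.array([0.62, 0.68, 0.71, 0.73, 0.745, 0.75])

# Fit the smooth monotone curve (bounds keep a <= 1 and b, c >= 0).
params, _ = curve_fit(
    power_law, subsample_sizes, mean_auc,
    p0=[0.8, 0.5, 0.5], bounds=([0, 0, 0], [1, np.inf, np.inf]),
)

n_total = 200  # total available sample size
print("Estimated AUC at n =", n_total, ":", round(power_law(n_total, *params), 3))
```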

Suggested Citation

  • Goedhart, Jeroen M. & Klausch, Thomas & van de Wiel, Mark A., 2023. "Estimation of predictive performance in high-dimensional data settings using learning curves," Computational Statistics & Data Analysis, Elsevier, vol. 180(C).
  • Handle: RePEc:eee:csdana:v:180:y:2023:i:c:s016794732200202x
    DOI: 10.1016/j.csda.2022.107622

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S016794732200202X
    Download Restriction: Full text for ScienceDirect subscribers only.

    File URL: https://libkey.io/10.1016/j.csda.2022.107622?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Friedman, Jerome H. & Hastie, Trevor & Tibshirani, Rob, 2010. "Regularization Paths for Generalized Linear Models via Coordinate Descent," Journal of Statistical Software, Foundation for Open Access Statistics, vol. 33(1).
    2. Jiang, Wenyu & Varma, Sudhir & Simon, Richard, 2008. "Calculating Confidence Intervals for Prediction Error in Microarray Classification Using Resampling," Statistical Applications in Genetics and Molecular Biology, De Gruyter, vol. 7(1), pages 1-22, March.
    3. Schäfer, Juliane & Strimmer, Korbinian, 2005. "A Shrinkage Approach to Large-Scale Covariance Matrix Estimation and Implications for Functional Genomics," Statistical Applications in Genetics and Molecular Biology, De Gruyter, vol. 4(1), pages 1-32, November.
    4. Kim, Ji-Hyun, 2009. "Estimating classification error rate: Repeated cross-validation, repeated hold-out and bootstrap," Computational Statistics & Data Analysis, Elsevier, vol. 53(11), pages 3735-3745, September.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Aderhold, Andrej & Husmeier, Dirk & Grzegorczyk, Marco, 2014. "Statistical inference of regulatory networks for circadian regulation," Statistical Applications in Genetics and Molecular Biology, De Gruyter, vol. 13(3), pages 1-47, June.
    2. Xiaodong Cai & Juan Andrés Bazerque & Georgios B Giannakis, 2013. "Inference of Gene Regulatory Networks with Sparse Structural Equation Models Exploiting Genetic Perturbations," PLOS Computational Biology, Public Library of Science, vol. 9(5), pages 1-13, May.
    3. Blum, Yuna & Houée-Bigot, Magalie & Causeur, David, 2016. "Sparse factor model for co-expression networks with an application using prior biological knowledge," Statistical Applications in Genetics and Molecular Biology, De Gruyter, vol. 15(3), pages 253-272, June.
    4. Zhengnan Huang & Hongjiu Zhang & Jonathan Boss & Stephen A Goutman & Bhramar Mukherjee & Ivo D Dinov & Yuanfang Guan & for the Pooled Resource Open-Access ALS Clinical Trials Consortium, 2017. "Complete hazard ranking to analyze right-censored data: An ALS survival study," PLOS Computational Biology, Public Library of Science, vol. 13(12), pages 1-21, December.
    5. Guibert, Quentin & Lopez, Olivier & Piette, Pierrick, 2019. "Forecasting mortality rate improvements with a high-dimensional VAR," Insurance: Mathematics and Economics, Elsevier, vol. 88(C), pages 255-272.
    6. Tutz, Gerhard & Pößnecker, Wolfgang & Uhlmann, Lorenz, 2015. "Variable selection in general multinomial logit models," Computational Statistics & Data Analysis, Elsevier, vol. 82(C), pages 207-222.
    7. Hannart, Alexis & Naveau, Philippe, 2014. "Estimating high dimensional covariance matrices: A new look at the Gaussian conjugate framework," Journal of Multivariate Analysis, Elsevier, vol. 131(C), pages 149-162.
    8. Ernesto Carrella & Richard M. Bailey & Jens Koed Madsen, 2018. "Indirect inference through prediction," Papers 1807.01579, arXiv.org.
    9. Rui Wang & Naihua Xiu & Kim-Chuan Toh, 2021. "Subspace quadratic regularization method for group sparse multinomial logistic regression," Computational Optimization and Applications, Springer, vol. 79(3), pages 531-559, July.
    10. Mkhadri, Abdallah & Ouhourane, Mohamed, 2013. "An extended variable inclusion and shrinkage algorithm for correlated variables," Computational Statistics & Data Analysis, Elsevier, vol. 57(1), pages 631-644.
    11. Masakazu Higuchi & Mitsuteru Nakamura & Shuji Shinohara & Yasuhiro Omiya & Takeshi Takano & Daisuke Mizuguchi & Noriaki Sonota & Hiroyuki Toda & Taku Saito & Mirai So & Eiji Takayama & Hiroo Terashi &, 2022. "Detection of Major Depressive Disorder Based on a Combination of Voice Features: An Exploratory Approach," IJERPH, MDPI, vol. 19(18), pages 1-13, September.
    12. Avagyan, Vahe & Alonso Fernández, Andrés Modesto & Nogales, Francisco J., 2015. "D-trace Precision Matrix Estimation Using Adaptive Lasso Penalties," DES - Working Papers. Statistics and Econometrics. WS 21775, Universidad Carlos III de Madrid. Departamento de Estadística.
    13. Susan Athey & Guido W. Imbens & Stefan Wager, 2018. "Approximate residual balancing: debiased inference of average treatment effects in high dimensions," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 80(4), pages 597-623, September.
    14. Vincent, Martin & Hansen, Niels Richard, 2014. "Sparse group lasso and high dimensional multinomial classification," Computational Statistics & Data Analysis, Elsevier, vol. 71(C), pages 771-786.
    15. Chen, Le-Yu & Lee, Sokbae, 2018. "Best subset binary prediction," Journal of Econometrics, Elsevier, vol. 206(1), pages 39-56.
    16. Perrot-Dockès, Marie & Lévy-Leduc, Céline & Chiquet, Julien & Sansonnet, Laure & Brégère, Margaux & Étienne, Marie-Pierre & Robin, Stéphane & Genta-Jouve, Grégory, 2018. "A variable selection approach in the multivariate linear model: an application to LC-MS metabolomics data," Statistical Applications in Genetics and Molecular Biology, De Gruyter, vol. 17(5), pages 1-14, October.
    17. Fan, Jianqing & Jiang, Bai & Sun, Qiang, 2022. "Bayesian factor-adjusted sparse regression," Journal of Econometrics, Elsevier, vol. 230(1), pages 3-19.
    18. Jun Li & Serguei Netessine & Sergei Koulayev, 2018. "Price to Compete … with Many: How to Identify Price Competition in High-Dimensional Space," Management Science, INFORMS, vol. 64(9), pages 4118-4136, September.
    19. Sung Jae Jun & Sokbae Lee, 2020. "Causal Inference under Outcome-Based Sampling with Monotonicity Assumptions," Papers 2004.08318, arXiv.org, revised Oct 2023.
    20. Rina Friedberg & Julie Tibshirani & Susan Athey & Stefan Wager, 2018. "Local Linear Forests," Papers 1807.11408, arXiv.org, revised Sep 2020.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:csdana:v:180:y:2023:i:c:s016794732200202x. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/csda.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.