
Simultaneous feature selection and outlier detection with optimality guarantees

Author

Listed:
  • Luca Insolia
  • Ana Kenney
  • Francesca Chiaromonte
  • Giovanni Felici

Abstract

Biomedical research is increasingly data-rich, with studies comprising ever-growing numbers of features. The larger a study, the higher the likelihood that a substantial portion of the features may be redundant and/or contaminated with outlying values. This poses serious challenges, which are exacerbated when sample sizes are relatively small. Effective and efficient approaches to perform sparse estimation in the presence of outliers are critical for these studies, and have received considerable attention in the last decade. We contribute to this area by considering high-dimensional regressions contaminated by multiple mean-shift outliers affecting both the response and the design matrix. We develop a general framework and use mixed-integer programming to simultaneously perform feature selection and outlier detection with provable optimality guarantees. We prove theoretical properties of our approach: a necessary and sufficient condition for the robustly strong oracle property, where the number of features can increase exponentially with the sample size; optimal parameter estimation; and the breakdown point of the resulting estimates. Moreover, we provide computationally efficient procedures to tune integer constraints and warm-start the algorithm. Through simulations, we show the superior performance of our proposal compared to existing heuristic methods, and we use it to study the relationships between childhood obesity and the human microbiome.
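
To make the formulation above concrete, the following Python sketch poses simultaneous feature selection and outlier detection as a cardinality-constrained least-squares problem under a mean-shift model, y = X beta + phi + noise, written as a big-M mixed-integer quadratic program. This is only an illustration under stated assumptions: it is not the authors' implementation, the function name sparse_mean_shift_fit and the arguments k_features, k_outliers and big_m are hypothetical, and a heuristic big-M bound does not carry the paper's optimality, tuning, or warm-start guarantees.

import numpy as np
import cvxpy as cp

def sparse_mean_shift_fit(X, y, k_features, k_outliers, big_m=10.0):
    # Hypothetical big-M MIQP: at most k_features nonzero coefficients and
    # at most k_outliers nonzero mean shifts under y = X @ beta + phi + noise.
    n, p = X.shape
    beta = cp.Variable(p)                 # regression coefficients
    phi = cp.Variable(n)                  # case-specific mean shifts
    z = cp.Variable(p, boolean=True)      # z[j] = 1 if feature j enters the model
    w = cp.Variable(n, boolean=True)      # w[i] = 1 if observation i is flagged
    constraints = [
        cp.abs(beta) <= big_m * z,        # beta[j] forced to 0 unless z[j] = 1
        cp.abs(phi) <= big_m * w,         # phi[i] forced to 0 unless w[i] = 1
        cp.sum(z) <= k_features,          # sparsity budget on features
        cp.sum(w) <= k_outliers,          # trimming budget on observations
    ]
    objective = cp.Minimize(cp.sum_squares(y - X @ beta - phi))
    # Requires a mixed-integer QP solver (e.g., SCIP or GUROBI); cvxpy raises
    # SolverError if none is installed.
    cp.Problem(objective, constraints).solve()
    return beta.value, phi.value, z.value, w.value

# Small synthetic check: 3 active features, 2 contaminated responses.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 20))
beta_true = np.zeros(20)
beta_true[:3] = 2.0
y = X @ beta_true + 0.1 * rng.standard_normal(50)
y[:2] += 5.0
beta_hat, phi_hat, z_hat, w_hat = sparse_mean_shift_fit(X, y, k_features=3, k_outliers=2)

The nonzero pattern of z_hat indicates the selected features and that of w_hat the flagged observations; in the paper, the corresponding integer budgets are tuned with dedicated, computationally efficient procedures rather than fixed in advance.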

Suggested Citation

  • Luca Insolia & Ana Kenney & Francesca Chiaromonte & Giovanni Felici, 2022. "Simultaneous feature selection and outlier detection with optimality guarantees," Biometrics, The International Biometric Society, vol. 78(4), pages 1592-1603, December.
  • Handle: RePEc:bla:biomet:v:78:y:2022:i:4:p:1592-1603
    DOI: 10.1111/biom.13553

    Download full text from publisher

    File URL: https://doi.org/10.1111/biom.13553
    Download Restriction: no

    File URL: https://libkey.io/10.1111/biom.13553?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Menjoge, Rajiv S. & Welsch, Roy E., 2010. "A diagnostic method for simultaneous feature selection and outlier identification in linear regression," Computational Statistics & Data Analysis, Elsevier, vol. 54(12), pages 3181-3193, December.
    2. Hadi, Ali S. & Luceno, Alberto, 1997. "Maximum trimmed likelihood estimators: a unified approach, examples, and algorithms," Computational Statistics & Data Analysis, Elsevier, vol. 25(3), pages 251-272, August.
    3. She, Yiyuan & Owen, Art B., 2011. "Outlier Detection Using Nonconvex Penalized Regression," Journal of the American Statistical Association, American Statistical Association, vol. 106(494), pages 626-639.
    4. Gatu, Cristian & Yanev, Petko I. & Kontoghiorghes, Erricos J., 2007. "A graph approach to generate all possible regression submodels," Computational Statistics & Data Analysis, Elsevier, vol. 52(2), pages 799-815, October.
    5. Muller, Samuel & Welsh, A.H., 2005. "Outlier Robust Model Selection in Linear Regression," Journal of the American Statistical Association, American Statistical Association, vol. 100, pages 1297-1310, December.
    6. G. Zioutas & L. Pitsoulis & A. Avramidis, 2009. "Quadratic mixed integer programming and support vectors for deleting outliers in robust regression," Annals of Operations Research, Springer, vol. 166(1), pages 339-353, February.
    7. Xiaotong Shen & Wei Pan & Yunzhang Zhu, 2012. "Likelihood-Based Selection and Sharp Parameter Estimation," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 107(497), pages 223-232, March.
    8. Fan, Jianqing & Li, Runze, 2001. "Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 96, pages 1348-1360, December.
    9. Smucler, Ezequiel & Yohai, Victor J., 2017. "Robust and sparse estimators for linear regression models," Computational Statistics & Data Analysis, Elsevier, vol. 111(C), pages 116-130.
    10. Bernholt, Thorsten, 2006. "Robust Estimators are Hard to Compute," Technical Reports 2005,52, Technische Universität Dortmund, Sonderforschungsbereich 475: Komplexitätsreduktion in multivariaten Datenstrukturen.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Kepplinger, David, 2023. "Robust variable selection and estimation via adaptive elastic net S-estimators for linear regression," Computational Statistics & Data Analysis, Elsevier, vol. 183(C).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Luca Insolia & Ana Kenney & Martina Calovi & Francesca Chiaromonte, 2021. "Robust Variable Selection with Optimality Guarantees for High-Dimensional Logistic Regression," Stats, MDPI, vol. 4(3), pages 1-17, August.
    2. Thompson, Ryan, 2022. "Robust subset selection," Computational Statistics & Data Analysis, Elsevier, vol. 169(C).
    3. Tianxiang Liu & Ting Kei Pong & Akiko Takeda, 2019. "A refined convergence analysis of pDCA_e with applications to simultaneous sparse recovery and outlier detection," Computational Optimization and Applications, Springer, vol. 73(1), pages 69-100, May.
    4. Umberto Amato & Anestis Antoniadis & Italia De Feis & Irene Gijbels, 2021. "Penalised robust estimators for sparse and high-dimensional linear models," Statistical Methods & Applications, Springer;Società Italiana di Statistica, vol. 30(1), pages 1-48, March.
    5. Yize Zhao & Matthias Chung & Brent A. Johnson & Carlos S. Moreno & Qi Long, 2016. "Hierarchical Feature Selection Incorporating Known and Novel Biological Information: Identifying Genomic Features Related to Prostate Cancer Recurrence," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 111(516), pages 1427-1439, October.
    6. Xiaotong Shen & Wei Pan & Yunzhang Zhu & Hui Zhou, 2013. "On constrained and regularized high-dimensional regression," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 65(5), pages 807-832, October.
    7. Chenchen Ma & Jing Ouyang & Gongjun Xu, 2023. "Learning Latent and Hierarchical Structures in Cognitive Diagnosis Models," Psychometrika, Springer;The Psychometric Society, vol. 88(1), pages 175-207, March.
    8. Canhong Wen & Xueqin Wang & Shaoli Wang, 2015. "Laplace Error Penalty-based Variable Selection in High Dimension," Scandinavian Journal of Statistics, Danish Society for Theoretical Statistics;Finnish Statistical Society;Norwegian Statistical Association;Swedish Statistical Association, vol. 42(3), pages 685-700, September.
    9. Yongli Zhang & Xiaotong Shen, 2015. "Adaptive Modeling Procedure Selection by Data Perturbation," Journal of Business & Economic Statistics, Taylor & Francis Journals, vol. 33(4), pages 541-551, October.
    10. Yingli Pan & Zhan Liu & Guangyu Song, 2021. "Outlier detection under a covariate-adjusted exponential regression model with censored data," Computational Statistics, Springer, vol. 36(2), pages 961-976, June.
    11. Junlong Zhao & Chao Liu & Lu Niu & Chenlei Leng, 2019. "Multiple influential point detection in high dimensional regression spaces," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 81(2), pages 385-408, April.
    12. Z. John Daye & Jinbo Chen & Hongzhe Li, 2012. "High-Dimensional Heteroscedastic Regression with an Application to eQTL Data Analysis," Biometrics, The International Biometric Society, vol. 68(1), pages 316-326, March.
    13. Chun Wang & Jing Lu, 2021. "Learning Attribute Hierarchies From Data: Two Exploratory Approaches," Journal of Educational and Behavioral Statistics, vol. 46(1), pages 58-84, February.
    14. Dai, Linlin & Chen, Kani & Sun, Zhihua & Liu, Zhenqiu & Li, Gang, 2018. "Broken adaptive ridge regression and its asymptotic properties," Journal of Multivariate Analysis, Elsevier, vol. 168(C), pages 334-351.
    15. Alexander Robitzsch, 2022. "Comparing the Robustness of the Structural after Measurement (SAM) Approach to Structural Equation Modeling (SEM) against Local Model Misspecifications with Alternative Estimation Approaches," Stats, MDPI, vol. 5(3), pages 1-42, July.
    16. Jun Zhao & Guan’ao Yan & Yi Zhang, 2022. "Robust estimation and shrinkage in ultrahigh dimensional expectile regression with heavy tails and variance heterogeneity," Statistical Papers, Springer, vol. 63(1), pages 1-28, February.
    17. Farnè, Matteo & Vouldis, Angelos T., 2018. "A methodology for automised outlier detection in high-dimensional datasets: an application to euro area banks' supervisory data," Working Paper Series 2171, European Central Bank.
    18. Kwon, Sunghoon & Oh, Seungyoung & Lee, Youngjo, 2016. "The use of random-effect models for high-dimensional variable selection problems," Computational Statistics & Data Analysis, Elsevier, vol. 103(C), pages 401-412.
    19. Sunkyung Kim & Wei Pan & Xiaotong Shen, 2013. "Network-Based Penalized Regression With Application to Genomic Data," Biometrics, The International Biometric Society, vol. 69(3), pages 582-593, September.
    20. Wang, Yibo & Karunamuni, Rohana J., 2022. "High-dimensional robust regression with Lq-loss functions," Computational Statistics & Data Analysis, Elsevier, vol. 176(C).


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:bla:biomet:v:78:y:2022:i:4:p:1592-1603. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Wiley Content Delivery (email available below). General contact details of provider: http://www.blackwellpublishing.com/journal.asp?ref=0006-341X.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.