
A Double-Penalized Estimator to Combat Separation and Multicollinearity in Logistic Regression

Author

Listed:
  • Ying Guan

    (School of Science, Kunming University of Science and Technology, Kunming 650500, China; Center for Applied Statistics, Kunming University of Science and Technology, Kunming 650500, China)

  • Guang-Hui Fu

    (School of Science, Kunming University of Science and Technology, Kunming 650500, China; Center for Applied Statistics, Kunming University of Science and Technology, Kunming 650500, China)

Abstract

When developing prediction models for small or sparse binary data with many highly correlated covariates, logistic regression often encounters separation or multicollinearity problems, resulting in serious bias and even the nonexistence of standard maximum likelihood estimates. The combination of separation and multicollinearity makes logistic regression even more difficult, yet only a few studies have addressed the two problems simultaneously. In this paper, we propose a double-penalized method called lFRE to combat separation and multicollinearity in logistic regression. lFRE combines the log F-type penalty with the ridge penalty. The results indicate that, compared with other penalty methods, lFRE not only effectively removes bias from predicted probabilities but also provides the minimum mean squared prediction error. In addition, a real dataset is employed to compare the performance of lFRE with that of several existing methods. The results show that lFRE is highly competitive and can be used as an alternative algorithm in logistic regression to solve separation and multicollinearity problems.
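The abstract describes lFRE as the logistic log-likelihood penalized by both a log F-type term and a ridge term. The sketch below is only a rough illustration of that general idea, not the authors' lFRE implementation: it assumes the log-F(m, m) penalty enters in its usual additive form and that a ridge term is applied to the slope coefficients, with the intercept left unpenalized. The function names, the tuning values m and lam, and the toy data are illustrative assumptions.

# Illustrative sketch (not the authors' code): logistic regression with a
# log-F(m, m) penalty plus a ridge penalty, fitted by numerical optimization.
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

def neg_penalized_loglik(beta, X, y, m=1.0, lam=0.5):
    """Negative penalized log-likelihood: logistic loss minus a
    log-F(m, m) log-prior on each slope, plus a ridge term."""
    eta = X @ beta
    # Bernoulli log-likelihood in a numerically stable form
    loglik = np.sum(y * eta - np.logaddexp(0.0, eta))
    # Penalize slopes only (beta[1:]); the intercept beta[0] is free,
    # which is a common convention but an assumption here.
    b = beta[1:]
    logF = np.sum(0.5 * m * b - m * np.logaddexp(0.0, b))
    ridge = lam * np.sum(b ** 2)
    return -(loglik + logF) + ridge

def fit_double_penalized(X, y, m=1.0, lam=0.5):
    """Fit by BFGS; X should include a leading column of ones."""
    beta0 = np.zeros(X.shape[1])
    res = minimize(neg_penalized_loglik, beta0, args=(X, y, m, lam),
                   method="BFGS")
    return res.x

# Toy usage on completely separated data, where ordinary maximum
# likelihood diverges but the doubly penalized estimate stays finite.
rng = np.random.default_rng(0)
x = np.sort(rng.normal(size=20))
y = (x > 0).astype(float)                  # perfect separation
X = np.column_stack([np.ones_like(x), x])
beta_hat = fit_double_penalized(X, y, m=1.0, lam=0.5)
print(beta_hat, expit(X @ beta_hat)[:3])

Both penalty terms are maximized (least punitive) at zero, so the combined objective shrinks coefficients toward zero and keeps the estimate finite under separation, which is the behaviour the abstract attributes to lFRE; the exact weighting between the two penalties used in the paper is not reproduced here.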

Suggested Citation

  • Ying Guan & Guang-Hui Fu, 2022. "A Double-Penalized Estimator to Combat Separation and Multicollinearity in Logistic Regression," Mathematics, MDPI, vol. 10(20), pages 1-19, October.
  • Handle: RePEc:gam:jmathe:v:10:y:2022:i:20:p:3824-:d:943815

    Download full text from publisher

    File URL: https://www.mdpi.com/2227-7390/10/20/3824/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2227-7390/10/20/3824/
    Download Restriction: no

    References listed on IDEAS

    1. Rousseeuw, Peter J. & Christmann, Andreas, 2003. "Robustness against separation and outliers in logistic regression," Computational Statistics & Data Analysis, Elsevier, vol. 43(3), pages 315-332, July.
    2. Zorn, Christopher, 2005. "A Solution to Separation in Binary Response Models," Political Analysis, Cambridge University Press, vol. 13(2), pages 157-170, April.
    3. Friedman, Jerome H. & Hastie, Trevor & Tibshirani, Rob, 2010. "Regularization Paths for Generalized Linear Models via Coordinate Descent," Journal of Statistical Software, Foundation for Open Access Statistics, vol. 33(1).
    4. Emmanuel O. Ogundimu, 2019. "Prediction of default probability by using statistical models for rare events," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 182(4), pages 1143-1162, October.
    5. S. le Cessie & J. C. van Houwelingen, 1992. "Ridge Estimators in Logistic Regression," Journal of the Royal Statistical Society Series C, Royal Statistical Society, vol. 41(1), pages 191-201, March.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Luca Insolia & Ana Kenney & Martina Calovi & Francesca Chiaromonte, 2021. "Robust Variable Selection with Optimality Guarantees for High-Dimensional Logistic Regression," Stats, MDPI, vol. 4(3), pages 1-17, August.
    2. Christopher J Greenwood & George J Youssef & Primrose Letcher & Jacqui A Macdonald & Lauryn J Hagg & Ann Sanson & Jenn Mcintosh & Delyse M Hutchinson & John W Toumbourou & Matthew Fuller-Tyszkiewicz &, 2020. "A comparison of penalised regression methods for informing the selection of predictive markers," PLOS ONE, Public Library of Science, vol. 15(11), pages 1-14, November.
    3. Zanin, Luca, 2020. "Combining multiple probability predictions in the presence of class imbalance to discriminate between potential bad and good borrowers in the peer-to-peer lending market," Journal of Behavioral and Experimental Finance, Elsevier, vol. 25(C).
    4. Kamiar Rahnama Rad & Arian Maleki, 2020. "A scalable estimate of the out‐of‐sample prediction error via approximate leave‐one‐out cross‐validation," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 82(4), pages 965-996, September.
    5. Adam C. Sales & Ben B. Hansen & Brian Rowan, 2018. "Rebar: Reinforcing a Matching Estimator With Predictions From High-Dimensional Covariates," Journal of Educational and Behavioral Statistics, vol. 43(1), pages 3-31, February.
    6. Tutz, Gerhard & Leitenstorfer, Florian, 2006. "Response shrinkage estimators in binary regression," Computational Statistics & Data Analysis, Elsevier, vol. 50(10), pages 2878-2901, June.
    7. Faisal Zahid & Gerhard Tutz, 2013. "Ridge estimation for multinomial logit models with symmetric side constraints," Computational Statistics, Springer, vol. 28(3), pages 1017-1034, June.
    8. Tutz, Gerhard & Pößnecker, Wolfgang & Uhlmann, Lorenz, 2015. "Variable selection in general multinomial logit models," Computational Statistics & Data Analysis, Elsevier, vol. 82(C), pages 207-222.
    9. Ernesto Carrella & Richard M. Bailey & Jens Koed Madsen, 2018. "Indirect inference through prediction," Papers 1807.01579, arXiv.org.
    10. Marianna Belloc & Francesco Drago & Roberto Galbiati, 2016. "Earthquakes, Religion, and Transition to Self-Government in Italian Cities," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 131(4), pages 1875-1926.
    11. Rui Wang & Naihua Xiu & Kim-Chuan Toh, 2021. "Subspace quadratic regularization method for group sparse multinomial logistic regression," Computational Optimization and Applications, Springer, vol. 79(3), pages 531-559, July.
    12. Mkhadri, Abdallah & Ouhourane, Mohamed, 2013. "An extended variable inclusion and shrinkage algorithm for correlated variables," Computational Statistics & Data Analysis, Elsevier, vol. 57(1), pages 631-644.
    13. Masakazu Higuchi & Mitsuteru Nakamura & Shuji Shinohara & Yasuhiro Omiya & Takeshi Takano & Daisuke Mizuguchi & Noriaki Sonota & Hiroyuki Toda & Taku Saito & Mirai So & Eiji Takayama & Hiroo Terashi &, 2022. "Detection of Major Depressive Disorder Based on a Combination of Voice Features: An Exploratory Approach," IJERPH, MDPI, vol. 19(18), pages 1-13, September.
    14. Susan Athey & Guido W. Imbens & Stefan Wager, 2018. "Approximate residual balancing: debiased inference of average treatment effects in high dimensions," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 80(4), pages 597-623, September.
    15. Vincent, Martin & Hansen, Niels Richard, 2014. "Sparse group lasso and high dimensional multinomial classification," Computational Statistics & Data Analysis, Elsevier, vol. 71(C), pages 771-786.
    16. Chen, Le-Yu & Lee, Sokbae, 2018. "Best subset binary prediction," Journal of Econometrics, Elsevier, vol. 206(1), pages 39-56.
    17. Perrot-Dockès Marie & Lévy-Leduc Céline & Chiquet Julien & Sansonnet Laure & Brégère Margaux & Étienne Marie-Pierre & Robin Stéphane & Genta-Jouve Grégory, 2018. "A variable selection approach in the multivariate linear model: an application to LC-MS metabolomics data," Statistical Applications in Genetics and Molecular Biology, De Gruyter, vol. 17(5), pages 1-14, October.
    18. Fan, Jianqing & Jiang, Bai & Sun, Qiang, 2022. "Bayesian factor-adjusted sparse regression," Journal of Econometrics, Elsevier, vol. 230(1), pages 3-19.
    19. Jun Li & Serguei Netessine & Sergei Koulayev, 2018. "Price to Compete … with Many: How to Identify Price Competition in High-Dimensional Space," Management Science, INFORMS, vol. 64(9), pages 4118-4136, September.
    20. Sung Jae Jun & Sokbae Lee, 2020. "Causal Inference under Outcome-Based Sampling with Monotonicity Assumptions," Papers 2004.08318, arXiv.org, revised Oct 2023.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jmathe:v:10:y:2022:i:20:p:3824-:d:943815. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.