Printed from https://ideas.repec.org/a/gam/jstats/v8y2025i2p45-d1669294.html

Evaluating Estimator Performance Under Multicollinearity: A Trade-Off Between MSE and Accuracy in Logistic, Lasso, Elastic Net, and Ridge Regression with Varying Penalty Parameters

Author

Listed:
  • H. M. Nayem

    (Department of Mathematics and Statistics, Florida International University, Miami, FL 33199, USA)

  • Sinha Aziz

    (Department of Mathematics and Statistics, Florida International University, Miami, FL 33199, USA)

  • B. M. Golam Kibria

    (Department of Mathematics and Statistics, Florida International University, Miami, FL 33199, USA)

Abstract

Multicollinearity in logistic regression models inflates variances and yields unreliable parameter estimates. Ridge regression, a regularized estimation technique, is frequently employed to address this issue. This study compares the performance of 23 established ridge regression estimators with Logistic Regression, Elastic-Net, Lasso, and Generalized Ridge Regression (GRR) under various levels of multicollinearity in logistic regression settings. Simulated datasets with high correlations (0.80, 0.90, 0.95, and 0.99) and real-world data (municipal and cancer remission) were analyzed. Both sets of results show that ridge estimators such as k_AL1, k_AL2, k_KL1, and k_KL2 perform strongly in terms of Mean Squared Error (MSE) and accuracy, particularly in smaller samples, while GRR is superior in large samples. The real-world data further confirm that GRR achieves the lowest MSE on the highly collinear municipal data, while ridge estimators and GRR help prevent overfitting on the small-sample cancer remission data. These results underscore the efficacy of ridge estimators and GRR in handling multicollinearity, offering reliable alternatives to traditional regression techniques, especially for datasets with high correlations and varying sample sizes.
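
The setup the abstract describes can be illustrated with a minimal sketch (not the paper's code, and not one of its 23 specific estimators): simulate two highly correlated predictors, fit a ridge-penalized logistic regression by gradient descent for several values of the penalty parameter k, and compare the MSE of the coefficient estimates against the true coefficients. The correlation level (0.95), sample size, and true coefficients below are illustrative assumptions.

```python
# Sketch: ridge-penalized logistic regression on synthetic collinear data,
# showing how the penalty k shrinks coefficients under multicollinearity.
import math
import random

random.seed(0)

def simulate(n, rho, beta):
    """Two predictors with correlation rho; binary response from a logistic model."""
    X, y = [], []
    for _ in range(n):
        z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
        x1 = z1
        x2 = rho * z1 + math.sqrt(1 - rho**2) * z2   # Corr(x1, x2) = rho
        p = 1.0 / (1.0 + math.exp(-(beta[0] * x1 + beta[1] * x2)))
        X.append((x1, x2))
        y.append(1 if random.random() < p else 0)
    return X, y

def fit_ridge_logistic(X, y, k, lr=0.1, steps=2000):
    """Gradient descent on the negative log-likelihood plus (k/2)*||b||^2."""
    b = [0.0, 0.0]
    n = len(y)
    for _ in range(steps):
        g = [k * b[0], k * b[1]]                     # gradient of the ridge penalty
        for (x1, x2), yi in zip(X, y):
            p = 1.0 / (1.0 + math.exp(-(b[0] * x1 + b[1] * x2)))
            g[0] += (p - yi) * x1
            g[1] += (p - yi) * x2
        b[0] -= lr * g[0] / n
        b[1] -= lr * g[1] / n
    return b

beta_true = [1.0, 1.0]
X, y = simulate(200, rho=0.95, beta=beta_true)
for k in (0.0, 1.0, 10.0):                           # k = 0 is plain logistic regression
    b = fit_ridge_logistic(X, y, k)
    mse = sum((bi - ti) ** 2 for bi, ti in zip(b, beta_true)) / 2
    print(f"k={k:5.1f}  beta_hat=({b[0]:+.3f}, {b[1]:+.3f})  MSE={mse:.3f}")
```

Increasing k shrinks the coefficient vector toward zero; under strong collinearity this trades a little bias for a large variance reduction, which is the MSE-versus-accuracy trade-off the study quantifies across its estimators.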

Suggested Citation

  • H. M. Nayem & Sinha Aziz & B. M. Golam Kibria, 2025. "Evaluating Estimator Performance Under Multicollinearity: A Trade-Off Between MSE and Accuracy in Logistic, Lasso, Elastic Net, and Ridge Regression with Varying Penalty Parameters," Stats, MDPI, vol. 8(2), pages 1-19, May.
  • Handle: RePEc:gam:jstats:v:8:y:2025:i:2:p:45-:d:1669294

    Download full text from publisher

    File URL: https://www.mdpi.com/2571-905X/8/2/45/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2571-905X/8/2/45/
    Download Restriction: no


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Xia, Siwei & Yang, Yuehan & Yang, Hu, 2023. "High-dimensional sparse portfolio selection with nonnegative constraint," Applied Mathematics and Computation, Elsevier, vol. 443(C).
    2. Mogliani, Matteo & Simoni, Anna, 2021. "Bayesian MIDAS penalized regressions: Estimation, selection, and prediction," Journal of Econometrics, Elsevier, vol. 222(1), pages 833-860.
    3. Fang, Xiaolei & Paynabar, Kamran & Gebraeel, Nagi, 2017. "Multistream sensor fusion-based prognostics model for systems with single failure modes," Reliability Engineering and System Safety, Elsevier, vol. 159(C), pages 322-331.
    4. Belli, Edoardo, 2022. "Smoothly adaptively centered ridge estimator," Journal of Multivariate Analysis, Elsevier, vol. 189(C).
    5. Alena Skolkova, 2023. "Instrumental Variable Estimation with Many Instruments Using Elastic-Net IV," CERGE-EI Working Papers wp759, The Center for Economic Research and Graduate Education - Economics Institute, Prague.
    6. Ning Li & Hu Yang, 2021. "Nonnegative estimation and variable selection under minimax concave penalty for sparse high-dimensional linear regression models," Statistical Papers, Springer, vol. 62(2), pages 661-680, April.
    7. Wang, Jia & Cai, Xizhen & Li, Runze, 2021. "Variable selection for partially linear models via Bayesian subset modeling with diffusing prior," Journal of Multivariate Analysis, Elsevier, vol. 183(C).
    8. Wei Sun & Lexin Li, 2012. "Multiple Loci Mapping via Model-free Variable Selection," Biometrics, The International Biometric Society, vol. 68(1), pages 12-22, March.
    9. Yichao Wu, 2011. "An ordinary differential equation-based solution path algorithm," Journal of Nonparametric Statistics, Taylor & Francis Journals, vol. 23(1), pages 185-199.
    10. Tutz, Gerhard & Pößnecker, Wolfgang & Uhlmann, Lorenz, 2015. "Variable selection in general multinomial logit models," Computational Statistics & Data Analysis, Elsevier, vol. 82(C), pages 207-222.
    11. Margherita Giuzio, 2017. "Genetic algorithm versus classical methods in sparse index tracking," Decisions in Economics and Finance, Springer;Associazione per la Matematica, vol. 40(1), pages 243-256, November.
    12. Kim, Hyun Hak & Swanson, Norman R., 2018. "Mining big data using parsimonious factor, machine learning, variable selection and shrinkage methods," International Journal of Forecasting, Elsevier, vol. 34(2), pages 339-354.
    13. Shuichi Kawano, 2014. "Selection of tuning parameters in bridge regression models via Bayesian information criterion," Statistical Papers, Springer, vol. 55(4), pages 1207-1223, November.
    14. Yize Zhao & Matthias Chung & Brent A. Johnson & Carlos S. Moreno & Qi Long, 2016. "Hierarchical Feature Selection Incorporating Known and Novel Biological Information: Identifying Genomic Features Related to Prostate Cancer Recurrence," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 111(516), pages 1427-1439, October.
    15. Qianyun Li & Runmin Shi & Faming Liang, 2019. "Drug sensitivity prediction with high-dimensional mixture regression," PLOS ONE, Public Library of Science, vol. 14(2), pages 1-18, February.
    16. Changrong Yan & Dixin Zhang, 2013. "Sparse dimension reduction for survival data," Computational Statistics, Springer, vol. 28(4), pages 1835-1852, August.
    17. Gareth M. James & Peter Radchenko & Jinchi Lv, 2009. "DASSO: connections between the Dantzig selector and lasso," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 71(1), pages 127-142, January.
    18. Soave, David & Lawless, Jerald F., 2023. "Regularized regression for two phase failure time studies," Computational Statistics & Data Analysis, Elsevier, vol. 182(C).
    19. Zemin Zheng & Jie Zhang & Yang Li, 2022. "L 0 -Regularized Learning for High-Dimensional Additive Hazards Regression," INFORMS Journal on Computing, INFORMS, vol. 34(5), pages 2762-2775, September.
    20. Alexander Chudik & George Kapetanios & M. Hashem Pesaran, 2016. "Big Data Analytics: A New Perspective," CESifo Working Paper Series 5824, CESifo.
