Printed from https://ideas.repec.org/a/spr/advdac/v14y2020i1d10.1007_s11634-019-00364-9.html

Ensemble of optimal trees, random forest and random projection ensemble classification

Author

Listed:
  • Zardad Khan

    (Abdul Wali Khan University
    University of Essex)

  • Asma Gul

    (University of Essex
    Shaheed Benazir Bhutto Women University)

  • Aris Perperoglou

    (University of Essex)

  • Miftahuddin Miftahuddin

    (University of Essex
    Syiah Kuala University)

  • Osama Mahmoud

    (University of Essex
    Helwan University
    University of Bristol)

  • Werner Adler

    (University of Erlangen-Nuremberg)

  • Berthold Lausen

    (University of Essex)

Abstract

The predictive performance of a random forest is strongly associated with the strength of its individual trees and the diversity among them. An ensemble of a small number of accurate and diverse trees, provided that prediction accuracy is not compromised, also reduces the computational burden. We investigate the idea of integrating trees that are both accurate and diverse. To this end, we use out-of-bag observations from the training bootstrap samples as a validation sample to choose the best trees based on their individual performance, and then assess these trees for diversity using the Brier score on an independent validation sample. Starting from the best tree, a tree is selected for the final ensemble if its addition reduces the error of the ensemble of trees already selected. Unlike random-projection ensemble classification, our approach does not apply an implicit dimension reduction for each tree. A total of 35 benchmark classification and regression problems are used to assess the performance of the proposed method and to compare it with random forest, random-projection ensemble, node harvest, support vector machine, kNN and classification and regression tree. We compute unexplained variances or classification error rates for all methods on the corresponding data sets. Our experiments reveal that the ensemble size is reduced significantly and better results are obtained in most cases. Results of a simulation study are also given, in which four tree-style scenarios are considered to generate data sets with several structures.
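The greedy, diversity-based selection step described in the abstract can be sketched in plain Python. This is an illustrative reconstruction under stated assumptions, not the authors' implementation: `tree_probs` stands in for each tree's class-1 probabilities on the independent validation sample, assumed to be already ordered by individual (out-of-bag) accuracy, best first, and the Brier score drives the accept/reject decision.

```python
# Hedged sketch of Brier-score-based greedy tree selection.
# All names and data below are illustrative placeholders.

def brier_score(probs, labels):
    """Mean squared difference between predicted class-1 probability
    and the 0/1 label."""
    return sum((p - y) ** 2 for p, y in zip(probs, labels)) / len(labels)

def select_trees(tree_probs, labels):
    """tree_probs: per-tree probability vectors on an independent
    validation sample, ordered by individual accuracy, best first.
    A tree joins the ensemble only if adding it lowers the ensemble's
    Brier score on the validation sample."""
    selected = [tree_probs[0]]                 # start from the best tree
    best = brier_score(selected[0], labels)
    for probs in tree_probs[1:]:
        k = len(selected) + 1
        # averaged ensemble probabilities if this tree were added
        candidate = [(sum(col) + p) / k
                     for col, p in zip(zip(*selected), probs)]
        score = brier_score(candidate, labels)
        if score < best:                       # keep only diversity-adding trees
            selected.append(probs)
            best = score
    return selected, best

# Toy example: three "trees" scored on a 4-point validation sample.
labels = [1, 0, 1, 0]
tree_probs = [
    [0.9, 0.4, 0.5, 0.1],  # best individual tree
    [0.9, 0.4, 0.5, 0.1],  # duplicate: adds no diversity, rejected
    [0.5, 0.1, 0.9, 0.4],  # complementary errors: accepted
]
selected, score = select_trees(tree_probs, labels)
print(len(selected), score)
```

In this toy run the duplicate tree is rejected because averaging it in leaves the Brier score unchanged, while the third tree, whose errors are complementary to the first tree's, is accepted because averaging reduces the validation Brier score.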

Suggested Citation

  • Zardad Khan & Asma Gul & Aris Perperoglou & Miftahuddin Miftahuddin & Osama Mahmoud & Werner Adler & Berthold Lausen, 2020. "Ensemble of optimal trees, random forest and random projection ensemble classification," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 14(1), pages 97-116, March.
  • Handle: RePEc:spr:advdac:v:14:y:2020:i:1:d:10.1007_s11634-019-00364-9
    DOI: 10.1007/s11634-019-00364-9

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11634-019-00364-9
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11634-019-00364-9?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a version you can access through your library subscription

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Hapfelmeier, A. & Ulm, K., 2013. "A new variable selection approach using Random Forests," Computational Statistics & Data Analysis, Elsevier, vol. 60(C), pages 50-69.
    2. Panagiotis Tzirakis & Christos Tjortjis, 2017. "T3C: improving a decision tree classification algorithm’s interval splits on continuous attributes," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 11(2), pages 353-370, June.
    3. Timothy I. Cannings & Richard J. Samworth, 2017. "Random-projection ensemble classification," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 79(4), pages 959-1035, September.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.

    Cited by:

    1. Gerhard Tutz, 2022. "Ordinal Trees and Random Forests: Score-Free Recursive Partitioning and Improved Ensembles," Journal of Classification, Springer;The Classification Society, vol. 39(2), pages 241-263, July.
    2. Youness Manzali & Mohamed Elfar, 2023. "Random Forest Pruning Techniques: A Recent Review," SN Operations Research Forum, Springer, vol. 4(2), pages 1-14, June.
    3. Muhammed-Fatih Kaya, 2022. "Pattern Labelling of Business Communication Data," Group Decision and Negotiation, Springer, vol. 31(6), pages 1203-1234, December.
    4. Tiffany Elsten & Mark Rooij, 2022. "SUBiNN: a stacked uni- and bivariate kNN sparse ensemble," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 16(4), pages 847-874, December.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Liangyuan Hu & Lihua Li, 2022. "Using Tree-Based Machine Learning for Health Studies: Literature Review and Case Series," IJERPH, MDPI, vol. 19(23), pages 1-13, December.
    2. Weijun Wang & Dan Zhao & Liguo Fan & Yulong Jia, 2019. "Study on Icing Prediction of Power Transmission Lines Based on Ensemble Empirical Mode Decomposition and Feature Selection Optimized Extreme Learning Machine," Energies, MDPI, vol. 12(11), pages 1-21, June.
    3. Bergsma, Wicher P, 2020. "Regression with I-priors," Econometrics and Statistics, Elsevier, vol. 14(C), pages 89-111.
    4. Silke Janitza & Ender Celik & Anne-Laure Boulesteix, 2018. "A computationally fast variable importance test for random forests for high-dimensional data," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 12(4), pages 885-915, December.
    5. Lkhagvadorj Munkhdalai & Tsendsuren Munkhdalai & Oyun-Erdene Namsrai & Jong Yun Lee & Keun Ho Ryu, 2019. "An Empirical Comparison of Machine-Learning Methods on Bank Client Credit Assessments," Sustainability, MDPI, vol. 11(3), pages 1-23, January.
    6. Fuli Zhang & Kung‐Sik Chan, 2023. "Random projection ensemble classification with high‐dimensional time series," Biometrics, The International Biometric Society, vol. 79(2), pages 964-974, June.
    7. Bergsma, Wicher, 2020. "Regression with I-priors," LSE Research Online Documents on Economics 102136, London School of Economics and Political Science, LSE Library.
    8. Jiang, Qing & Hušková, Marie & Meintanis, Simos G. & Zhu, Lixing, 2019. "Asymptotics, finite-sample comparisons and applications for two-sample tests with functional data," Journal of Multivariate Analysis, Elsevier, vol. 170(C), pages 202-220.
    9. Jin Li & Maggie Tran & Justy Siwabessy, 2016. "Selecting Optimal Random Forest Predictive Models: A Case Study on Predicting the Spatial Distribution of Seabed Hardness," PLOS ONE, Public Library of Science, vol. 11(2), pages 1-29, February.
    10. Abellán, Joaquín & Baker, Rebecca M. & Coolen, Frank P.A. & Crossman, Richard J. & Masegosa, Andrés R., 2014. "Classification with decision trees from a nonparametric predictive inference perspective," Computational Statistics & Data Analysis, Elsevier, vol. 71(C), pages 789-802.
    11. Deepak Nag Ayyala & Santu Ghosh & Daniel F. Linder, 2022. "Covariance matrix testing in high dimension using random projections," Computational Statistics, Springer, vol. 37(3), pages 1111-1141, July.
    12. Saurabh Saxena & Darius Roman & Valentin Robu & David Flynn & Michael Pecht, 2021. "Battery Stress Factor Ranking for Accelerated Degradation Test Planning Using Machine Learning," Energies, MDPI, vol. 14(3), pages 1-17, January.
    13. Fellinghauer, Bernd & Bühlmann, Peter & Ryffel, Martin & von Rhein, Michael & Reinhardt, Jan D., 2013. "Stable graphical model estimation with Random Forests for discrete, continuous, and mixed variables," Computational Statistics & Data Analysis, Elsevier, vol. 64(C), pages 132-152.
    14. Bryan Keller, 2020. "Variable Selection for Causal Effect Estimation: Nonparametric Conditional Independence Testing With Random Forests," Journal of Educational and Behavioral Statistics, , vol. 45(2), pages 119-142, April.
    15. Hermel Homburger & Manuel K Schneider & Sandra Hilfiker & Andreas Lüscher, 2014. "Inferring Behavioral States of Grazing Livestock from High-Frequency Position Data Alone," PLOS ONE, Public Library of Science, vol. 9(12), pages 1-22, December.
    16. Yatracos, Yannis G., 2018. "Residual's Influence Index (RINFIN), Bad Leverage and Unmasking in High Dimensional L2-Regression," IRTG 1792 Discussion Papers 2018-060, Humboldt University of Berlin, International Research Training Group 1792 "High Dimensional Nonstationary Time Series".
    17. Ingrida Vaiciulyte & Zivile Kalsyte & Leonidas Sakalauskas & Darius Plikynas, 2017. "Assessment of market reaction on the share performance on the basis of its visualization in 2D space," Journal of Business Economics and Management, Taylor & Francis Journals, vol. 18(2), pages 309-318, March.
    18. Polasek, Tomas & Čadík, Martin, 2023. "Predicting photovoltaic power production using high-uncertainty weather forecasts," Applied Energy, Elsevier, vol. 339(C).
    19. Hapfelmeier, Alexander & Hornung, Roman & Haller, Bernhard, 2023. "Efficient permutation testing of variable importance measures by the example of random forests," Computational Statistics & Data Analysis, Elsevier, vol. 181(C).
    20. Dogah, Kingsley E. & Premaratne, Gamini, 2018. "Sectoral exposure of financial markets to oil risk factors in BRICS countries," Energy Economics, Elsevier, vol. 76(C), pages 228-256.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:advdac:v:14:y:2020:i:1:d:10.1007_s11634-019-00364-9. See general information about how to correct material in RePEc.


    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing. General contact details of provider: http://www.springer.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.