Printed from https://ideas.repec.org/a/gam/jmathe/v13y2025i4p587-d1588218.html

Foundations and Innovations in Data Fusion and Ensemble Learning for Effective Consensus

Author

Listed:
  • Ke-Lin Du

    (School of Mechanical and Electrical Engineering, Guangdong University of Science and Technology, Dongguan 523668, China)

  • Rengong Zhang

    (Zhejiang Yugong Information Technology Co., Ltd., Changhe Road 475, Hangzhou 310002, China)

  • Bingchun Jiang

    (School of Mechanical and Electrical Engineering, Guangdong University of Science and Technology, Dongguan 523668, China)

  • Jie Zeng

    (Shenzhen Feng Xing Tai Bao Technology Co., Ltd., Shenzhen 518063, China)

  • Jiabin Lu

    (Faculty of Electromechanical Engineering, Guangdong University of Technology, Guangzhou 510006, China)

Abstract

Ensemble learning and data fusion techniques play a crucial role in modern machine learning, enhancing predictive performance, robustness, and generalization. This paper provides a comprehensive survey of ensemble methods, covering foundational techniques such as bagging, boosting, and random forests, as well as advanced topics including multiclass classification, multiview learning, multiple kernel learning, and the Dempster–Shafer theory of evidence. We present a comparative analysis of ensemble learning and deep learning, highlighting their respective strengths, limitations, and synergies. Additionally, we examine the theoretical foundations of ensemble methods, including bias–variance trade-offs, margin theory, and optimization-based frameworks, while analyzing computational trade-offs related to training complexity, inference efficiency, and storage requirements. To enhance accessibility, we provide a structured comparative summary of key ensemble techniques. Furthermore, we discuss emerging research directions, such as adaptive ensemble methods, hybrid deep learning approaches, and multimodal data fusion, as well as challenges related to interpretability, model selection, and handling noisy data in high-stakes applications. By integrating theoretical insights with practical considerations, this survey serves as a valuable resource for researchers and practitioners seeking to understand the evolving landscape of ensemble learning and its future prospects.
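Among the foundational techniques the abstract lists, bagging (bootstrap aggregation) is the simplest to make concrete. The following is a minimal from-scratch sketch, not code from the surveyed paper: it trains decision stumps on bootstrap resamples of a toy 1-D dataset and aggregates them by majority vote. All function names and the toy data are invented for illustration.

```python
import random
from collections import Counter

def fit_stump(X, y):
    """Fit a 1-D decision stump: choose the threshold and direction
    that minimize training error on (X, y), with labels in {0, 1}."""
    best_err, best_t, best_sign = float("inf"), X[0], 1
    for t in sorted(set(X)):
        for sign in (1, -1):
            err = sum((1 if sign * (x - t) > 0 else 0) != yi
                      for x, yi in zip(X, y))
            if err < best_err:
                best_err, best_t, best_sign = err, t, sign
    t, sign = best_t, best_sign
    return lambda x: 1 if sign * (x - t) > 0 else 0

def bagging_fit(X, y, n_estimators=25, seed=0):
    """Bagging: train each base learner on a bootstrap resample."""
    rng = random.Random(seed)
    n = len(X)
    models = []
    for _ in range(n_estimators):
        idx = [rng.randrange(n) for _ in range(n)]  # sample with replacement
        models.append(fit_stump([X[i] for i in idx], [y[i] for i in idx]))
    return models

def bagging_predict(models, x):
    """Aggregate the ensemble's predictions by majority vote."""
    votes = Counter(m(x) for m in models)
    return votes.most_common(1)[0][0]

# Toy 1-D data: class 0 below 0.5, class 1 above.
X = [0.05, 0.1, 0.2, 0.3, 0.4, 0.6, 0.7, 0.8, 0.9, 0.95]
y = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]
ensemble = bagging_fit(X, y)
```

Because each stump sees a different resample, their individual errors are partly decorrelated and the vote reduces variance, which is the bias-variance argument the survey develops for bagging and random forests.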

Suggested Citation

  • Ke-Lin Du & Rengong Zhang & Bingchun Jiang & Jie Zeng & Jiabin Lu, 2025. "Foundations and Innovations in Data Fusion and Ensemble Learning for Effective Consensus," Mathematics, MDPI, vol. 13(4), pages 1-49, February.
  • Handle: RePEc:gam:jmathe:v:13:y:2025:i:4:p:587-:d:1588218

    Download full text from publisher

    File URL: https://www.mdpi.com/2227-7390/13/4/587/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2227-7390/13/4/587/
    Download Restriction: no

    References listed on IDEAS

    1. Ke-Lin Du & M. N. S. Swamy & Zhang-Quan Wang & Wai Ho Mow, 2023. "Matrix Factorization Techniques in Machine Learning, Signal Processing, and Statistics," Mathematics, MDPI, vol. 11(12), pages 1-50, June.
    2. Minerva Mukhopadhyay & David B. Dunson, 2020. "Targeted Random Projection for Prediction From High-Dimensional Features," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 115(532), pages 1998-2010, December.
    3. Paul Horst, 1961. "Relations among m sets of measures," Psychometrika, Springer;The Psychometric Society, vol. 26(2), pages 129-149, June.
    4. Blaser, Rico & Fryzlewicz, Piotr, 2021. "Regularizing axis-aligned ensembles via data rotations that favor simpler learners," LSE Research Online Documents on Economics 107935, London School of Economics and Political Science, LSE Library.
    5. Lin, Yi & Jeon, Yongho, 2006. "Random Forests and Adaptive Nearest Neighbors," Journal of the American Statistical Association, American Statistical Association, vol. 101, pages 578-590, June.
    6. Blaser, Rico & Fryzlewicz, Piotr, 2016. "Random rotation ensembles," LSE Research Online Documents on Economics 62182, London School of Economics and Political Science, LSE Library.
    7. Peter Hall & Andrew P. Robinson, 2009. "Reducing variability of crossvalidation for smoothing-parameter choice," Biometrika, Biometrika Trust, vol. 96(1), pages 175-186.
    8. Sexton, Joseph & Laake, Petter, 2009. "Standard errors for bagged and random forest estimators," Computational Statistics & Data Analysis, Elsevier, vol. 53(3), pages 801-811, January.
    9. Jing Lei & Larry Wasserman, 2014. "Distribution-free prediction bands for non-parametric regression," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 76(1), pages 71-96, January.
    10. Jaouad Mourtada & Stéphane Gaïffas & Erwan Scornet, 2021. "AMF: Aggregated Mondrian forests for online learning," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 83(3), pages 505-533, July.
    11. Yoonsuh Jung & Jianhua Hu, 2015. "A K-fold averaging cross-validation procedure," Journal of Nonparametric Statistics, Taylor & Francis Journals, vol. 27(2), pages 167-179, June.
    12. Ke-Lin Du & Bingchun Jiang & Jiabin Lu & Jingyu Hua & M. N. S. Swamy, 2024. "Exploring Kernel Machines and Support Vector Machines: Principles, Techniques, and Future Directions," Mathematics, MDPI, vol. 12(24), pages 1-58, December.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. David M. Ritzwoller & Vasilis Syrgkanis, 2024. "Simultaneous Inference for Local Structural Parameters with Random Forests," Papers 2405.07860, arXiv.org, revised Sep 2024.
    2. Susan Athey & Julie Tibshirani & Stefan Wager, 2016. "Generalized Random Forests," Papers 1610.01271, arXiv.org, revised Apr 2018.
    3. Ke-Lin Du & Rengong Zhang & Bingchun Jiang & Jie Zeng & Jiabin Lu, 2025. "Understanding Machine Learning Principles: Learning, Inference, Generalization, and Computational Learning Theory," Mathematics, MDPI, vol. 13(3), pages 1-56, January.
    4. Lei Bill Wang & Zhenbang Jiao & Fangyi Wang, 2025. "Policy-Oriented Binary Classification: Improving (KD-)CART Final Splits for Subpopulation Targeting," Papers 2502.15072, arXiv.org, revised Oct 2025.
    5. Wang, Qing & Chen, Shiwen, 2015. "A general class of linearly extrapolated variance estimators," Statistics & Probability Letters, Elsevier, vol. 98(C), pages 29-38.
    6. Patrick Krennmair & Timo Schmid, 2022. "Flexible domain prediction using mixed effects random forests," Journal of the Royal Statistical Society Series C, Royal Statistical Society, vol. 71(5), pages 1865-1894, November.
    7. Hanafi, Mohamed & Kiers, Henk A.L., 2006. "Analysis of K sets of data, with differential emphasis on agreement between and within sets," Computational Statistics & Data Analysis, Elsevier, vol. 51(3), pages 1491-1508, December.
    8. Pietro Amenta & Antonio Lucadamo & Antonello D’Ambra, 2021. "Restricted Common Component and Specific Weight Analysis: A Constrained Explorative Approach for the Customer Satisfaction Evaluation," Social Indicators Research: An International and Interdisciplinary Journal for Quality-of-Life Measurement, Springer, vol. 156(2), pages 409-427, August.
    9. Goldstein Benjamin A & Polley Eric C & Briggs Farren B. S., 2011. "Random Forests for Genetic Association Studies," Statistical Applications in Genetics and Molecular Biology, De Gruyter, vol. 10(1), pages 1-34, July.
    10. Biewen, Martin & Kugler, Philipp, 2021. "Two-stage least squares random forests with an application to Angrist and Evans (1998)," Economics Letters, Elsevier, vol. 204(C).
    11. Borup, Daniel & Christensen, Bent Jesper & Mühlbach, Nicolaj Søndergaard & Nielsen, Mikkel Slot, 2023. "Targeting predictors in random forest regression," International Journal of Forecasting, Elsevier, vol. 39(2), pages 841-868.
    12. Nayiri Galestian Pour & Soudabeh Shemehsavar, 2024. "Learning from high dimensional data based on weighted feature importance in decision tree ensembles," Computational Statistics, Springer, vol. 39(1), pages 313-342, February.
    13. Lei-Hong Zhang & Li-Zhi Liao & Li-Ming Sun, 2011. "Towards the global solution of the maximal correlation problem," Journal of Global Optimization, Springer, vol. 49(1), pages 91-107, January.
    14. Jerinsh Jeyapaulraj & Dhruv Desai & Peter Chu & Dhagash Mehta & Stefano Pasquali & Philip Sommer, 2022. "Supervised similarity learning for corporate bonds using Random Forest proximities," Papers 2207.04368, arXiv.org, revised Oct 2022.
    15. László Györfi & Tamás Linder & Harro Walk, 2025. "Distribution-free tests for lossless feature selection in classification and regression," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 34(1), pages 262-287, March.
    16. repec:plo:pone00:0191435 is not listed on IDEAS
    17. Sexton, Joseph & Laake, Petter, 2009. "Standard errors for bagged and random forest estimators," Computational Statistics & Data Analysis, Elsevier, vol. 53(3), pages 801-811, January.
    18. Stan Lipovetsky, 2022. "Canonical Concordance Correlation Analysis," Mathematics, MDPI, vol. 11(1), pages 1-12, December.
    19. Chu, Chi-Yang & Henderson, Daniel J. & Parmeter, Christopher F., 2017. "On discrete Epanechnikov kernel functions," Computational Statistics & Data Analysis, Elsevier, vol. 116(C), pages 79-105.
    20. Walter Kristof, 1967. "Orthogonal inter-battery factor analysis," Psychometrika, Springer;The Psychometric Society, vol. 32(2), pages 199-227, June.
    21. Victor Chernozhukov & Kaspar Wuthrich & Yinchu Zhu, 2019. "Distributional conformal prediction," Papers 1909.07889, arXiv.org, revised Aug 2021.

    More about this item



    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jmathe:v:13:y:2025:i:4:p:587-:d:1588218. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.