Printed from https://ideas.repec.org/a/eee/stapro/v119y2016icp91-100.html

An asymptotically optimal kernel combined classifier

Author

Listed:
  • Mojirsheibani, Majid
  • Kong, Jiajie

Abstract

A kernel ensemble classifier is developed for accurate classification based on several initial classifiers. A data-driven choice of the smoothing parameter of the kernel is considered and the resulting classifier is shown to be asymptotically optimal. Therefore, the proposed combined classifier asymptotically outperforms each individual classifier.
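The combining idea can be sketched roughly as follows: a new point is classified by a kernel-weighted vote over the training points, where similarity is measured between the vectors of the initial classifiers' predictions. This is a minimal illustration only, not the paper's exact estimator; the function name, the Gaussian kernel, and the fixed bandwidth `h` are assumptions (the paper derives a data-driven choice of the smoothing parameter).

```python
import numpy as np

def kernel_combined_classify(train_preds, train_labels, new_preds, h=0.5):
    """Kernel-combined classification in the spirit of Mojirsheibani-style
    combiners: weight each training point by a kernel of the distance
    between its vector of base-classifier predictions and that of the
    new point, then take a weighted majority vote.

    train_preds  : (n, M) array, M base classifiers' predictions on n
                   training points
    train_labels : (n,) array of 0/1 class labels
    new_preds    : (M,) array, base classifiers' predictions at the new point
    h            : kernel bandwidth (fixed here for illustration; chosen
                   data-adaptively in the paper)
    """
    # Squared distance between prediction vectors, one per training point
    d2 = np.sum((train_preds - new_preds) ** 2, axis=1)
    # Gaussian kernel weights
    w = np.exp(-d2 / (2.0 * h ** 2))
    # Weighted vote: predict class 1 iff the weighted label average exceeds 1/2
    return int(np.dot(w, train_labels) > 0.5 * w.sum())
```

In this toy form, a training point whose base classifiers behaved like those at the new point gets weight near 1, while disagreeing points are damped exponentially; the asymptotic-optimality result in the paper concerns the smoothed combiner with the data-driven bandwidth, not this fixed-`h` sketch.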

Suggested Citation

  • Mojirsheibani, Majid & Kong, Jiajie, 2016. "An asymptotically optimal kernel combined classifier," Statistics & Probability Letters, Elsevier, vol. 119(C), pages 91-100.
  • Handle: RePEc:eee:stapro:v:119:y:2016:i:c:p:91-100
    DOI: 10.1016/j.spl.2016.07.017

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0167715216301304
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.spl.2016.07.017?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. De Bock, Koen W. & Coussement, Kristof & Van den Poel, Dirk, 2010. "Ensemble classification based on generalized additive models," Computational Statistics & Data Analysis, Elsevier, vol. 54(6), pages 1535-1546, June.
    2. Lin, Yi & Jeon, Yongho, 2006. "Random Forests and Adaptive Nearest Neighbors," Journal of the American Statistical Association, American Statistical Association, vol. 101, pages 578-590, June.
    3. van der Laan Mark J. & Polley Eric C & Hubbard Alan E., 2007. "Super Learner," Statistical Applications in Genetics and Molecular Biology, De Gruyter, vol. 6(1), pages 1-23, September.
    4. Rokach, Lior, 2009. "Taxonomy for characterizing ensemble methods in classification tasks: A review and annotated bibliography," Computational Statistics & Data Analysis, Elsevier, vol. 53(12), pages 4046-4072, October.
    5. Narayanaswamy Balakrishnan & Majid Mojirsheibani, 2015. "A simple method for combining estimates to improve the overall error rates in classification," Computational Statistics, Springer, vol. 30(4), pages 1033-1049, December.
    6. Yang, Yuhong, 2000. "Combining Different Procedures for Adaptive Regression," Journal of Multivariate Analysis, Elsevier, vol. 74(1), pages 135-161, July.
    7. Biau, Gérard & Fischer, Aurélie & Guedj, Benjamin & Malley, James D., 2016. "COBRA: A combined regression strategy," Journal of Multivariate Analysis, Elsevier, vol. 146(C), pages 18-28.
    8. Adler, Werner & Brenning, Alexander & Potapov, Sergej & Schmid, Matthias & Lausen, Berthold, 2011. "Ensemble classification of paired data," Computational Statistics & Data Analysis, Elsevier, vol. 55(5), pages 1933-1941, May.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Narayanaswamy Balakrishnan & Majid Mojirsheibani, 2015. "A simple method for combining estimates to improve the overall error rates in classification," Computational Statistics, Springer, vol. 30(4), pages 1033-1049, December.
    2. Jasmit Shah & Somnath Datta & Susmita Datta, 2014. "A multi-loss super regression learner (MSRL) with application to survival prediction using proteomics," Computational Statistics, Springer, vol. 29(6), pages 1749-1767, December.
    3. Hoora Moradian & Denis Larocque & François Bellavance, 2017. "L1 splitting rules in survival forests," Lifetime Data Analysis: An International Journal Devoted to Statistical Methods and Applications for Time-to-Event Data, Springer, vol. 23(4), pages 671-691, October.
    4. Adler, Werner & Brenning, Alexander & Potapov, Sergej & Schmid, Matthias & Lausen, Berthold, 2011. "Ensemble classification of paired data," Computational Statistics & Data Analysis, Elsevier, vol. 55(5), pages 1933-1941, May.
    5. Aryan Bhambu & Arabin Kumar Dey, 2024. "Some variation of COBRA in sequential learning setup," Papers 2405.04539, arXiv.org.
    6. Ke-Lin Du & Rengong Zhang & Bingchun Jiang & Jie Zeng & Jiabin Lu, 2025. "Foundations and Innovations in Data Fusion and Ensemble Learning for Effective Consensus," Mathematics, MDPI, vol. 13(4), pages 1-49, February.
    7. Koen W. de Bock & Arno de Caigny, 2021. "Spline-rule ensemble classifiers with structured sparsity regularization for interpretable customer churn modeling," Post-Print hal-03391564, HAL.
    8. Goldstein Benjamin A & Polley Eric C & Briggs Farren B. S., 2011. "Random Forests for Genetic Association Studies," Statistical Applications in Genetics and Molecular Biology, De Gruyter, vol. 10(1), pages 1-34, July.
    9. Chen, Xiangmeng & Shafizadeh, Alireza & Shahbeik, Hossein & Nadian, Mohammad Hossein & Golvirdizadeh, Milad & Peng, Wanxi & Lam, Su Shiung & Tabatabaei, Meisam & Aghbashlo, Mortaza, 2025. "Enhanced bio-oil production from biomass catalytic pyrolysis using machine learning," Renewable and Sustainable Energy Reviews, Elsevier, vol. 209(C).
    10. Borup, Daniel & Christensen, Bent Jesper & Mühlbach, Nicolaj Søndergaard & Nielsen, Mikkel Slot, 2023. "Targeting predictors in random forest regression," International Journal of Forecasting, Elsevier, vol. 39(2), pages 841-868.
    11. Jerinsh Jeyapaulraj & Dhruv Desai & Peter Chu & Dhagash Mehta & Stefano Pasquali & Philip Sommer, 2022. "Supervised similarity learning for corporate bonds using Random Forest proximities," Papers 2207.04368, arXiv.org, revised Oct 2022.
    12. repec:plo:pone00:0191435 is not listed on IDEAS
    13. Sexton, Joseph & Laake, Petter, 2009. "Standard errors for bagged and random forest estimators," Computational Statistics & Data Analysis, Elsevier, vol. 53(3), pages 801-811, January.
    14. Gruber Susan & van der Laan Mark J., 2010. "A Targeted Maximum Likelihood Estimator of a Causal Effect on a Bounded Continuous Outcome," The International Journal of Biostatistics, De Gruyter, vol. 6(1), pages 1-18, August.
    15. Gruber Susan & van der Laan Mark J., 2010. "An Application of Collaborative Targeted Maximum Likelihood Estimation in Causal Inference and Genomics," The International Journal of Biostatistics, De Gruyter, vol. 6(1), pages 1-31, May.
    16. Chun-Xia Zhang & Jiang-She Zhang & Sang-Woon Kim, 2016. "PBoostGA: pseudo-boosting genetic algorithm for variable ranking and selection," Computational Statistics, Springer, vol. 31(4), pages 1237-1262, December.
    17. Joshua Rosaler & Luca Candelori & Vahagn Kirakosyan & Kharen Musaelian & Ryan Samson & Martin T. Wells & Dhagash Mehta & Stefano Pasquali, 2025. "Supervised Similarity for High-Yield Corporate Bonds with Quantum Cognition Machine Learning," Papers 2502.01495, arXiv.org.
    18. Michele Costola & Bertrand Maillet & Zhining Yuan & Xiang Zhang, 2024. "Mean–variance efficient large portfolios: a simple machine learning heuristic technique based on the two-fund separation theorem," Annals of Operations Research, Springer, vol. 334(1), pages 133-155, March.
    19. Joshua Rosaler & Dhruv Desai & Bhaskarjit Sarmah & Dimitrios Vamvourellis & Deran Onay & Dhagash Mehta & Stefano Pasquali, 2023. "Enhanced Local Explainability and Trust Scores with Random Forest Proximities," Papers 2310.12428, arXiv.org, revised Aug 2024.
    20. Coussement, Kristof & De Bock, Koen W., 2013. "Customer churn prediction in the online gambling industry: The beneficial effect of ensemble learning," Journal of Business Research, Elsevier, vol. 66(9), pages 1629-1636.
    21. David M. Ritzwoller & Vasilis Syrgkanis, 2024. "Simultaneous Inference for Local Structural Parameters with Random Forests," Papers 2405.07860, arXiv.org, revised Sep 2024.

    More about this item


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:stapro:v:119:y:2016:i:c:p:91-100. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/wps/find/journaldescription.cws_home/622892/description#description.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.