Ensemble classification based on generalized additive models

Author Info

  • De Bock, Koen W. (Ghent University, Faculty of Economics and Business Administration, Department of Marketing, B-9000 Ghent, Belgium)

  • Coussement, Kristof (Hogeschool-Universiteit Brussel (HUB), Belgium; IESEG School of Management, Université Catholique de Lille, F-59000 Lille, France)

  • Van den Poel, Dirk (Ghent University, Faculty of Economics and Business Administration, Department of Marketing, B-9000 Ghent, Belgium)

Abstract

Generalized additive models (GAMs) are a generalization of generalized linear models (GLMs) and constitute a powerful technique that has proven its ability to capture nonlinear relationships between explanatory variables and a response variable in many domains. In this paper, GAMs are proposed as base classifiers for ensemble learning. Three alternative ensemble strategies for binary classification using GAMs as base classifiers are proposed: (i) GAMbag, based on Bagging; (ii) GAMrsm, based on the Random Subspace Method (RSM); and (iii) GAMens, a combination of both. In an experimental validation performed on 12 data sets from the UCI repository, the proposed algorithms are benchmarked against a single GAM and against decision-tree-based ensemble classifiers (i.e., RSM, Bagging, Random Forest, and the recently proposed Rotation Forest). A number of conclusions can be drawn from the results. First, using an ensemble of GAMs instead of a single GAM always leads to improved prediction performance. Second, GAMrsm and GAMens perform comparably, and both outperform GAMbag. Finally, the value of using GAMs as base classifiers in an ensemble instead of standard decision trees is demonstrated: GAMbag performs comparably to ordinary Bagging, while GAMrsm and GAMens outperform RSM and Bagging and perform comparably to Random Forest and Rotation Forest. Sensitivity analyses are included for the number of member classifiers in the ensemble, the number of variables included in a random feature subspace, and the number of degrees of freedom for GAM spline estimation.
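
As a rough illustration of the three ensemble strategies named in the abstract, the sketch below combines bootstrap resampling (Bagging) and random feature subspaces (RSM) around an additive base classifier. It is not the authors' implementation: the class name AdditiveEnsemble and its parameters are hypothetical, and scikit-learn's LogisticRegression stands in for a GAM base learner (a GAM implementation such as pygam's LogisticGAM could be substituted).

    # Minimal sketch of the GAMbag / GAMrsm / GAMens ideas (illustrative only).
    # Assumes X and y are NumPy arrays and a scikit-learn-style base classifier.
    import numpy as np
    from sklearn.base import clone
    from sklearn.linear_model import LogisticRegression  # stand-in for a GAM

    class AdditiveEnsemble:
        def __init__(self, base_estimator=None, n_estimators=50,
                     bootstrap=True, n_subspace_features=None, random_state=0):
            # bootstrap=True,  n_subspace_features=None -> GAMbag-like
            # bootstrap=False, n_subspace_features=k    -> GAMrsm-like
            # bootstrap=True,  n_subspace_features=k    -> GAMens-like
            self.base_estimator = base_estimator or LogisticRegression(max_iter=1000)
            self.n_estimators = n_estimators
            self.bootstrap = bootstrap
            self.n_subspace_features = n_subspace_features
            self.random_state = random_state

        def fit(self, X, y):
            rng = np.random.default_rng(self.random_state)
            n, p = X.shape
            self.members_ = []
            for _ in range(self.n_estimators):
                # Bootstrap sample of the rows (Bagging component).
                rows = rng.integers(0, n, size=n) if self.bootstrap else np.arange(n)
                # Random subset of the columns (Random Subspace component).
                cols = (rng.choice(p, size=self.n_subspace_features, replace=False)
                        if self.n_subspace_features else np.arange(p))
                model = clone(self.base_estimator).fit(X[np.ix_(rows, cols)], y[rows])
                self.members_.append((model, cols))
            return self

        def predict_proba(self, X):
            # Average the member class probabilities (mean aggregation).
            probs = [m.predict_proba(X[:, cols]) for m, cols in self.members_]
            return np.mean(probs, axis=0)

        def predict(self, X):
            # Binary decision at the conventional 0.5 threshold.
            return (self.predict_proba(X)[:, 1] >= 0.5).astype(int)

    # Example GAMens-like configuration: bootstrapping plus random subspaces.
    # ens = AdditiveEnsemble(n_estimators=50, bootstrap=True,
    #                        n_subspace_features=5).fit(X_train, y_train)
    # scores = ens.predict_proba(X_test)[:, 1]

The number of members (n_estimators) and the subspace size (n_subspace_features) correspond to the ensemble-size and feature-subspace parameters examined in the paper's sensitivity analyses; the choice of averaged probabilities as the combination rule is an assumption of this sketch.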

Download Info

File URL: https://lirias.hubrussel.be/bitstream/123456789/3656/1/10HRP02.pdf
Download Restriction: no

Bibliographic Info

Paper provided by Hogeschool-Universiteit Brussel, Faculteit Economie en Management in its series Working Papers with number 2010/02.

Length: 30 pages
Date of creation: Feb 2010
Date of revision:
Handle: RePEc:hub:wpecon:201002

Contact details of provider:
Web page: http://research.hubrussel.be

Related research

Keywords: Data mining; Classification; Ensemble learning; GAM; UCI;

References

References listed on IDEAS
  1. A. Prinzie & D. Van Den Poel, 2007. "Random Forests for Multiclass classification: Random Multinomial Logit," Working Papers of Faculty of Economics and Business Administration, Ghent University, Belgium 07/435, Ghent University, Faculty of Economics and Business Administration.
  2. Zwane, E. N. & van der Heijden, P. G. M., 2004. "Semiparametric models for capture-recapture studies with covariates," Computational Statistics & Data Analysis, Elsevier, vol. 47(4), pages 729-743, November.
  3. Archer, Kellie J. & Kimes, Ryan V., 2008. "Empirical characterization of random forest variable importance measures," Computational Statistics & Data Analysis, Elsevier, vol. 52(4), pages 2249-2260, January.
  4. Marx, Brian D. & Eilers, Paul H. C., 1998. "Direct generalized additive modeling with penalized likelihood," Computational Statistics & Data Analysis, Elsevier, vol. 28(2), pages 193-209, August.
  5. Hothorn, Torsten & Lausen, Berthold, 2005. "Bundling classifiers by bagging trees," Computational Statistics & Data Analysis, Elsevier, vol. 49(4), pages 1068-1078, June.
  6. Croux, Christophe & Joossens, Kristel & Lemmens, Aurelie, 2007. "Trimmed bagging," Computational Statistics & Data Analysis, Elsevier, vol. 52(1), pages 362-368, September.
  7. Borra, Simone & Di Ciaccio, Agostino, 2002. "Improving nonparametric regression methods by bagging and boosting," Computational Statistics & Data Analysis, Elsevier, vol. 38(4), pages 407-420, February.
  8. Baccini, Michela & Biggeri, Annibale & Lagazio, Corrado & Lertxundi, Aitana & Saez, Marc, 2007. "Parametric and semi-parametric approaches in the analysis of short-term effects of air pollution on health," Computational Statistics & Data Analysis, Elsevier, vol. 51(9), pages 4324-4336, May.

Citations

Cited by:
  1. Adler, Werner & Brenning, Alexander & Potapov, Sergej & Schmid, Matthias & Lausen, Berthold, 2011. "Ensemble classification of paired data," Computational Statistics & Data Analysis, Elsevier, vol. 55(5), pages 1933-1941, May.
  2. K. W. De Bock & D. Van Den Poel, 2012. "Reconciling Performance and Interpretability in Customer Churn Prediction using Ensemble Learning based on Generalized Additive Models," Working Papers of Faculty of Economics and Business Administration, Ghent University, Belgium 12/805, Ghent University, Faculty of Economics and Business Administration.
  3. K. W. De Bock & D. Van Den Poel, 2011. "An empirical evaluation of rotation-based ensemble classifiers for customer churn prediction," Working Papers of Faculty of Economics and Business Administration, Ghent University, Belgium 11/717, Ghent University, Faculty of Economics and Business Administration.
  4. Christmann, Andreas & Hable, Robert, 2012. "Consistency of support vector machines using additive kernels for additive models," Computational Statistics & Data Analysis, Elsevier, vol. 56(4), pages 854-873.
  5. Coussement, Kristof & De Bock, Koen W., 2013. "Customer churn prediction in the online gambling industry: The beneficial effect of ensemble learning," Journal of Business Research, Elsevier, vol. 66(9), pages 1629-1636.
