Printed from https://ideas.repec.org/a/eee/csdana/v45y2004i3p595-607.html

Variable selection bias in regression trees with constant fits

Author

Listed:
  • Shih, Yu-Shan
  • Tsai, Hsin-Wen

Abstract

No abstract is available for this item.
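The paper concerns variable selection bias: with a CART-style exhaustive search over all binary splits, predictors offering more candidate split points are favored even when no predictor is related to the response. The following is a minimal illustrative sketch of that phenomenon (not the authors' method); the simulation setup, sample sizes, and variable names are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def best_split_sse(x, y):
    """Smallest total within-node SSE over all binary splits on x."""
    best = np.inf
    for t in np.unique(x)[:-1]:  # candidate thresholds between distinct values
        left, right = y[x <= t], y[x > t]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        best = min(best, sse)
    return best

def selected_variable(n=50):
    x1 = rng.integers(0, 2, n)   # binary predictor: one candidate split
    x2 = rng.normal(size=n)      # continuous predictor: ~n-1 candidate splits
    y = rng.normal(size=n)       # response independent of both predictors
    # Exhaustive-search split selection picks the variable with smaller SSE
    return "x1" if best_split_sse(x1, y) < best_split_sse(x2, y) else "x2"

picks = [selected_variable() for _ in range(500)]
# Under no bias each variable would be chosen about half the time; the
# continuous variable is selected far more often than 0.5.
print(picks.count("x2") / len(picks))
```

Unbiased methods, such as those studied in the references below, separate the choice of split variable from the search for the split point to remove this effect.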

Suggested Citation

  • Shih, Yu-Shan & Tsai, Hsin-Wen, 2004. "Variable selection bias in regression trees with constant fits," Computational Statistics & Data Analysis, Elsevier, vol. 45(3), pages 595-607, April.
  • Handle: RePEc:eee:csdana:v:45:y:2004:i:3:p:595-607

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0167-9473(03)00036-7
    Download Restriction: Full text for ScienceDirect subscribers only.

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

1. A. J. Scott & M. Knott, 1976. "An Approximate Test for Use with AID," Journal of the Royal Statistical Society Series C, Royal Statistical Society, vol. 25(2), pages 103-106, June.
    2. Kim, H. & Loh, W.Y., 2001. "Classification Trees With Unbiased Multiway Splits," Journal of the American Statistical Association, American Statistical Association, vol. 96, pages 589-604, June.
    3. G. V. Kass, 1975. "Significance Testing in Automatic Interaction Detection (A.I.D.)," Journal of the Royal Statistical Society Series C, Royal Statistical Society, vol. 24(2), pages 178-189, June.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.

    Cited by:

    1. Strobl, Carolin & Boulesteix, Anne-Laure & Augustin, Thomas, 2007. "Unbiased split selection for classification trees based on the Gini Index," Computational Statistics & Data Analysis, Elsevier, vol. 52(1), pages 483-501, September.
    2. Wei, Pengfei & Lu, Zhenzhou & Song, Jingwen, 2015. "Variable importance analysis: A comprehensive review," Reliability Engineering and System Safety, Elsevier, vol. 142(C), pages 399-432.
    3. Gerhard Tutz & Moritz Berger, 2016. "Item-focussed Trees for the Identification of Items in Differential Item Functioning," Psychometrika, Springer;The Psychometric Society, vol. 81(3), pages 727-750, September.
    4. Yuan Xu & Huading Shi & Yang Fei & Chao Wang & Li Mo & Mi Shu, 2021. "Identification of Soil Heavy Metal Sources in a Large-Scale Area Affected by Industry," Sustainability, MDPI, vol. 13(2), pages 1-18, January.
    5. Postiglione, Paolo & Benedetti, Roberto & Lafratta, Giovanni, 2010. "A regression tree algorithm for the identification of convergence clubs," Computational Statistics & Data Analysis, Elsevier, vol. 54(11), pages 2776-2785, November.
    6. Xiaogang Su & George Ekow Quaye & Yishu Wei & Joseph Kang & Lei Liu & Qiong Yang & Juanjuan Fan & Richard A. Levine, 2024. "Smooth Sigmoid Surrogate (SSS): An Alternative to Greedy Search in Decision Trees," Mathematics, MDPI, vol. 12(20), pages 1-28, October.
    7. Hapfelmeier, A. & Ulm, K., 2014. "Variable selection by Random Forests using data with missing values," Computational Statistics & Data Analysis, Elsevier, vol. 80(C), pages 129-139.
    8. Alvarez-Iglesias, Alberto & Hinde, John & Ferguson, John & Newell, John, 2017. "An alternative pruning based approach to unbiased recursive partitioning," Computational Statistics & Data Analysis, Elsevier, vol. 106(C), pages 90-102.
    9. S. H. C. M. van Veen & R. C. van Kleef & W. P. M. M. van de Ven & R. C. J. A. van Vliet, 2018. "Exploring the predictive power of interaction terms in a sophisticated risk equalization model using regression trees," Health Economics, John Wiley & Sons, Ltd., vol. 27(2), pages 1-12, February.
    10. Wei-Yin Loh, 2014. "Fifty Years of Classification and Regression Trees," International Statistical Review, International Statistical Institute, vol. 82(3), pages 329-348, December.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Strobl, Carolin & Boulesteix, Anne-Laure & Augustin, Thomas, 2007. "Unbiased split selection for classification trees based on the Gini Index," Computational Statistics & Data Analysis, Elsevier, vol. 52(1), pages 483-501, September.
    2. Ollech, Daniel & Webel, Karsten, 2020. "A random forest-based approach to identifying the most informative seasonality tests," Discussion Papers 55/2020, Deutsche Bundesbank.
    3. Shih, Y. -S., 2004. "A note on split selection bias in classification trees," Computational Statistics & Data Analysis, Elsevier, vol. 45(3), pages 457-466, April.
    4. Hothorn, Torsten & Lausen, Berthold, 2005. "Bundling classifiers by bagging trees," Computational Statistics & Data Analysis, Elsevier, vol. 49(4), pages 1068-1078, June.
    5. Emilio Carrizosa & Cristina Molero-Río & Dolores Romero Morales, 2021. "Mathematical optimization in classification and regression trees," TOP: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 29(1), pages 5-33, April.
    6. Gray, J. Brian & Fan, Guangzhe, 2008. "Classification tree analysis using TARGET," Computational Statistics & Data Analysis, Elsevier, vol. 52(3), pages 1362-1372, January.
    7. Noh, Hyun Gon & Song, Moon Sup & Park, Sung Hyun, 2004. "An unbiased method for constructing multilabel classification trees," Computational Statistics & Data Analysis, Elsevier, vol. 47(1), pages 149-164, August.
    8. Dimitris Bertsimas & Margrét V. Bjarnadóttir & Michael A. Kane & J. Christian Kryder & Rudra Pandey & Santosh Vempala & Grant Wang, 2008. "Algorithmic Prediction of Health-Care Costs," Operations Research, INFORMS, vol. 56(6), pages 1382-1392, December.
    9. Silke Janitza & Ender Celik & Anne-Laure Boulesteix, 2018. "A computationally fast variable importance test for random forests for high-dimensional data," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 12(4), pages 885-915, December.
    10. Lee, Tzu-Haw & Shih, Yu-Shan, 2006. "Unbiased variable selection for classification trees with multivariate responses," Computational Statistics & Data Analysis, Elsevier, vol. 51(2), pages 659-667, November.
    11. Cédric Beaulac & Jeffrey S. Rosenthal, 2019. "Predicting University Students’ Academic Success and Major Using Random Forests," Research in Higher Education, Springer;Association for Institutional Research, vol. 60(7), pages 1048-1064, November.
    12. Andriyashin, Anton, 2008. "Stock picking via nonsymmetrically pruned binary decision trees," SFB 649 Discussion Papers 2008-035, Humboldt University Berlin, Collaborative Research Center 649: Economic Risk.
    13. Gérard Biau & Erwan Scornet, 2016. "A random forest guided tour," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 25(2), pages 197-227, June.
    14. Alvarez-Iglesias, Alberto & Hinde, John & Ferguson, John & Newell, John, 2017. "An alternative pruning based approach to unbiased recursive partitioning," Computational Statistics & Data Analysis, Elsevier, vol. 106(C), pages 90-102.
    15. Christophe Dutang & Quentin Guibert, 2021. "An explicit split point procedure in model-based trees allowing for a quick fitting of GLM trees and GLM forests," Post-Print hal-03448250, HAL.
    16. Sorensen, Kenneth & Janssens, Gerrit K., 2003. "Data mining with genetic algorithms on binary trees," European Journal of Operational Research, Elsevier, vol. 151(2), pages 253-264, December.
    17. Richard Dubes & Guangzhou Zeng, 1987. "A test for spatial homogeneity in cluster analysis," Journal of Classification, Springer;The Classification Society, vol. 4(1), pages 33-56, March.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:csdana:v:45:y:2004:i:3:p:595-607. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/csda .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.