Bias-corrected random forests in regression
It is well known that random forests reduce the variance of regression predictors compared with a single tree, while leaving the bias largely unchanged. In many situations, the dominant component of the risk turns out to be the squared bias, which makes bias correction necessary. In this paper, random forests are used to estimate the regression function. Five different methods for estimating the bias are proposed and discussed, and simulated and real data are used to study their performance. The proposed methods significantly reduce bias in the regression context.
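The paper's five bias-estimation methods are not reproduced here, but the general idea behind residual-based bias correction of a bagged ensemble can be sketched as follows. This is a hypothetical, self-contained illustration (a toy forest of regression stumps rather than a full random forest): fit a first ensemble, model its systematic residuals with a second ensemble, and add that residual model back to the prediction.

```python
# Toy illustration of residual-based bias correction for a bagged
# regression ensemble. This is an assumption-laden sketch, not one of
# the paper's five methods: the "forest" here is a bag of one-split
# regression stumps, which are heavily biased on smooth targets.
import random
import statistics

def fit_stump(xs, ys):
    """Fit a one-split regression stump on 1-D data by minimizing SSE."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    best_sse, best = float("inf"), None
    for k in range(1, len(xs)):
        t = (xs[order[k - 1]] + xs[order[k]]) / 2
        left = [ys[i] for i in order[:k]]
        right = [ys[i] for i in order[k:]]
        ml, mr = statistics.fmean(left), statistics.fmean(right)
        sse = sum((y - ml) ** 2 for y in left) + sum((y - mr) ** 2 for y in right)
        if sse < best_sse:
            best_sse, best = sse, (t, ml, mr)
    return best

def predict_stump(stump, x):
    t, ml, mr = stump
    return ml if x <= t else mr

def fit_forest(xs, ys, n_trees=50, seed=0):
    """Bag stumps on bootstrap resamples -- a stand-in for a random forest."""
    rng = random.Random(seed)
    n = len(xs)
    forest = []
    for _ in range(n_trees):
        idx = [rng.randrange(n) for _ in range(n)]
        forest.append(fit_stump([xs[i] for i in idx], [ys[i] for i in idx]))
    return forest

def predict_forest(forest, x):
    return statistics.fmean(predict_stump(s, x) for s in forest)

# Smooth quadratic target with small noise: stumps are badly biased here.
rng = random.Random(1)
xs = [i / 20 for i in range(40)]
ys = [x * x + rng.gauss(0, 0.05) for x in xs]

f1 = fit_forest(xs, ys, seed=2)                       # first-stage forest
resid = [y - predict_forest(f1, x) for x, y in zip(xs, ys)]
f2 = fit_forest(xs, resid, seed=3)                    # model of the residuals (bias)

def predict_corrected(x):
    # Bias-corrected prediction: first-stage fit plus the residual model.
    return predict_forest(f1, x) + predict_forest(f2, x)
```

In this sketch the correction is one stage; the same idea can be iterated, and the paper studies several variants of estimating the bias term.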
Volume (Year): 39 (2012)
Issue (Month): 1 (March)
Provider web page: http://www.tandfonline.com/CJAS20
Order information: http://www.tandfonline.com/pricing/journal/CJAS20
RePEc handle: RePEc:taf:japsta:v:39:y:2012:i:1:p:151-160