Predictive performance of Dirichlet process shrinkage methods in linear regression
An obvious Bayesian nonparametric generalization of ridge regression assumes that the coefficients are exchangeable, drawn from a prior distribution of unknown form which is itself given a Dirichlet process prior with a normal base measure. The purpose of this paper is to explore the predictive performance of this generalization, which does not seem to have received detailed attention despite related applications of the Dirichlet process to shrinkage estimation of multivariate normal means, analysis of randomized block experiments and nonparametric extensions of random effects models in longitudinal data analysis. We consider issues of prior specification and computation. With a normal base measure, letting the precision parameter of the Dirichlet process approach infinity makes the procedure equivalent to ridge regression, whereas for finite values of the precision parameter the discreteness of the Dirichlet process means that some predictors can be estimated as having the same coefficient. Estimating the precision parameter from the data yields a flexible method for shrinkage estimation of mean parameters that can work well when ridge regression does, but also adapts to sparse situations. We compare our approach with ridge regression, the lasso and the recently proposed elastic net in simulation studies, and also consider applications to penalized spline smoothing.
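The clustering behaviour described in the abstract can be illustrated with a short simulation from the prior (a minimal sketch, not the paper's actual sampler; the function name and parameter choices below are ours). Coefficients are drawn from a Dirichlet process with a N(0, tau2) base measure via the Chinese restaurant process: a finite precision parameter alpha induces ties among the coefficients, while a very large alpha gives approximately i.i.d. normal draws, i.e. the ridge-regression prior.

```python
import numpy as np

def dp_normal_draw(p, alpha, tau2, rng):
    """Draw p regression coefficients from a Dirichlet process prior
    with a N(0, tau2) base measure, using the Chinese restaurant
    process representation. Small alpha -> few distinct values (ties);
    alpha -> infinity -> i.i.d. N(0, tau2), the ridge prior."""
    beta = np.empty(p)
    values = []   # distinct "table" values, each a fresh base-measure draw
    counts = []   # number of coefficients currently sharing each value
    for j in range(p):
        # Join table k with probability counts[k]/(alpha + j),
        # or open a new table with probability alpha/(alpha + j).
        probs = np.array(counts + [alpha], dtype=float)
        probs /= probs.sum()
        k = rng.choice(len(probs), p=probs)
        if k == len(values):          # new table: draw from the base measure
            values.append(rng.normal(0.0, np.sqrt(tau2)))
            counts.append(1)
        else:                         # existing table: coefficient is tied
            counts[k] += 1
        beta[j] = values[k]
    return beta

rng = np.random.default_rng(0)
p = 20
beta_sparse = dp_normal_draw(p, alpha=0.5, tau2=1.0, rng=rng)  # few clusters
beta_ridge = dp_normal_draw(p, alpha=1e6, tau2=1.0, rng=rng)   # ~ i.i.d. normal
print(len(np.unique(beta_sparse)), "distinct values with small alpha;",
      len(np.unique(beta_ridge)), "with large alpha")
```

With small alpha the twenty coefficients collapse onto a handful of shared values, while with very large alpha essentially all of them are distinct, matching the abstract's description of the two limiting regimes.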
References listed on IDEAS
- Robert Tibshirani & Michael Saunders & Saharon Rosset & Ji Zhu & Keith Knight, 2005. "Sparsity and smoothness via the fused lasso," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 67(1), pages 91-108.
- Ruppert,David & Wand,M. P. & Carroll,R. J., 2003. "Semiparametric Regression," Cambridge Books, Cambridge University Press, number 9780521785167, November.
- Smith, Michael & Kohn, Robert, 1996. "Nonparametric regression using Bayesian variable selection," Journal of Econometrics, Elsevier, vol. 75(2), pages 317-343, December.
- Hui Zou & Trevor Hastie, 2005. "Addendum: Regularization and variable selection via the elastic net," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 67(5), pages 768-768.
- Hui Zou & Trevor Hastie, 2005. "Regularization and variable selection via the elastic net," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 67(2), pages 301-320.
Handle: RePEc:eee:csdana:v:52:y:2008:i:7:p:3658-3669