Sparsity and smoothness via the fused lasso
Abstract
The lasso penalizes a least squares regression by the sum of the absolute values (L1-norm) of the coefficients. The form of this penalty encourages sparse solutions (with many coefficients equal to 0). We propose the 'fused lasso', a generalization designed for problems with features that can be ordered in some meaningful way. The fused lasso penalizes the L1-norm of both the coefficients and their successive differences. Thus it encourages sparsity of the coefficients and also sparsity of their differences, i.e. local constancy of the coefficient profile. The fused lasso is especially useful when the number of features p is much greater than N, the sample size. The technique is also extended to the 'hinge' loss function that underlies the support vector classifier. We illustrate the methods on examples from protein mass spectroscopy and gene expression data. Copyright 2005 Royal Statistical Society.
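The penalty the abstract describes can be sketched in Lagrangian form as follows; the notation (tuning parameters λ1, λ2, design matrix entries x_ij) is an assumption for illustration, not necessarily the paper's exact formulation, which states the problem with bound constraints rather than Lagrange multipliers:

```latex
\hat{\beta} \;=\; \arg\min_{\beta}\;
\frac{1}{2}\sum_{i=1}^{N}\Bigl(y_i - \sum_{j=1}^{p} x_{ij}\beta_j\Bigr)^{2}
\;+\; \lambda_1 \sum_{j=1}^{p} \lvert\beta_j\rvert
\;+\; \lambda_2 \sum_{j=2}^{p} \lvert\beta_j - \beta_{j-1}\rvert
```

The λ1 term is the ordinary lasso penalty and drives individual coefficients to zero, while the λ2 term penalizes successive differences and drives neighbouring coefficients to share a common value, yielding the piecewise-constant ("locally constant") coefficient profiles the abstract refers to.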
Bibliographic Info
Article provided by Royal Statistical Society in its journal Journal of the Royal Statistical Society Series B.
Volume (Year): 67 (2005)
Issue: 1
Contact details of provider:
Postal: 12 Errol Street, London EC1Y 8LX, United Kingdom
Web page: http://www.blackwellpublishing.com/journal.asp?ref=1369-7412
Citations (from the CitEc Project):
- Aytug, Haldun & Sayın, Serpil, 2012. "Exploring the trade-off between generalization and empirical errors in a one-norm SVM," European Journal of Operational Research, Elsevier, vol. 218(3), pages 667-675.
- Bang, Sungwan & Jhun, Myoungshic, 2012. "Simultaneous estimation and factor selection in quantile regression via adaptive sup-norm regularization," Computational Statistics & Data Analysis, Elsevier, vol. 56(4), pages 813-826.
- Nott, David J., 2008. "Predictive performance of Dirichlet process shrinkage methods in linear regression," Computational Statistics & Data Analysis, Elsevier, vol. 52(7), pages 3658-3669, March.
- Daye, Z. John & Jeng, X. Jessie, 2009. "Shrinkage and model selection with correlated variables via weighted fusion," Computational Statistics & Data Analysis, Elsevier, vol. 53(4), pages 1284-1298, February.
- Korobilis, Dimitris, 2011. "Hierarchical Shrinkage Priors for Dynamic Regressions with Many Predictors," Working Paper Series 21_11, The Rimini Centre for Economic Analysis.
- Korobilis, Dimitris, 2013. "Hierarchical shrinkage priors for dynamic regressions with many predictors," International Journal of Forecasting, Elsevier, vol. 29(1), pages 43-59.
- Korobilis, Dimitris, 2011. "Hierarchical shrinkage priors for dynamic regressions with many predictors," MPRA Paper 30380, University Library of Munich, Germany.
- Korobilis, Dimitris, 2011. "Hierarchical shrinkage priors for dynamic regressions with many predictors," CORE Discussion Papers 2011021, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
- Ding, Huijuan & Claeskens, Gerda, 2008. "Variable selection in partially linear wavelet models," Open Access publications from Katholieke Universiteit Leuven urn:hdl:123456789/209060, Katholieke Universiteit Leuven.
- Baragatti, M. & Pommeret, D., 2012. "A study of variable selection using g-prior distribution with ridge parameter," Computational Statistics & Data Analysis, Elsevier, vol. 56(6), pages 1920-1934.
- Ye, Gui-Bo & Xie, Xiaohui, 2011. "Split Bregman method for large scale fused Lasso," Computational Statistics & Data Analysis, Elsevier, vol. 55(4), pages 1552-1569, April.
- Mkhadri, Abdallah & Ouhourane, Mohamed, 2013. "An extended variable inclusion and shrinkage algorithm for correlated variables," Computational Statistics & Data Analysis, Elsevier, vol. 57(1), pages 631-644.
- Hess, Wolfgang & Persson, Maria & Rubenbauer, Stephanie & Gertheiss, Jan, 2013. "Using Lasso-Type Penalties to Model Time-Varying Covariate Effects in Panel Data Regressions - A Novel Approach Illustrated by the 'Death of Distance' in International Trade," 2013:5, Lund University, Department of Economics.
- Hess, Wolfgang & Persson, Maria & Rubenbauer, Stephanie & Gertheiss, Jan, 2013. "Using Lasso-Type Penalties to Model Time-Varying Covariate Effects in Panel Data Regressions – A Novel Approach Illustrated by the ‘Death of Distance’ in International Trade," Working Paper Series 961, Research Institute of Industrial Economics.
- Kato, Kengo, 2009. "On the degrees of freedom in shrinkage estimation," Journal of Multivariate Analysis, Elsevier, vol. 100(7), pages 1338-1352, August.