
Sparsity and smoothness via the fused lasso

Author Info

  • Robert Tibshirani
  • Michael Saunders
  • Saharon Rosset
  • Ji Zhu
  • Keith Knight

    Abstract

    The lasso penalizes a least squares regression by the sum of the absolute values (L1-norm) of the coefficients. The form of this penalty encourages sparse solutions (with many coefficients equal to 0). We propose the 'fused lasso', a generalization that is designed for problems with features that can be ordered in some meaningful way. The fused lasso penalizes the L1-norm of both the coefficients and their successive differences. Thus it encourages sparsity of the coefficients and also sparsity of their differences, i.e. local constancy of the coefficient profile. The fused lasso is especially useful when the number of features p is much greater than N, the sample size. The technique is also extended to the 'hinge' loss function that underlies the support vector classifier. We illustrate the methods on examples from protein mass spectroscopy and gene expression data. Copyright 2005 Royal Statistical Society.
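
    As a minimal illustration (not from the paper or this record), the criterion described above can be written directly as a convex program. The sketch below assumes the Python libraries numpy and cvxpy; the penalty weights lam1 and lam2 and the simulated data are arbitrary choices made for the example.

        # Illustrative sketch of the fused lasso criterion, assuming cvxpy is available.
        # lam1, lam2 and the simulated data are arbitrary choices, not values from the paper.
        import numpy as np
        import cvxpy as cp

        rng = np.random.default_rng(0)
        N, p = 50, 100                            # the p much greater than N regime noted above
        X = rng.standard_normal((N, p))
        beta_true = np.zeros(p)
        beta_true[20:30] = 2.0                    # a locally constant block of nonzero coefficients
        y = X @ beta_true + 0.5 * rng.standard_normal(N)

        lam1, lam2 = 1.0, 5.0                     # weights on |beta_j| and |beta_j - beta_{j-1}|
        beta = cp.Variable(p)
        objective = cp.Minimize(
            0.5 * cp.sum_squares(X @ beta - y)
            + lam1 * cp.norm1(beta)                   # encourages sparse coefficients
            + lam2 * cp.norm1(beta[1:] - beta[:-1])   # encourages sparse successive differences
        )
        cp.Problem(objective).solve()
        print(np.round(beta.value, 2))

    Larger values of lam2 push the estimated coefficient profile toward piecewise-constant blocks (the local constancy mentioned in the abstract), while larger values of lam1 push more coefficients exactly to zero.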

    Download Info

    If you experience problems downloading a file, check first whether you have the proper application to view it. If problems persist, read the IDEAS help page. Note that these files are not on the IDEAS site. Please be patient as the files may be large.
    File URL: http://www.blackwell-synergy.com/doi/abs/10.1111/j.1467-9868.2005.00490.x
    File Function: link to full text
    Download Restriction: Access to full text is restricted to subscribers.

    As access to this document is restricted, you may want to look for a different version of it under "Related research" (further below) or search for it elsewhere.

    Bibliographic Info

    Article provided by Royal Statistical Society in its journal Journal of the Royal Statistical Society Series B.

    Volume (Year): 67 (2005)
    Issue (Month): 1
    Pages: 91-108

    Handle: RePEc:bla:jorssb:v:67:y:2005:i:1:p:91-108

    Contact details of provider:
    Postal: 12 Errol Street, London EC1Y 8LX, United Kingdom
    Phone: +44-171-638-8998
    Fax: +44-171-256-7598
    Web page: http://wileyonlinelibrary.com/journal/rssb
    More information through EDIRC

    Order Information:
    Web: http://ordering.onlinelibrary.wiley.com/subs.asp?ref=1467-9868&doi=10.1111/(ISSN)1467-9868

    Related research

    References

    No references listed on IDEAS

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.

    Cited by:
    1. Nott, David J., 2008. "Predictive performance of Dirichlet process shrinkage methods in linear regression," Computational Statistics & Data Analysis, Elsevier, vol. 52(7), pages 3658-3669, March.
    2. Martin-Barragan, Belen & Lillo, Rosa & Romo, Juan, 2014. "Interpretable support vector machines for functional data," European Journal of Operational Research, Elsevier, vol. 232(1), pages 146-155.
    3. Hess, Wolfgang & Persson, Maria & Rubenbauer, Stephanie & Gertheiss, Jan, 2013. "Using Lasso-Type Penalties to Model Time-Varying Covariate Effects in Panel Data Regressions – A Novel Approach Illustrated by the ‘Death of Distance’ in International Trade," Working Paper Series 961, Research Institute of Industrial Economics.
    4. Kato, Kengo, 2009. "On the degrees of freedom in shrinkage estimation," Journal of Multivariate Analysis, Elsevier, vol. 100(7), pages 1338-1352, August.
    5. Bang, Sungwan & Jhun, Myoungshic, 2012. "Simultaneous estimation and factor selection in quantile regression via adaptive sup-norm regularization," Computational Statistics & Data Analysis, Elsevier, vol. 56(4), pages 813-826.
    6. Aytug, Haldun & Sayın, Serpil, 2012. "Exploring the trade-off between generalization and empirical errors in a one-norm SVM," European Journal of Operational Research, Elsevier, vol. 218(3), pages 667-675.
    7. Deren Han & Xiaoming Yuan & Wenxing Zhang & Xingju Cai, 2013. "An ADM-based splitting method for separable convex programming," Computational Optimization and Applications, Springer, vol. 54(2), pages 343-369, March.
    8. Korobilis, Dimitris, 2011. "Hierarchical shrinkage priors for dynamic regressions with many predictors," MPRA Paper 30380, University Library of Munich, Germany.
    9. Baragatti, M. & Pommeret, D., 2012. "A study of variable selection using g-prior distribution with ridge parameter," Computational Statistics & Data Analysis, Elsevier, vol. 56(6), pages 1920-1934.
    10. Korzeń, M. & Jaroszewicz, S. & Klęsk, P., 2013. "Logistic regression with weight grouping priors," Computational Statistics & Data Analysis, Elsevier, vol. 64(C), pages 281-298.
    11. Daye, Z. John & Jeng, X. Jessie, 2009. "Shrinkage and model selection with correlated variables via weighted fusion," Computational Statistics & Data Analysis, Elsevier, vol. 53(4), pages 1284-1298, February.
    12. Jiang, Liewen & Bondell, Howard D. & Wang, Huixia Judy, 2014. "Interquantile shrinkage and variable selection in quantile regression," Computational Statistics & Data Analysis, Elsevier, vol. 69(C), pages 208-219.
    13. Mkhadri, Abdallah & Ouhourane, Mohamed, 2013. "An extended variable inclusion and shrinkage algorithm for correlated variables," Computational Statistics & Data Analysis, Elsevier, vol. 57(1), pages 631-644.
    14. Hong, Zhaoping & Lian, Heng, 2013. "Sparse-smooth regularized singular value decomposition," Journal of Multivariate Analysis, Elsevier, vol. 117(C), pages 163-174.
    15. Lichun Wang & Yuan You & Heng Lian, 2013. "A simple and efficient algorithm for fused lasso signal approximator with convex loss function," Computational Statistics, Springer, vol. 28(4), pages 1699-1714, August.
    16. Ye, Gui-Bo & Xie, Xiaohui, 2011. "Split Bregman method for large scale fused Lasso," Computational Statistics & Data Analysis, Elsevier, vol. 55(4), pages 1552-1569, April.

    Lists

    This item is not listed on Wikipedia, on a reading list or among the top items on IDEAS.

    Statistics

    Access and download statistics

    Corrections

    When requesting a correction, please mention this item's handle: RePEc:bla:jorssb:v:67:y:2005:i:1:p:91-108. See general information about how to correct material in RePEc.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact Wiley-Blackwell Digital Licensing or Christopher F. Baum.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If references are entirely missing, you can add them using this form.

    If the full reference list includes an item that is present in RePEc but the system did not link to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your profile, as there may be some citations waiting for confirmation.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.