
An extended variable inclusion and shrinkage algorithm for correlated variables

Listed author(s):
  • Mkhadri, Abdallah
  • Ouhourane, Mohamed

    The problem of variable selection for linear regression in a high-dimensional model is considered. A new method, called Extended-VISA (Ext-VISA), is proposed to simultaneously select variables and encourage a grouping effect, whereby strongly correlated predictors tend to enter or leave the model together. Moreover, Ext-VISA can select a sparse model while avoiding the overshrinkage typical of a Lasso-type estimator. It combines the idea of the VISA algorithm, which avoids the overshrinkage of regression coefficients, with that of Lasso-type estimators based on an ℓ1+ℓ2 penalty, which overcome the Lasso's limited grouping effect in high dimensions. Because it is built on a modified VISA algorithm, it is also computationally efficient. Three interesting special cases of Ext-VISA are examined. The first is Smooth-VISA (SVISA), for settings where the variation among successive regression coefficients is low. The second is VISA-Net (VNET), which takes the correlations between predictors into account. The third is Laplacian-VISA (LVISA), for predictors measured on an undirected graph. A theoretical sparsity inequality for Ext-VISA is established. A detailed simulation study in low- and high-dimensional settings illustrates the advantages of the new approach over several competing methods. Finally, we apply VNET, SVISA and LVISA to a GC-retention data set.
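    The ℓ1+ℓ2 penalty the abstract refers to is of the elastic-net type, and its grouping effect can be illustrated with a minimal, generic coordinate-descent sketch. This is not the authors' Ext-VISA algorithm — the solver, the toy data, and all parameter values below are illustrative assumptions chosen only to show how the ℓ2 term makes strongly correlated predictors enter the model together.

    ```python
    import numpy as np

    def soft_threshold(z, t):
        """Soft-thresholding operator used by coordinate-descent l1 solvers."""
        return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

    def l1_l2_cd(X, y, lam1, lam2, n_iter=300):
        """Coordinate descent for the penalized least squares problem
           (1/2)||y - Xb||^2 + lam1*||b||_1 + (lam2/2)*||b||^2,
        i.e. a generic l1+l2 ("naive" elastic-net) penalty.
        Setting lam2 = 0 reduces it to the plain Lasso."""
        n, p = X.shape
        b = np.zeros(p)
        col_sq = (X ** 2).sum(axis=0)
        for _ in range(n_iter):
            for j in range(p):
                r_j = y - X @ b + X[:, j] * b[j]   # partial residual excluding j
                z = X[:, j] @ r_j
                b[j] = soft_threshold(z, lam1) / (col_sq[j] + lam2)
        return b

    # Toy data: two nearly identical predictors plus an irrelevant one.
    rng = np.random.default_rng(0)
    x1 = rng.normal(size=100)
    x2 = x1 + 0.01 * rng.normal(size=100)        # strongly correlated with x1
    x3 = rng.normal(size=100)                    # unrelated noise predictor
    X = np.column_stack([x1, x2, x3])
    y = 3.0 * x1 + 3.0 * x2 + 0.1 * rng.normal(size=100)

    b_lasso = l1_l2_cd(X, y, lam1=5.0, lam2=0.0)    # Lasso: tends to keep only one of x1, x2
    b_l1l2  = l1_l2_cd(X, y, lam1=5.0, lam2=50.0)   # l1+l2: keeps x1 and x2 together
    ```

    With the pure ℓ1 penalty, almost all of the signal is loaded onto whichever correlated predictor is updated first; adding the ℓ2 term yields nearly equal coefficients for x1 and x2 — the grouping effect the abstract describes.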


    Download Restriction: Full text for ScienceDirect subscribers only.

    Article provided by Elsevier in its journal Computational Statistics & Data Analysis.

    Volume (Year): 57 (2013)
    Issue: 1
    Pages: 631-644


    Handle: RePEc:eee:csdana:v:57:y:2013:i:1:p:631-644
    DOI: 10.1016/j.csda.2012.07.023




    This information is provided to you by IDEAS at the Research Division of the Federal Reserve Bank of St. Louis using RePEc data.