
Simultaneous dimension reduction and variable selection in modeling high dimensional data

Authors

  • Lansangan, Joseph Ryan G.
  • Barrios, Erniel B.

Abstract

High dimensional predictors in regression analysis are often associated with multicollinearity and other estimation problems. These problems can be mitigated through a constrained optimization method that simultaneously induces dimension reduction and variable selection while maintaining a high level of predictive ability in the fitted model. Simulation studies show that the method can outperform sparse principal component regression, the least absolute shrinkage and selection operator (lasso), and elastic net procedures in terms of predictive ability and optimal selection of inputs. Furthermore, the method yields reduced models with smaller prediction errors than the estimated full models from principal component regression or principal covariance regression.
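The abstract's idea can be illustrated with a loose two-stage analogue (not the authors' exact constrained estimator): sparse loadings zero out irrelevant inputs (variable selection) while compressing a collinear block of predictors into a few components (dimension reduction), after which the response is regressed on the components. All names, penalty values, and the simulated data below are illustrative assumptions, using scikit-learn's `SparsePCA` as a stand-in.

```python
# Hedged sketch of simultaneous-style dimension reduction + variable
# selection: sparse principal components, then OLS on the components.
# This is an analogy to the paper's approach, not its actual estimator.
import numpy as np
from sklearn.decomposition import SparsePCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n, p = 100, 30
X = rng.normal(size=(n, p))

# Build a collinear block: columns 0-4 share one latent factor,
# and the response depends only on that factor.
latent = rng.normal(size=n)
X[:, :5] += 3.0 * latent[:, None]
y = 2.0 * latent + rng.normal(scale=0.5, size=n)

# Sparse loadings: the L1 penalty (alpha) drives most loadings to zero,
# so only a subset of the 30 inputs enters the components.
spca = SparsePCA(n_components=2, alpha=2.0, random_state=0)
Z = spca.fit_transform(X)

# Regress y on the low-dimensional, sparsity-selected components.
reg = LinearRegression().fit(Z, y)
selected = np.flatnonzero(np.any(spca.components_ != 0, axis=0))
print("inputs retained:", selected.size, "of", p)
print("R^2 on components:", round(reg.score(Z, y), 3))
```

Because the latent factor drives both the collinear block and the response, the first sparse component should recover it, so the reduced model retains far fewer than 30 inputs yet predicts well; this mirrors the abstract's claim that a reduced model can match or beat a full principal component regression.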

Suggested Citation

  • Lansangan, Joseph Ryan G. & Barrios, Erniel B., 2017. "Simultaneous dimension reduction and variable selection in modeling high dimensional data," Computational Statistics & Data Analysis, Elsevier, vol. 112(C), pages 242-256.
  • Handle: RePEc:eee:csdana:v:112:y:2017:i:c:p:242-256
    DOI: 10.1016/j.csda.2017.03.015

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0167947317300609
    Download Restriction: Full text for ScienceDirect subscribers only.

    File URL: https://libkey.io/10.1016/j.csda.2017.03.015?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Hyonho Chun & Sündüz Keleş, 2010. "Sparse partial least squares regression for simultaneous dimension reduction and variable selection," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 72(1), pages 3-25, January.
    2. Artur Klinger, 2001. "Inference in high dimensional generalized linear models based on soft thresholding," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 63(2), pages 377-392.
    3. Reinhold Kosfeld & Jørgen Lauridsen, 2008. "Factor analysis regression," Statistical Papers, Springer, vol. 49(4), pages 653-667, October.
    4. Carl Eckart & Gale Young, 1936. "The approximation of one matrix by another of lower rank," Psychometrika, Springer;The Psychometric Society, vol. 1(3), pages 211-218, September.
    5. Hui Zou & Trevor Hastie, 2005. "Addendum: Regularization and variable selection via the elastic net," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 67(5), pages 768-768, November.
    6. Hugh Chipman & Hong Gu, 2005. "Interpretable dimension reduction," Journal of Applied Statistics, Taylor & Francis Journals, vol. 32(9), pages 969-987.
    7. Izenman, Alan Julian, 1975. "Reduced-rank regression for the multivariate linear model," Journal of Multivariate Analysis, Elsevier, vol. 5(2), pages 248-264, June.
    8. Ian T. Jolliffe, 1982. "A Note on the Use of Principal Components in Regression," Journal of the Royal Statistical Society Series C, Royal Statistical Society, vol. 31(3), pages 300-303, November.
    9. Hui Zou & Trevor Hastie, 2005. "Regularization and variable selection via the elastic net," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 67(2), pages 301-320, April.
    10. Tze‐San Lee, 1987. "Optimum Ridge Parameter Selection," Journal of the Royal Statistical Society Series C, Royal Statistical Society, vol. 36(1), pages 112-118, March.
    11. Lisha Chen & Jianhua Z. Huang, 2012. "Sparse Reduced-Rank Regression for Simultaneous Dimension Reduction and Variable Selection," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 107(500), pages 1533-1545, December.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Adriano Zanin Zambom & Gregory J. Matthews, 2021. "Sure independence screening in the presence of missing data," Statistical Papers, Springer, vol. 62(2), pages 817-845, April.
    2. Xu, Qifa & Zhuo, Xingxuan & Jiang, Cuixia & Liu, Xi & Liu, Yezheng, 2018. "Group penalized unrestricted mixed data sampling model with application to forecasting US GDP growth," Economic Modelling, Elsevier, vol. 75(C), pages 221-236.
    3. Massimo Guidolin & Manuela Pedio, 2022. "Switching Coefficients or Automatic Variable Selection: An Application in Forecasting Commodity Returns," Forecasting, MDPI, vol. 4(1), pages 1-32, February.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Kawano, Shuichi & Fujisawa, Hironori & Takada, Toyoyuki & Shiroishi, Toshihiko, 2015. "Sparse principal component regression with adaptive loading," Computational Statistics & Data Analysis, Elsevier, vol. 89(C), pages 192-203.
    2. Dmitry Kobak & Yves Bernaerts & Marissa A. Weis & Federico Scala & Andreas S. Tolias & Philipp Berens, 2021. "Sparse reduced‐rank regression for exploratory visualisation of paired multivariate data," Journal of the Royal Statistical Society Series C, Royal Statistical Society, vol. 70(4), pages 980-1000, August.
    3. Mishra, Aditya & Dey, Dipak K. & Chen, Yong & Chen, Kun, 2021. "Generalized co-sparse factor regression," Computational Statistics & Data Analysis, Elsevier, vol. 157(C).
    4. Luis A. Barboza & Julien Emile-Geay & Bo Li & Wan He, 2019. "Efficient Reconstructions of Common Era Climate via Integrated Nested Laplace Approximations," Journal of Agricultural, Biological and Environmental Statistics, Springer;The International Biometric Society;American Statistical Association, vol. 24(3), pages 535-554, September.
    5. Bai, Ray & Ghosh, Malay, 2018. "High-dimensional multivariate posterior consistency under global–local shrinkage priors," Journal of Multivariate Analysis, Elsevier, vol. 167(C), pages 157-170.
    6. Shuichi Kawano, 2021. "Sparse principal component regression via singular value decomposition approach," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 15(3), pages 795-823, September.
    7. Stamer, Vincent, 2024. "Thinking outside the container: A sparse partial least squares approach to forecasting trade flows," International Journal of Forecasting, Elsevier, vol. 40(4), pages 1336-1358.
    8. Ali Habibnia & Esfandiar Maasoumi, 2021. "Forecasting in Big Data Environments: An Adaptable and Automated Shrinkage Estimation of Neural Networks (AAShNet)," Journal of Quantitative Economics, Springer;The Indian Econometric Society (TIES), vol. 19(1), pages 363-381, December.
    9. Rosember Guerra-Urzola & Katrijn Van Deun & Juan C. Vera & Klaas Sijtsma, 2021. "A Guide for Sparse PCA: Model Comparison and Applications," Psychometrika, Springer;The Psychometric Society, vol. 86(4), pages 893-919, December.
    10. Kawano, Shuichi & Fujisawa, Hironori & Takada, Toyoyuki & Shiroishi, Toshihiko, 2018. "Sparse principal component regression for generalized linear models," Computational Statistics & Data Analysis, Elsevier, vol. 124(C), pages 180-196.
    11. Anshul Verma & Riccardo Junior Buonocore & Tiziana di Matteo, 2017. "A cluster driven log-volatility factor model: a deepening on the source of the volatility clustering," Papers 1712.02138, arXiv.org, revised May 2018.
    12. Shen, Haipeng & Huang, Jianhua Z., 2008. "Sparse principal component analysis via regularized low rank matrix approximation," Journal of Multivariate Analysis, Elsevier, vol. 99(6), pages 1015-1034, July.
    13. Takumi Saegusa & Tianzhou Ma & Gang Li & Ying Qing Chen & Mei-Ling Ting Lee, 2020. "Variable Selection in Threshold Regression Model with Applications to HIV Drug Adherence Data," Statistics in Biosciences, Springer;International Chinese Statistical Association, vol. 12(3), pages 376-398, December.
    14. Luo, Ruiyan & Qi, Xin, 2015. "Sparse wavelet regression with multiple predictive curves," Journal of Multivariate Analysis, Elsevier, vol. 134(C), pages 33-49.
    15. Michael Greenacre & Patrick J. F Groenen & Trevor Hastie & Alfonso Iodice d’Enza & Angelos Markos & Elena Tuzhilina, 2023. "Principal component analysis," Economics Working Papers 1856, Department of Economics and Business, Universitat Pompeu Fabra.
    16. Yu, Dengdeng & Zhang, Li & Mizera, Ivan & Jiang, Bei & Kong, Linglong, 2019. "Sparse wavelet estimation in quantile regression with multiple functional predictors," Computational Statistics & Data Analysis, Elsevier, vol. 136(C), pages 12-29.
    17. Yagli, Gokhan Mert & Yang, Dazhi & Srinivasan, Dipti, 2019. "Automatic hourly solar forecasting using machine learning models," Renewable and Sustainable Energy Reviews, Elsevier, vol. 105(C), pages 487-498.
    18. Marie Levakova & Susanne Ditlevsen, 2024. "Penalisation Methods in Fitting High‐Dimensional Cointegrated Vector Autoregressive Models: A Review," International Statistical Review, International Statistical Institute, vol. 92(2), pages 160-193, August.
    19. Mihee Lee & Haipeng Shen & Jianhua Z. Huang & J. S. Marron, 2010. "Biclustering via Sparse Singular Value Decomposition," Biometrics, The International Biometric Society, vol. 66(4), pages 1087-1095, December.
    20. Firuz Kamalov & Hana Sulieman & Ayman Alzaatreh & Maher Emarly & Hasna Chamlal & Murodbek Safaraliev, 2025. "Mathematical Methods in Feature Selection: A Review," Mathematics, MDPI, vol. 13(6), pages 1-29, March.

    More about this item



    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:csdana:v:112:y:2017:i:c:p:242-256. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/csda.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.