
Horseshoe shrinkage methods for Bayesian fusion estimation

Author

Listed:
  • Banerjee, Sayantan

Abstract

Estimation and structure learning of high-dimensional signals via a normal sequence model are considered, where the underlying parameter vector is piecewise constant or has a block structure. A Bayesian fusion estimation method is developed that uses the Horseshoe prior to induce a strong shrinkage effect on successive differences of the mean parameters, while simultaneously imposing sufficient prior concentration on the non-zero differences. Fast and efficient computational procedures are presented via Markov chain Monte Carlo methods that explore the full posterior distributions of the underlying parameters, and theoretical justification is provided by deriving posterior convergence rates and establishing selection consistency under suitable assumptions. The proposed method is extended to signal de-noising over arbitrary graphs, with efficient computational methods developed alongside theoretical guarantees. The superior performance of the Horseshoe-based Bayesian fusion estimation method is demonstrated through extensive simulations and two real-life examples of signal de-noising in biological and geophysical applications. The estimation performance of the method is also demonstrated on a real-world large network for the graph signal de-noising problem.

Suggested Citation

  • Banerjee, Sayantan, 2022. "Horseshoe shrinkage methods for Bayesian fusion estimation," Computational Statistics & Data Analysis, Elsevier, vol. 174(C).
  • Handle: RePEc:eee:csdana:v:174:y:2022:i:c:s0167947322000305
    DOI: 10.1016/j.csda.2022.107450

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0167947322000305
    Download Restriction: Full text for ScienceDirect subscribers only.

    File URL: https://libkey.io/10.1016/j.csda.2022.107450?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription.

    As access to this document is restricted, you may want to search for a different version of it.


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Posch, Konstantin & Arbeiter, Maximilian & Pilz, Juergen, 2020. "A novel Bayesian approach for variable selection in linear regression models," Computational Statistics & Data Analysis, Elsevier, vol. 144(C).
    2. Kshitij Khare & Malay Ghosh, 2022. "MCMC Convergence for Global-Local Shrinkage Priors," Journal of Quantitative Economics, Springer;The Indian Econometric Society (TIES), vol. 20(1), pages 211-234, September.
    3. Hu, Guanyu, 2021. "Spatially varying sparsity in dynamic regression models," Econometrics and Statistics, Elsevier, vol. 17(C), pages 23-34.
    4. Minerva Mukhopadhyay & David B. Dunson, 2020. "Targeted Random Projection for Prediction From High-Dimensional Features," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 115(532), pages 1998-2010, December.
    5. Mogliani, Matteo & Simoni, Anna, 2021. "Bayesian MIDAS penalized regressions: Estimation, selection, and prediction," Journal of Econometrics, Elsevier, vol. 222(1), pages 833-860.
    6. Shi, Guiling & Lim, Chae Young & Maiti, Tapabrata, 2019. "Model selection using mass-nonlocal prior," Statistics & Probability Letters, Elsevier, vol. 147(C), pages 36-44.
    7. Dimitris Korobilis & Kenichi Shimizu, 2022. "Bayesian Approaches to Shrinkage and Sparse Estimation," Foundations and Trends(R) in Econometrics, now publishers, vol. 11(4), pages 230-354, June.
    8. Tanin Sirimongkolkasem & Reza Drikvandi, 2019. "On Regularisation Methods for Analysis of High Dimensional Data," Annals of Data Science, Springer, vol. 6(4), pages 737-763, December.
    9. van Erp, Sara & Oberski, Daniel L. & Mulder, Joris, 2018. "Shrinkage priors for Bayesian penalized regression," OSF Preprints cg8fq, Center for Open Science.
    10. Diego Vidaurre & Concha Bielza & Pedro Larrañaga, 2013. "A Survey of L1 Regression," International Statistical Review, International Statistical Institute, vol. 81(3), pages 361-387, December.
    11. Qifan Song & Guang Cheng, 2020. "Bayesian Fusion Estimation via t Shrinkage," Sankhya A: The Indian Journal of Statistics, Springer;Indian Statistical Institute, vol. 82(2), pages 353-385, August.
    12. Matthew Gentzkow & Bryan T. Kelly & Matt Taddy, 2017. "Text as Data," NBER Working Papers 23276, National Bureau of Economic Research, Inc.
    13. Taha Alshaybawee & Habshah Midi & Rahim Alhamzawi, 2017. "Bayesian elastic net single index quantile regression," Journal of Applied Statistics, Taylor & Francis Journals, vol. 44(5), pages 853-871, April.
    14. Anindya Bhadra & Jyotishka Datta & Nicholas G. Polson & Brandon T. Willard, 2021. "The Horseshoe-Like Regularization for Feature Subset Selection," Sankhya B: The Indian Journal of Statistics, Springer;Indian Statistical Institute, vol. 83(1), pages 185-214, May.
    15. Sandra Stankiewicz, 2015. "Forecasting Euro Area Macroeconomic Variables with Bayesian Adaptive Elastic Net," Working Paper Series of the Department of Economics, University of Konstanz 2015-12, Department of Economics, University of Konstanz.
    16. Niloy Biswas & Anirban Bhattacharya & Pierre E. Jacob & James E. Johndrow, 2022. "Coupling‐based convergence assessment of some Gibbs samplers for high‐dimensional Bayesian regression with shrinkage priors," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 84(3), pages 973-996, July.
    17. Tutz, Gerhard & Pößnecker, Wolfgang & Uhlmann, Lorenz, 2015. "Variable selection in general multinomial logit models," Computational Statistics & Data Analysis, Elsevier, vol. 82(C), pages 207-222.
    18. Fan, Jianqing & Jiang, Bai & Sun, Qiang, 2022. "Bayesian factor-adjusted sparse regression," Journal of Econometrics, Elsevier, vol. 230(1), pages 3-19.
    19. Yize Zhao & Matthias Chung & Brent A. Johnson & Carlos S. Moreno & Qi Long, 2016. "Hierarchical Feature Selection Incorporating Known and Novel Biological Information: Identifying Genomic Features Related to Prostate Cancer Recurrence," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 111(516), pages 1427-1439, October.
    20. Hauzenberger, Niko, 2021. "Flexible Mixture Priors for Large Time-varying Parameter Models," Econometrics and Statistics, Elsevier, vol. 20(C), pages 87-108.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:csdana:v:174:y:2022:i:c:s0167947322000305. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/csda.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.