
Fast Hyperparameter Calibration of Sparsity Enforcing Penalties in Total Generalised Variation Penalised Reconstruction Methods for XCT Using a Planted Virtual Reference Image

Author

Listed:
  • Stéphane Chrétien

    (Laboratoire ERIC, Université Lyon 2, 5 Av. Pierre Mendès-France, 69676 Bron, France)

  • Camille Giampiccolo

    (Laboratoire de Mathématiques de Besançon, UFR Sciences et Techniques, Université de Bourgogne Franche-Comté, 16 Route de Gray, 25030 Besançon CEDEX, France)

  • Wenjuan Sun

    (National Physical Laboratory, Hampton Road, Teddington TW11 0LW, UK)

  • Jessica Talbott

    (National Physical Laboratory, Hampton Road, Teddington TW11 0LW, UK)

Abstract

The reconstruction problem in X-ray computed tomography (XCT) is notoriously difficult when only a small number of measurements are available. Based on the recently discovered Compressed Sensing paradigm, many methods have been proposed to address the reconstruction problem by leveraging the inherent sparsity of the object’s decomposition in various appropriate bases or dictionaries. In practice, reconstruction is usually achieved by incorporating weighted sparsity-enforcing penalisation functionals into the least-squares objective of the associated optimisation problem. One such penalisation functional is the Total Variation (TV) norm, which has been employed successfully since the early days of Compressed Sensing. Total Generalised Variation (TGV) is a recent improvement of this approach. One of the main advantages of such penalisation-based approaches is that the resulting optimisation problem is convex and, as such, cannot be affected by the possible existence of spurious solutions. Using the TGV penalisation nevertheless comes with the drawback of having to tune the two hyperparameters governing the TGV semi-norms. In this short note, we provide a simple and efficient recipe for fast hyperparameter tuning, based on the simple idea of virtually planting a mock image into the model. The proposed trick potentially applies to all linear inverse problems, under the assumption that relevant prior information is available about the sought-for solution, whilst being very different from the Bayesian method.
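
The planted-reference idea described in the abstract can be illustrated with a short, self-contained sketch. The snippet below is not the authors' implementation: it substitutes a smoothed one-dimensional Total Variation penalty for the full two-dimensional TGV semi-norm, uses a random Gaussian forward operator as a stand-in for the XCT projection matrix, and all names (reconstruct, smoothed_tv, x_ref, lambdas) are illustrative assumptions. It only shows the calibration mechanism: the sparsity weight is selected by reconstructing a virtual reference signal, whose ground truth is known, from simulated measurements, and the best-performing weight is then reused on the actual data.

    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)

    # Problem sizes and a random Gaussian forward operator (a stand-in for the
    # XCT projection matrix; illustrative assumption, not the paper's setup).
    n, m = 128, 60
    A = rng.normal(size=(m, n)) / np.sqrt(m)

    def smoothed_tv(x, eps=1e-6):
        # Charbonnier-smoothed 1-D Total Variation and its gradient.
        d = np.diff(x)
        w = np.sqrt(d**2 + eps)
        g = np.zeros_like(x)
        g[:-1] -= d / w
        g[1:] += d / w
        return w.sum(), g

    def reconstruct(y, lam):
        # Penalised least squares: min_x 0.5*||A x - y||^2 + lam * TV_eps(x).
        def objective(x):
            r = A @ x - y
            tv, tv_grad = smoothed_tv(x)
            return 0.5 * r @ r + lam * tv, A.T @ r + lam * tv_grad
        return minimize(objective, np.zeros(n), jac=True, method="L-BFGS-B").x

    # Planted virtual reference: a piecewise-constant signal assumed to share
    # the relevant prior structure of the unknown object (hypothetical example).
    x_ref = np.zeros(n)
    x_ref[30:60], x_ref[80:100] = 1.0, -0.5
    sigma = 0.05
    y_ref = A @ x_ref + sigma * rng.normal(size=m)  # simulated reference measurements

    # Calibrate the sparsity weight on the reference, where the truth is known.
    lambdas = np.logspace(-3, 0, 15)
    errors = [np.linalg.norm(reconstruct(y_ref, lam) - x_ref) for lam in lambdas]
    lam_star = lambdas[int(np.argmin(errors))]
    print(f"selected lambda = {lam_star:.4f}")

    # The calibrated weight is then reused on the actual measurements
    # (here simulated only so that the example runs end to end).
    y_data = A @ x_ref + sigma * rng.normal(size=m)
    x_hat = reconstruct(y_data, lam_star)

In the setting of the paper, the single weight lam would be replaced by the pair of hyperparameters governing the TGV semi-norms, searched over a two-dimensional grid in the same fashion.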

Suggested Citation

  • Stéphane Chrétien & Camille Giampiccolo & Wenjuan Sun & Jessica Talbott, 2021. "Fast Hyperparameter Calibration of Sparsity Enforcing Penalties in Total Generalised Variation Penalised Reconstruction Methods for XCT Using a Planted Virtual Reference Image," Mathematics, MDPI, vol. 9(22), pages 1-12, November.
  • Handle: RePEc:gam:jmathe:v:9:y:2021:i:22:p:2960-:d:683463

    Download full text from publisher

    File URL: https://www.mdpi.com/2227-7390/9/22/2960/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2227-7390/9/22/2960/
    Download Restriction: no


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Alexandre Belloni & Victor Chernozhukov & Kengo Kato, 2019. "Valid Post-Selection Inference in High-Dimensional Approximately Sparse Quantile Regression Models," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 114(526), pages 749-758, April.
    2. Chenchuan (Mark) Li & Ulrich K. Müller, 2021. "Linear regression with many controls of limited explanatory power," Quantitative Economics, Econometric Society, vol. 12(2), pages 405-442, May.
    3. Victor Chernozhukov & Whitney K. Newey & Victor Quintas-Martinez & Vasilis Syrgkanis, 2021. "Automatic Debiased Machine Learning via Riesz Regression," Papers 2104.14737, arXiv.org, revised Mar 2024.
    4. Guo, Xu & Li, Runze & Liu, Jingyuan & Zeng, Mudong, 2023. "Statistical inference for linear mediation models with high-dimensional mediators and application to studying stock reaction to COVID-19 pandemic," Journal of Econometrics, Elsevier, vol. 235(1), pages 166-179.
    5. Toshio Honda, 2021. "The de-biased group Lasso estimation for varying coefficient models," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 73(1), pages 3-29, February.
    6. Hansen, Christian & Liao, Yuan, 2019. "The Factor-Lasso And K-Step Bootstrap Approach For Inference In High-Dimensional Economic Applications," Econometric Theory, Cambridge University Press, vol. 35(3), pages 465-509, June.
    7. Alexandre Belloni & Victor Chernozhukov & Kengo Kato, 2013. "Uniform Post Selection Inference for LAD Regression and Other Z-estimation problems," Papers 1304.0282, arXiv.org, revised Oct 2020.
    8. Victor Chernozhukov & Denis Chetverikov & Mert Demirer & Esther Duflo & Christian Hansen & Whitney K. Newey, 2016. "Double machine learning for treatment and causal parameters," CeMMAP working papers 49/16, Institute for Fiscal Studies.
    9. Philipp Bach & Victor Chernozhukov & Malte S. Kurz & Martin Spindler & Sven Klaassen, 2021. "DoubleML -- An Object-Oriented Implementation of Double Machine Learning in R," Papers 2103.09603, arXiv.org, revised Jun 2024.
    10. Yumou Qiu & Jing Tao & Xiao‐Hua Zhou, 2021. "Inference of heterogeneous treatment effects using observational data with high‐dimensional covariates," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 83(5), pages 1016-1043, November.
    11. Semenova, Vira, 2023. "Debiased machine learning of set-identified linear models," Journal of Econometrics, Elsevier, vol. 235(2), pages 1725-1746.
    12. Celso Brunetti & Marc Joëts & Valérie Mignon, 2023. "Reasons Behind Words: OPEC Narratives and the Oil Market," Working Papers hal-04196053, HAL.
    13. Su, Miaomiao & Wang, Qihua, 2022. "A convex programming solution based debiased estimator for quantile with missing response and high-dimensional covariables," Computational Statistics & Data Analysis, Elsevier, vol. 168(C).
    14. Panxu Yuan & Xiao Guo, 2022. "High-dimensional inference for linear model with correlated errors," Metrika: International Journal for Theoretical and Applied Statistics, Springer, vol. 85(1), pages 21-52, January.
    15. Nicolas Städler & Sach Mukherjee, 2017. "Two-sample testing in high dimensions," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 79(1), pages 225-246, January.
    16. Agboola, Oluwagbenga David & Yu, Han, 2023. "Neighborhood-based cross fitting approach to treatment effects with high-dimensional data," Computational Statistics & Data Analysis, Elsevier, vol. 186(C).
    17. Fan, Jianqing & Feng, Yang & Xia, Lucy, 2020. "A projection-based conditional dependence measure with applications to high-dimensional undirected graphical models," Journal of Econometrics, Elsevier, vol. 218(1), pages 119-139.
    18. Guo, Xu & Li, Runze & Liu, Jingyuan & Zeng, Mudong, 2024. "Reprint: Statistical inference for linear mediation models with high-dimensional mediators and application to studying stock reaction to COVID-19 pandemic," Journal of Econometrics, Elsevier, vol. 239(2).
    19. Victor Chernozhukov & Christian Hansen & Martin Spindler, 2015. "Post-Selection and Post-Regularization Inference in Linear Models with Many Controls and Instruments," American Economic Review, American Economic Association, vol. 105(5), pages 486-490, May.
    20. Han, Dongxiao & Huang, Jian & Lin, Yuanyuan & Shen, Guohao, 2022. "Robust post-selection inference of high-dimensional mean regression with heavy-tailed asymmetric or heteroskedastic errors," Journal of Econometrics, Elsevier, vol. 230(2), pages 416-431.
