
The sceptical Bayes factor for the assessment of replication success

Author

Listed:
  • Samuel Pawel
  • Leonhard Held

Abstract

Replication studies are increasingly conducted but there is no established statistical criterion for replication success. We propose a novel approach combining reverse‐Bayes analysis with Bayesian hypothesis testing: a sceptical prior is determined for the effect size such that the original finding is no longer convincing in terms of a Bayes factor. This prior is then contrasted to an advocacy prior (the reference posterior of the effect size based on the original study), and replication success is declared if the replication data favour the advocacy over the sceptical prior at a higher level than the original data favoured the sceptical prior over the null hypothesis. The sceptical Bayes factor is the highest level where replication success can be declared. A comparison to existing methods reveals that the sceptical Bayes factor combines several notions of replicability: it ensures that both studies show sufficient evidence against the null and penalises incompatibility of their effect estimates. Analysis of asymptotic properties and error rates, as well as case studies from the Social Sciences Replication Project show the advantages of the method for the assessment of replicability.
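
To make the procedure concrete, the following is a minimal numerical sketch, assuming normal likelihoods for both effect estimates, a sceptical prior N(0, tau^2), and an advocacy prior equal to the flat-prior posterior from the original study, N(theta_o, se_o^2). The function names, the grid-search shortcut, and the illustrative inputs are our own and not the authors' implementation; in particular, the reverse-Bayes step is handled here by scanning tau rather than solving for it analytically.

    import numpy as np
    from scipy.stats import norm

    def bf_0s(theta_o, se_o, tau):
        """Bayes factor of H0: theta = 0 against the sceptical prior N(0, tau^2),
        based on the original estimate theta_o with standard error se_o."""
        return (norm.pdf(theta_o, loc=0.0, scale=se_o)
                / norm.pdf(theta_o, loc=0.0, scale=np.sqrt(se_o**2 + tau**2)))

    def bf_sa(theta_r, se_r, theta_o, se_o, tau):
        """Bayes factor of the sceptical prior against the advocacy prior
        N(theta_o, se_o^2), based on the replication estimate theta_r."""
        m_sceptic = norm.pdf(theta_r, loc=0.0, scale=np.sqrt(se_r**2 + tau**2))
        m_advocate = norm.pdf(theta_r, loc=theta_o, scale=np.sqrt(se_r**2 + se_o**2))
        return m_sceptic / m_advocate

    def sceptical_bf(theta_o, se_o, theta_r, se_r, n_grid=100_000):
        """Grid approximation to the sceptical Bayes factor: replication success
        at level gamma requires a sceptical prior with BF_0S = gamma (the
        reverse-Bayes step) and BF_SA <= gamma for the replication data; the
        sceptical Bayes factor is the smallest achievable gamma."""
        taus = np.linspace(1e-4, 10.0 * se_o, n_grid)
        gamma = bf_0s(theta_o, se_o, taus)  # level fixed by each sceptical prior
        success = bf_sa(theta_r, se_r, theta_o, se_o, taus) <= gamma
        return gamma[success].min() if success.any() else np.inf

    # Hypothetical inputs (ours, not from the paper): a borderline original
    # result and a compatible, similarly precise replication.
    print(sceptical_bf(theta_o=0.5, se_o=0.2, theta_r=0.4, se_r=0.15))  # ~0.3

Values below 1 favour replication success, and smaller values indicate success at a higher level; the paper works with exact solutions and also covers asymptotic properties and error rates, which this sketch does not attempt.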

Suggested Citation

  • Samuel Pawel & Leonhard Held, 2022. "The sceptical Bayes factor for the assessment of replication success," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 84(3), pages 879-911, July.
  • Handle: RePEc:bla:jorssb:v:84:y:2022:i:3:p:879-911
    DOI: 10.1111/rssb.12491

    Download full text from publisher

    File URL: https://doi.org/10.1111/rssb.12491
    Download Restriction: no

    File URL: https://libkey.io/10.1111/rssb.12491?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription

    References listed on IDEAS
    1. Leonhard Held, 2020. "A new standard for the analysis and design of replication studies," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 183(2), pages 431-448, February.
    2. Maya B. Mathur & Tyler J. VanderWeele, 2020. "New statistical metrics for multisite replication projects," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 183(3), pages 1145-1166, June.
    3. Maxime Derex & Marie-Pauline Beugin & Bernard Godelle & Michel Raymond, 2013. "Experimental evidence for the influence of group size on cultural complexity," Nature, Nature, vol. 503(7476), pages 389-391, November.
    4. Colin Camerer & Anna Dreber & Eskil Forsell & Teck-Hua Ho & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Johan Almenberg & Adam Altmejd & Taizan Chan & Emma Heikensten & Felix Holzmeister, 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.
    5. Alexander Etz & Joachim Vandekerckhove, 2016. "A Bayesian Perspective on the Reproducibility Project: Psychology," PLOS ONE, Public Library of Science, vol. 11(2), pages 1-12, February.
    6. Larry V. Hedges & Jacob M. Schauer, 2019. "More Than One Replication Study Is Needed for Unambiguous Tests of Replication," Journal of Educational and Behavioral Statistics, vol. 44(5), pages 543-570, October.
    7. Valen E. Johnson & Richard D. Payne & Tianying Wang & Alex Asher & Soutrik Mandal, 2017. "On the Reproducibility of Psychological Science," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 112(517), pages 1-10, January.
    8. Christopher Harms, 2019. "A Bayes Factor for Replications of ANOVA Results," The American Statistician, Taylor & Francis Journals, vol. 73(4), pages 327-339, October.
    9. Feng Liang & Rui Paulo & German Molina & Merlise A. Clyde & Jim O. Berger, 2008. "Mixtures of g Priors for Bayesian Variable Selection," Journal of the American Statistical Association, American Statistical Association, vol. 103, pages 410-423, March.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Colin F. Camerer & Anna Dreber & Felix Holzmeister & Teck-Hua Ho & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Gideon Nave & Brian A. Nosek & Thomas Pfeiffer & Adam Altmejd & Nick Buttrick, 2018. "Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015," Nature Human Behaviour, Nature, vol. 2(9), pages 637-644, September.
    2. Larry V. Hedges & Jacob M. Schauer, 2021. "The design of replication studies," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 184(3), pages 868-886, July.
    3. Dreber, Anna & Johannesson, Magnus, 2023. "A framework for evaluating reproducibility and replicability in economics," Ruhr Economic Papers 1055, RWI - Leibniz-Institut für Wirtschaftsforschung, Ruhr-University Bochum, TU Dortmund University, University of Duisburg-Essen.
    4. Oliver Braganza, 2020. "A simple model suggesting economically rational sample-size choice drives irreproducibility," PLOS ONE, Public Library of Science, vol. 15(3), pages 1-19, March.
    5. Fanelli, Daniele, 2020. "Metascientific reproducibility patterns revealed by informatic measure of knowledge," MetaArXiv 5vnhj, Center for Open Science.
    6. Freuli, Francesca & Held, Leonhard & Heyard, Rachel, 2022. "Replication Success under Questionable Research Practices - A Simulation Study," I4R Discussion Paper Series 2, The Institute for Replication (I4R).
    7. Muradchanian, Jasmine & Hoekstra, Rink & Kiers, Henk & van Ravenzwaaij, Don, 2020. "How Best to Quantify Replication Success? A Simulation Study on the Comparison of Replication Success Metrics," MetaArXiv wvdjf, Center for Open Science.
    8. Lawrence L. Kupper & Sandra L. Martin, 2022. "Replication study design: confidence intervals and commentary," Statistical Papers, Springer, vol. 63(5), pages 1577-1583, October.
    9. Jeff Miller & Rolf Ulrich, 2019. "The quest for an optimal alpha," PLOS ONE, Public Library of Science, vol. 14(1), pages 1-13, January.
    10. Mathur, Maya B & VanderWeele, Tyler, 2018. "Statistical methods for evidence synthesis," Thesis Commons kd6ja, Center for Open Science.
    11. Fanelli, Daniele, 2022. "The "Tau" of Science - How to Measure, Study, and Integrate Quantitative and Qualitative Knowledge," MetaArXiv 67sak, Center for Open Science.
    12. Freuli, Francesca & Held, Leonhard & Heyard, Rachel, 2022. "Replication success under questionable research practices – a simulation study," MetaArXiv s4b65, Center for Open Science.
    13. Forsell, Eskil & Viganola, Domenico & Pfeiffer, Thomas & Almenberg, Johan & Wilson, Brad & Chen, Yiling & Nosek, Brian A. & Johannesson, Magnus & Dreber, Anna, 2019. "Predicting replication outcomes in the Many Labs 2 study," Journal of Economic Psychology, Elsevier, vol. 75(PA).
    14. Robbie C M van Aert & Marcel A L M van Assen, 2017. "Bayesian evaluation of effect size after replicating an original study," PLOS ONE, Public Library of Science, vol. 12(4), pages 1-23, April.
    15. Domenico Giannone & Michele Lenza & Lucrezia Reichlin, 2011. "Market Freedom and the Global Recession," IMF Economic Review, Palgrave Macmillan; International Monetary Fund, vol. 59(1), pages 111-135, April.
    16. Alexander Frankel & Maximilian Kasy, 2022. "Which Findings Should Be Published?," American Economic Journal: Microeconomics, American Economic Association, vol. 14(1), pages 1-38, February.
    17. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
    18. Tom Coupé & W. Robert Reed, 2021. "Do Negative Replications Affect Citations?," Working Papers in Economics 21/14, University of Canterbury, Department of Economics and Finance.
    19. Mueller-Langer, Frank & Andreoli-Versbach, Patrick, 2018. "Open access to research data: Strategic delay and the ambiguous welfare effects of mandatory data disclosure," Information Economics and Policy, Elsevier, vol. 42(C), pages 20-34.
    20. Ons Jedidi & Jean Sébastien Pentecote, 2015. "Robust Signals for Banking Crises," Economics Bulletin, AccessEcon, vol. 35(3), pages 1617-1629.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:bla:jorssb:v:84:y:2022:i:3:p:879-911. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Wiley Content Delivery. General contact details of provider: https://edirc.repec.org/data/rssssea.html.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.