
Bayesian evaluation of effect size after replicating an original study

Authors

  • Robbie C M van Aert
  • Marcel A L M van Assen

Abstract

The vast majority of published results in the literature are statistically significant, which raises concerns about their reliability. The Reproducibility Project: Psychology (RPP) and the Experimental Economics Replication Project (EE-RP) each replicated a large number of published studies in psychology and economics. Both the original study and its replication were statistically significant for 36.1% of the study pairs in RPP and 68.8% in EE-RP, suggesting many null effects among the replicated studies. However, evidence in favor of the null hypothesis cannot be examined with null hypothesis significance testing. We developed a Bayesian meta-analysis method, called snapshot hybrid, that is easy to use and understand and that quantifies the amount of evidence in favor of a zero, small, medium, and large effect. The method computes posterior model probabilities for these four effect sizes and adjusts for publication bias by taking into account that the original study is statistically significant. We first analytically approximate the method's performance and demonstrate that controlling for the original study's significance is necessary to enable the accumulation of evidence for a true zero effect. We then apply the method to the data of RPP and EE-RP, showing that the underlying effect sizes of the studies included in EE-RP are generally larger than those in RPP, but that the sample sizes, especially of the studies included in RPP, are often too small to draw definite conclusions about the true effect size. We also illustrate how snapshot hybrid can be used to determine the required sample size of a replication, akin to a power analysis in null hypothesis significance testing, and present an easy-to-use web application (https://rvanaert.shinyapps.io/snapshot/) and R code for applying the method.
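
The abstract describes the computation only verbally, so a minimal sketch may help. The R code below illustrates the general idea, not the authors' implementation: the snapshot values (zero, small, medium, and large effects taken here as correlations of 0, 0.1, 0.3, and 0.5), the Fisher-z analysis scale, the equal prior model probabilities, and the two-sided significance threshold used for the truncation are all assumptions made for this example; the linked web application and the authors' R code implement the actual method.

```r
# Minimal sketch (not the authors' implementation) of the snapshot idea:
# posterior model probabilities for a zero, small, medium, and large true
# effect, combining an original study and a replication. Correlations are
# analyzed on the Fisher-z scale; the original study's likelihood is
# truncated at the significance threshold to adjust for publication bias.
snapshot_sketch <- function(r_o, n_o, r_r, n_r, alpha = 0.05,
                            snapshots = c(zero = 0, small = 0.1,
                                          medium = 0.3, large = 0.5)) {
  y_o <- atanh(r_o); se_o <- 1 / sqrt(n_o - 3)   # Fisher-z transform
  y_r <- atanh(r_r); se_r <- 1 / sqrt(n_r - 3)
  zcut <- qnorm(1 - alpha / 2) * se_o            # assumed two-sided threshold

  lik <- sapply(atanh(snapshots), function(mu) {
    # Original study: normal density truncated below at the critical value,
    # i.e. conditioned on the original study being statistically significant.
    L_o <- dnorm(y_o, mu, se_o) / (1 - pnorm(zcut, mu, se_o))
    # Replication: ordinary (untruncated) normal likelihood.
    L_r <- dnorm(y_r, mu, se_r)
    L_o * L_r
  })
  lik / sum(lik)   # equal prior probabilities over the four snapshots
}

# Hypothetical example: significant original (r = .40, n = 40) and a
# much smaller replication estimate (r = .05, n = 120).
round(snapshot_sketch(r_o = 0.40, n_o = 40, r_r = 0.05, n_r = 120), 3)
```

In this sketch, dividing the original study's density by the probability of obtaining a significant result is what adjusts for publication bias; without that truncation, a significant original study would keep pulling posterior probability away from the zero-effect snapshot even when the replication finds nothing.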

Suggested Citation

  • Robbie C M van Aert & Marcel A L M van Assen, 2017. "Bayesian evaluation of effect size after replicating an original study," PLOS ONE, Public Library of Science, vol. 12(4), pages 1-23, April.
  • Handle: RePEc:plo:pone00:0175302
    DOI: 10.1371/journal.pone.0175302

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0175302
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0175302&type=printable
    Download Restriction: no


    References listed on IDEAS

    1. C. Glenn Begley & Lee M. Ellis, 2012. "Raise standards for preclinical cancer research," Nature, Nature, vol. 483(7391), pages 531-533, March.
    2. Daniele Fanelli, 2010. "Do Pressures to Publish Increase Scientists' Bias? An Empirical Support from US States Data," PLOS ONE, Public Library of Science, vol. 5(4), pages 1-7, April.
    3. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jurgen & Johannesson, Magnus & Kirchler, Michael & Almenberg, Johan & Altmejd, Adam & Chan, Taizan & Heikensten, Emma & Holzmeist, 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.
    4. Alexander Etz & Joachim Vandekerckhove, 2016. "A Bayesian Perspective on the Reproducibility Project: Psychology," PLOS ONE, Public Library of Science, vol. 11(2), pages 1-12, February.
    5. Daniele Fanelli, 2012. "Negative results are disappearing from most disciplines and countries," Scientometrics, Springer; Akadémiai Kiadó, vol. 90(3), pages 891-904, March.
    Full references (including those not matched with items on IDEAS)

    Citations



    Cited by:

    1. Muradchanian, Jasmine & Hoekstra, Rink & Kiers, Henk & van Ravenzwaaij, Don, 2020. "How Best to Quantify Replication Success? A Simulation Study on the Comparison of Replication Success Metrics," MetaArXiv wvdjf, Center for Open Science.
    2. Leonhard Held, 2020. "A new standard for the analysis and design of replication studies," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 183(2), pages 431-448, February.
    3. Larry V. Hedges & Jacob M. Schauer, 2019. "More Than One Replication Study Is Needed for Unambiguous Tests of Replication," Journal of Educational and Behavioral Statistics, vol. 44(5), pages 543-570, October.
    4. van Aert, Robbie Cornelis Maria, 2018. "Dissertation R.C.M. van Aert," MetaArXiv eqhjd, Center for Open Science.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Colin F. Camerer & Anna Dreber & Felix Holzmeister & Teck-Hua Ho & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Gideon Nave & Brian A. Nosek & Thomas Pfeiffer & Adam Altmejd & Nick Buttrick , 2018. "Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015," Nature Human Behaviour, Nature, vol. 2(9), pages 637-644, September.
    2. Oliver Braganza, 2020. "A simple model suggesting economically rational sample-size choice drives irreproducibility," PLOS ONE, Public Library of Science, vol. 15(3), pages 1-19, March.
    3. Mark D Lindner & Richard K Nakamura, 2015. "Examining the Predictive Validity of NIH Peer Review Scores," PLOS ONE, Public Library of Science, vol. 10(6), pages 1-12, June.
    4. Mark J. McCabe & Frank Mueller-Langer, 2019. "Does Data Disclosure Increase Citations? Empirical Evidence from a Natural Experiment in Leading Economics Journals," JRC Working Papers on Digital Economy 2019-02, Joint Research Centre.
    5. Fanelli, Daniele, 2020. "Metascientific reproducibility patterns revealed by informatic measure of knowledge," MetaArXiv 5vnhj, Center for Open Science.
    6. Kiri, Bralind & Lacetera, Nicola & Zirulia, Lorenzo, 2018. "Above a swamp: A theory of high-quality scientific production," Research Policy, Elsevier, vol. 47(5), pages 827-839.
    7. Mueller-Langer, Frank & Fecher, Benedikt & Harhoff, Dietmar & Wagner, Gert G., 2019. "Replication studies in economics—How many and which papers are chosen for replication, and why?," Research Policy, Elsevier, vol. 48(1), pages 62-83.
    8. Daniele Fanelli & Rodrigo Costas & Vincent Larivière, 2015. "Misconduct Policies, Academic Culture and Career Stage, Not Gender or Pressures to Publish, Affect Scientific Integrity," PLOS ONE, Public Library of Science, vol. 10(6), pages 1-18, June.
    9. Baltussen, Guido & Swinkels, Laurens & Van Vliet, Pim, 2021. "Global factor premiums," Journal of Financial Economics, Elsevier, vol. 142(3), pages 1128-1154.
    10. Fecher, Benedikt & Fräßdorf, Mathis & Hebing, Marcel & Wagner, Gert G., 2017. "Replikationen, Reputation und gute wissenschaftliche Praxis [Replications, reputation, and good scientific practice]," EconStor Open Access Articles and Book Chapters, ZBW - Leibniz Information Centre for Economics, vol. 68(2-3), pages 154-158.
    11. Hladchenko, Myroslava & Moed, Henk F., 2021. "The effect of publication traditions and requirements in research assessment and funding policies upon the use of national journals in 28 post-socialist countries," Journal of Informetrics, Elsevier, vol. 15(4).
    12. Schweinsberg, Martin & Feldman, Michael & Staub, Nicola & van den Akker, Olmo R. & van Aert, Robbie C.M. & van Assen, Marcel A.L.M. & Liu, Yang & Althoff, Tim & Heer, Jeffrey & Kale, Alex & Mohamed, Z, 2021. "Same data, different conclusions: Radical dispersion in empirical results when independent analysts operationalize and test the same hypothesis," Organizational Behavior and Human Decision Processes, Elsevier, vol. 165(C), pages 228-249.
    13. Mathur, Maya B & VanderWeele, Tyler, 2018. "Statistical methods for evidence synthesis," Thesis Commons kd6ja, Center for Open Science.
    14. Tom Coupé & W. Robert Reed & Christian Zimmerman, 2021. "Paving the Road for Replications: Experimental Results from an Online Research Repository," Working Papers in Economics 21/09, University of Canterbury, Department of Economics and Finance.
    15. Sergio Copiello, 2020. "The alleged citation advantage of video abstracts may be a matter of self-citations and self-selection bias. Comment on “The impact of video abstract on citation counts” by Zong et al," Scientometrics, Springer; Akadémiai Kiadó, vol. 122(1), pages 751-757, January.
    16. van Aert, Robbie Cornelis Maria, 2018. "Dissertation R.C.M. van Aert," MetaArXiv eqhjd, Center for Open Science.
    17. Samuel Pawel & Leonhard Held, 2022. "The sceptical Bayes factor for the assessment of replication success," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 84(3), pages 879-911, July.
    18. Tierney, Warren & Hardy, Jay H. & Ebersole, Charles R. & Leavitt, Keith & Viganola, Domenico & Clemente, Elena Giulia & Gordon, Michael & Dreber, Anna & Johannesson, Magnus & Pfeiffer, Thomas & Uhlman, 2020. "Creative destruction in science," Organizational Behavior and Human Decision Processes, Elsevier, vol. 161(C), pages 291-309.
    19. Siobhan C Dongés & Jessica M D’Amico & Jane E Butler & Janet L Taylor, 2017. "The effects of cervical transcutaneous spinal direct current stimulation on motor pathways supplying the upper limb in humans," PLOS ONE, Public Library of Science, vol. 12(2), pages 1-20, February.
    20. Joeri K Tijdink & Anton C M Vergouwen & Yvo M Smulders, 2013. "Publication Pressure and Burn Out among Dutch Medical Professors: A Nationwide Survey," PLOS ONE, Public Library of Science, vol. 8(9), pages 1-6, September.


    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.