
How Best to Quantify Replication Success? A Simulation Study on the Comparison of Replication Success Metrics

Author

Listed:
  • Muradchanian, Jasmine
  • Hoekstra, Rink
  • Kiers, Henk
  • van Ravenzwaaij, Don

    (University of Groningen)

Abstract

To overcome the frequently debated crisis of confidence, replicating studies is becoming increasingly common. Multiple frequentist and Bayesian measures have been proposed to evaluate whether a replication is successful, but little is known about which method best captures replication success. We studied this question in a simulation study, comparing a number of quantitative measures of replication success with respect to their ability to draw the correct inference when the underlying truth is known, while taking publication bias into account. Our results show that Bayesian metrics seem to slightly outperform frequentist metrics across the board. Meta-analytic approaches generally seem to outperform metrics that evaluate single studies, except under extreme publication bias, where this pattern reverses.
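
The simulation design summarized in the abstract can be made concrete. Below is a minimal sketch in Python, written for this summary rather than taken from the paper: a frequentist single-study criterion (a significant replication effect in the original direction) is scored against a simple Bayesian criterion (a BIC-approximated Bayes factor, following Wagenmakers's 2007 approximation, with a success threshold of 3), while a crude publication-bias filter admits only significant, positive original results into the literature. The sample size, thresholds, and choice of metrics are all illustrative assumptions; the paper compares a broader set of metrics.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2020)

    def simulate(true_d, n=50, bias=True, reps=1000):
        # Proportion of correct calls for a frequentist and a Bayesian
        # single-study replication criterion, given a true effect size
        # true_d (Cohen's d) and n observations per group.
        freq_correct = bayes_correct = done = 0
        truth = true_d > 0  # is there really an effect to find?
        while done < reps:
            # Original two-sample study.
            t_orig, p_orig = stats.ttest_ind(rng.normal(true_d, 1, n),
                                             rng.normal(0, 1, n))
            # Publication bias: only significant, positive originals are
            # "published" and hence selected for replication.
            if bias and not (p_orig < 0.05 and t_orig > 0):
                continue
            # Replication with the same design.
            t_rep, p_rep = stats.ttest_ind(rng.normal(true_d, 1, n),
                                           rng.normal(0, 1, n))
            # Frequentist criterion: significant replication, same direction.
            freq_success = p_rep < 0.05 and t_rep > 0
            # Bayesian criterion: BIC-approximated Bayes factor BF10
            # (Wagenmakers, 2007); success if BF10 > 3 in the same direction.
            n_tot = 2 * n
            bf10 = np.exp((n_tot * np.log(1 + t_rep**2 / (n_tot - 2))
                           - np.log(n_tot)) / 2)
            bayes_success = bf10 > 3 and t_rep > 0
            freq_correct += freq_success == truth
            bayes_correct += bayes_success == truth
            done += 1
        return freq_correct / reps, bayes_correct / reps

    for d in (0.0, 0.5):
        f, b = simulate(true_d=d)
        print(f"true d = {d}: frequentist correct {f:.2f}, Bayesian correct {b:.2f}")

Under this setup, "correct" means declaring replication success when true_d > 0 and withholding it when true_d = 0; other metrics, including meta-analytic ones that pool the original and replication data, would slot in by replacing the two success rules inside the loop.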

Suggested Citation

  • Muradchanian, Jasmine & Hoekstra, Rink & Kiers, Henk & van Ravenzwaaij, Don, 2020. "How Best to Quantify Replication Success? A Simulation Study on the Comparison of Replication Success Metrics," MetaArXiv wvdjf, Center for Open Science.
  • Handle: RePEc:osf:metaar:wvdjf
    DOI: 10.31219/osf.io/wvdjf

    Download full text from publisher

    File URL: https://osf.io/download/5f2a9695b084f60289c9fdd1/
    Download Restriction: no

    File URL: https://libkey.io/10.31219/osf.io/wvdjf?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    References listed on IDEAS

    1. Leonhard Held, 2020. "A new standard for the analysis and design of replication studies," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 183(2), pages 431-448, February.
    2. John P A Ioannidis, 2005. "Why Most Published Research Findings Are False," PLOS Medicine, Public Library of Science, vol. 2(8), pages 1-1, August.
    3. Robbie C M van Aert & Marcel A L M van Assen, 2017. "Bayesian evaluation of effect size after replicating an original study," PLOS ONE, Public Library of Science, vol. 12(4), pages 1-23, April.
    4. Zacharias Maniadis & Fabio Tufano & John A. List, 2014. "One Swallow Doesn't Make a Summer: New Evidence on Anchoring Effects," American Economic Review, American Economic Association, vol. 104(1), pages 277-290, January.
    5. Andrew C. Chang & Phillip Li, 2015. "Is Economics Research Replicable? Sixty Published Papers from Thirteen Journals Say 'Usually Not'," Finance and Economics Discussion Series 2015-83, Board of Governors of the Federal Reserve System (U.S.).
    6. Colin F. Camerer & Anna Dreber & Felix Holzmeister & Teck-Hua Ho & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Gideon Nave & Brian A. Nosek & Thomas Pfeiffer & Adam Altmejd & Nick Buttrick , 2018. "Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015," Nature Human Behaviour, Nature, vol. 2(9), pages 637-644, September.
    7. Leonard P Freedman & Iain M Cockburn & Timothy S Simcoe, 2015. "The Economics of Reproducibility in Preclinical Research," PLOS Biology, Public Library of Science, vol. 13(6), pages 1-9, June.
    8. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jurgen & Johannesson, Magnus & Kirchler, Michael & Almenberg, Johan & Altmejd, Adam & Chan, Taizan & Heikensten, Emma & Holzmeist, 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Leon C Reteig & Lionel A Newman & K Richard Ridderinkhof & Heleen A Slagter, 2022. "Effects of tDCS on the attentional blink revisited: A statistical evaluation of a replication attempt," PLOS ONE, Public Library of Science, vol. 17(1), pages 1-23, January.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Colin F. Camerer & Anna Dreber & Felix Holzmeister & Teck-Hua Ho & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Gideon Nave & Brian A. Nosek & Thomas Pfeiffer & Adam Altmejd & Nick Buttrick , 2018. "Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015," Nature Human Behaviour, Nature, vol. 2(9), pages 637-644, September.
    2. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    3. Leonhard Held, 2020. "A new standard for the analysis and design of replication studies," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 183(2), pages 431-448, February.
    4. Williams, Cole Randall, 2019. "How redefining statistical significance can worsen the replication crisis," Economics Letters, Elsevier, vol. 181(C), pages 65-69.
    5. Strømland, Eirik & Torsvik, Gaute, 2019. "Intuitive Prosociality: Heterogeneous Treatment Effects or False Positive?," OSF Preprints hrx2y, Center for Open Science.
    6. Isaiah Andrews & Maximilian Kasy, 2019. "Identification of and Correction for Publication Bias," American Economic Review, American Economic Association, vol. 109(8), pages 2766-2794, August.
    7. Maurizio Canavari & Andreas C. Drichoutis & Jayson L. Lusk & Rodolfo M. Nayga, Jr., 2018. "How to run an experimental auction: A review of recent advances," Working Papers 2018-5, Agricultural University of Athens, Department Of Agricultural Economics.
    8. Bull, Charles & Courty, Pascal & Doyon, Maurice & Rondeau, Daniel, 2019. "Failure of the Becker–DeGroot–Marschak mechanism in inexperienced subjects: New tests of the game form misconception hypothesis," Journal of Economic Behavior & Organization, Elsevier, vol. 159(C), pages 235-253.
    9. John A. List & Azeem M. Shaikh & Yang Xu, 2019. "Multiple hypothesis testing in experimental economics," Experimental Economics, Springer;Economic Science Association, vol. 22(4), pages 773-793, December.
    10. Grabiszewski, Konrad & Horenstein, Alex, 2020. "Effort is not a monotonic function of skills: Results from a global mobile experiment," Journal of Economic Behavior & Organization, Elsevier, vol. 176(C), pages 634-652.
    11. Hensel, Przemysław G., 2021. "Reproducibility and replicability crisis: How management compares to psychology and economics – A systematic review of literature," European Management Journal, Elsevier, vol. 39(5), pages 577-594.
    12. Chin, Jason & Zeiler, Kathryn, 2021. "Replicability in Empirical Legal Research," LawArXiv 2b5k4, Center for Open Science.
    13. Strømland, Eirik, 2019. "Preregistration and reproducibility," Journal of Economic Psychology, Elsevier, vol. 75(PA).
    14. Marquardt, Philipp & Noussair, Charles N & Weber, Martin, 2019. "Rational expectations in an experimental asset market with shocks to market trends," European Economic Review, Elsevier, vol. 114(C), pages 116-140.
    15. Mueller-Langer, Frank & Fecher, Benedikt & Harhoff, Dietmar & Wagner, Gert G., 2019. "Replication studies in economics—How many and which papers are chosen for replication, and why?," Research Policy, Elsevier, vol. 48(1), pages 62-83.
    16. Stanley, T. D. & Doucouliagos, Chris, 2019. "Practical Significance, Meta-Analysis and the Credibility of Economics," IZA Discussion Papers 12458, Institute of Labor Economics (IZA).
    17. Oliver Braganza, 2020. "A simple model suggesting economically rational sample-size choice drives irreproducibility," PLOS ONE, Public Library of Science, vol. 15(3), pages 1-19, March.
    18. Adam Altmejd & Anna Dreber & Eskil Forsell & Juergen Huber & Taisuke Imai & Magnus Johannesson & Michael Kirchler & Gideon Nave & Colin Camerer, 2019. "Predicting the replicability of social science lab experiments," PLOS ONE, Public Library of Science, vol. 14(12), pages 1-18, December.
    19. Buffat, Justin & Praxmarer, Matthias & Sutter, Matthias, 2023. "The intrinsic value of decision rights: A replication and an extension to team decision making," Journal of Economic Behavior & Organization, Elsevier, vol. 209(C), pages 560-571.
    20. Jeff Miller & Rolf Ulrich, 2019. "The quest for an optimal alpha," PLOS ONE, Public Library of Science, vol. 14(1), pages 1-13, January.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:osf:metaar:wvdjf. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: OSF. General contact details of provider: https://osf.io/preprints/metaarxiv.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.