Printed from https://ideas.repec.org/a/plo/pone00/0149794.html

A Bayesian Perspective on the Reproducibility Project: Psychology

Author

Listed:
  • Alexander Etz
  • Joachim Vandekerckhove

Abstract

We revisit the results of the recent Reproducibility Project: Psychology by the Open Science Collaboration. We compute Bayes factors—a quantity that can be used to express comparative evidence for a hypothesis but also for the null hypothesis—for a large subset (N = 72) of the original papers and their corresponding replication attempts. In our computation, we take into account the likely scenario that publication bias had distorted the originally published results. Overall, 75% of studies gave qualitatively similar results in terms of the amount of evidence provided. However, the evidence was often weak (i.e., Bayes factor
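The Bayes factors described in the abstract compare how well the data are predicted under the alternative versus the null hypothesis. As a minimal sketch only—this is the standard default JZS Bayes factor for a one-sample t-test (Rouder et al., 2009), not the publication-bias-corrected ("mitigated") computation the paper itself uses—the snippet below evaluates BF10 from a t-statistic and sample size, assuming a unit-scale Cauchy prior on the standardized effect size. The function name `jzs_bf10` is illustrative.

```python
import numpy as np
from scipy import integrate


def jzs_bf10(t, n):
    """Default JZS Bayes factor BF10 for a one-sample t-test.

    Uses the g-prior mixture representation of the Cauchy prior
    (scale r = 1) on effect size, per Rouder et al. (2009, Eq. 1).
    """
    nu = n - 1  # degrees of freedom

    # Marginal likelihood of the data under H0 (up to a shared constant).
    m0 = (1 + t**2 / nu) ** (-(nu + 1) / 2)

    # Under H1, average the likelihood over g with an inverse-gamma(1/2, 1/2)
    # mixing density, which induces a Cauchy prior on the effect size.
    def integrand(g):
        return ((1 + n * g) ** -0.5
                * (1 + t**2 / ((1 + n * g) * nu)) ** (-(nu + 1) / 2)
                * (2 * np.pi) ** -0.5 * g ** -1.5 * np.exp(-1 / (2 * g)))

    m1, _ = integrate.quad(integrand, 0, np.inf)
    return m1 / m0
```

BF10 above 1 favors the alternative and below 1 favors the null; values between roughly 1/10 and 10 are conventionally read as weak-to-moderate evidence either way.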

Suggested Citation

  • Alexander Etz & Joachim Vandekerckhove, 2016. "A Bayesian Perspective on the Reproducibility Project: Psychology," PLOS ONE, Public Library of Science, vol. 11(2), pages 1-12, February.
  • Handle: RePEc:plo:pone00:0149794
    DOI: 10.1371/journal.pone.0149794

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0149794
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0149794&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0149794?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Colin F. Camerer & Anna Dreber & Felix Holzmeister & Teck-Hua Ho & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Gideon Nave & Brian A. Nosek & Thomas Pfeiffer & Adam Altmejd & Nick Buttrick, 2018. "Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015," Nature Human Behaviour, Nature, vol. 2(9), pages 637-644, September.
    2. Valentine, Kathrene D & Buchanan, Erin Michelle & Scofield, John E. & Beauchamp, Marshall T., 2017. "Beyond p-values: Utilizing Multiple Estimates to Evaluate Evidence," OSF Preprints 9hp7y, Center for Open Science.
    3. Fanelli, Daniele, 2022. "The "Tau" of Science - How to Measure, Study, and Integrate Quantitative and Qualitative Knowledge," MetaArXiv 67sak, Center for Open Science.
    4. Fanelli, Daniele, 2020. "Metascientific reproducibility patterns revealed by informatic measure of knowledge," MetaArXiv 5vnhj, Center for Open Science.
    5. Mathur, Maya B & VanderWeele, Tyler, 2018. "Statistical methods for evidence synthesis," Thesis Commons kd6ja, Center for Open Science.
    6. Minh-Hoang Nguyen & Tam-Tri Le & Hong-Kong To Nguyen & Manh-Toan Ho & Huyen T. Thanh Nguyen & Quan-Hoang Vuong, 2021. "Alice in Suicideland: Exploring the Suicidal Ideation Mechanism through the Sense of Connectedness and Help-Seeking Behaviors," IJERPH, MDPI, vol. 18(7), pages 1-24, April.
    7. Larry V. Hedges & Jacob M. Schauer, 2019. "More Than One Replication Study Is Needed for Unambiguous Tests of Replication," Journal of Educational and Behavioral Statistics, , vol. 44(5), pages 543-570, October.
    8. Zenker, Frank & Witte, Erich H., 2021. "The case for default point-H1-hypotheses: a theory-construction perspective," OSF Preprints zue4h, Center for Open Science.
    9. Samuel Pawel & Leonhard Held, 2022. "The sceptical Bayes factor for the assessment of replication success," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 84(3), pages 879-911, July.
    10. Larry V. Hedges & Jacob M. Schauer, 2021. "The design of replication studies," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 184(3), pages 868-886, July.
    11. Robbie C M van Aert & Marcel A L M van Assen, 2017. "Bayesian evaluation of effect size after replicating an original study," PLOS ONE, Public Library of Science, vol. 12(4), pages 1-23, April.
    12. Maya B. Mathur & Tyler J. VanderWeele, 2020. "New statistical metrics for multisite replication projects," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 183(3), pages 1145-1166, June.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0149794. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.