
Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015

Authors

  • Camerer, Colin
  • Dreber, Anna (Stockholm School of Economics)
  • Holzmeister, Felix (University of Innsbruck)
  • Ho, Teck Hua
  • Huber, Juergen
  • Johannesson, Magnus
  • Kirchler, Michael
  • Nave, Gideon
  • Nosek, Brian A. (University of Virginia)
  • Pfeiffer, Thomas (Massey University Auckland)

Abstract

Being able to replicate scientific findings is crucial for scientific progress. We replicate 21 systematically selected experimental studies in the social sciences published in Nature and Science between 2010 and 2015. The replications follow analysis plans reviewed by the original authors and pre-registered prior to the replications. The replications are high-powered, with sample sizes on average about five times higher than in the original studies. We find a significant effect in the same direction as the original study for 13 (62%) studies, and the effect size of the replications is on average about 50% of the original effect size. Replicability varies between 12 (57%) and 14 (67%) studies for complementary replicability indicators. Consistent with these results, the estimated true positive rate is 67% in a Bayesian analysis. The relative effect size of true positives is estimated to be 71%, suggesting that both false positives and inflated effect sizes of true positives contribute to imperfect reproducibility. Furthermore, we find that peer beliefs of replicability are strongly related to replicability, suggesting that the research community could predict which results would replicate and that failures to replicate were not the result of chance alone.
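As a reading aid for the abstract's headline numbers, the sketch below shows how the primary replicability indicator (a significant effect in the same direction as the original) and the relative effect size could be computed. It is a minimal illustration with invented numbers, not the authors' analysis code, and the conventional p < 0.05 threshold is an assumption about the significance criterion.

```python
# Minimal sketch of the primary replicability indicator and the relative
# effect size described in the abstract. All numbers are invented for
# illustration; a conventional p < 0.05 threshold is assumed.

# (original effect, replication effect, replication p-value) per study
studies = [
    (0.40, 0.22, 0.010),  # replicates: same sign, significant
    (0.35, 0.01, 0.800),  # fails: tiny effect, not significant
    (0.50, 0.30, 0.003),  # replicates
]

# Indicator: significant effect in the same direction as the original.
replicated = [orig * rep > 0 and p < 0.05 for orig, rep, p in studies]

# Relative effect size: replication effect / original effect.
rel_sizes = [rep / orig for orig, rep, _ in studies]

print(f"Replication rate: {sum(replicated) / len(studies):.0%}")           # 67%
print(f"Mean relative effect size: {sum(rel_sizes) / len(rel_sizes):.0%}")  # 39%
```

The paper complements these indicators with prediction-market beliefs and a Bayesian estimate of the true positive rate, which this sketch does not attempt to reproduce.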

Suggested Citation

  • Camerer, Colin & Dreber, Anna & Holzmeister, Felix & Ho, Teck Hua & Huber, Juergen & Johannesson, Magnus & Kirchler, Michael & Nave, Gideon & Nosek, Brian A. & Pfeiffer, Thomas, 2018. "Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015," SocArXiv 4hmb6, Center for Open Science.
  • Handle: RePEc:osf:socarx:4hmb6
    DOI: 10.31219/osf.io/4hmb6

    Download full text from publisher

    File URL: https://osf.io/download/5b7ef2af8c6d9f001baeb8b3/
    Download Restriction: no

    File URL: https://libkey.io/10.31219/osf.io/4hmb6?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a source you can access through your library subscription

    References listed on IDEAS

    1. John P A Ioannidis, 2005. "Why Most Published Research Findings Are False," PLOS Medicine, Public Library of Science, vol. 2(8), pages 1-1, August.
    2. Daniel J. Benjamin & James O. Berger & Magnus Johannesson & Brian A. Nosek & E.-J. Wagenmakers & Richard Berk & Kenneth A. Bollen & Björn Brembs & Lawrence Brown & Colin Camerer & David Cesarini & Chr, 2018. "Redefine statistical significance," Nature Human Behaviour, Nature, vol. 2(1), pages 6-10, January.
      • Daniel Benjamin & James Berger & Magnus Johannesson & Brian Nosek & E. Wagenmakers & Richard Berk & Kenneth Bollen & Bjorn Brembs & Lawrence Brown & Colin Camerer & David Cesarini & Christopher Chambe, 2017. "Redefine Statistical Significance," Artefactual Field Experiments 00612, The Field Experiments Website.
    3. Maxime Derex & Marie-Pauline Beugin & Bernard Godelle & Michel Raymond, 2013. "Experimental evidence for the influence of group size on cultural complexity," Nature, Nature, vol. 503(7476), pages 389-391, November.
    4. David G. Rand & Joshua D. Greene & Martin A. Nowak, 2012. "Spontaneous giving and calculated greed," Nature, Nature, vol. 489(7416), pages 427-430, September.
    5. Gelman, Andrew & Stern, Hal, 2006. "The Difference Between “Significant” and “Not Significant” Is Not Itself Statistically Significant," The American Statistician, American Statistical Association, vol. 60, pages 328-331, November.
    6. Zacharias Maniadis & Fabio Tufano & John A. List, 2014. "One Swallow Doesn't Make a Summer: New Evidence on Anchoring Effects," American Economic Review, American Economic Association, vol. 104(1), pages 277-290, January.
    7. Akihiro Nishi & Hirokazu Shirado & David G. Rand & Nicholas A. Christakis, 2015. "Inequality and visibility of wealth in experimental social networks," Nature, Nature, vol. 526(7573), pages 426-429, October.
    8. C. Glenn Begley & Lee M. Ellis, 2012. "Raise standards for preclinical cancer research," Nature, Nature, vol. 483(7391), pages 531-533, March.
    9. Leonard P Freedman & Iain M Cockburn & Timothy S Simcoe, 2015. "The Economics of Reproducibility in Preclinical Research," PLOS Biology, Public Library of Science, vol. 13(6), pages 1-9, June.
    10. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jurgen & Johannesson, Magnus & Kirchler, Michael & Almenberg, Johan & Altmejd, Adam & Chan, Taizan & Heikensten, Emma & Holzmeist, 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.
    11. Alexander Etz & Joachim Vandekerckhove, 2016. "A Bayesian Perspective on the Reproducibility Project: Psychology," PLOS ONE, Public Library of Science, vol. 11(2), pages 1-12, February.
    12. Jesse Chandler et al., 2016. "Response to Comment on "Estimating the Reproducibility of Psychological Science"," Mathematica Policy Research Reports cff9c2f16bb544c4bcca530c0, Mathematica Policy Research.
    13. Nosek, Brian A. & Ebersole, Charles R. & DeHaven, Alexander Carl & Mellor, David Thomas, 2018. "The Preregistration Revolution," OSF Preprints 2dxu5, Center for Open Science.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jurgen & Johannesson, Magnus & Kirchler, Michael & Almenberg, Johan & Altmejd, Adam & Chan, Taizan & Heikensten, Emma & Holzmeist, 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.
    2. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    3. Muradchanian, Jasmine & Hoekstra, Rink & Kiers, Henk & van Ravenzwaaij, Don, 2020. "How Best to Quantify Replication Success? A Simulation Study on the Comparison of Replication Success Metrics," MetaArXiv wvdjf, Center for Open Science.
    4. Mueller-Langer, Frank & Fecher, Benedikt & Harhoff, Dietmar & Wagner, Gert G., 2019. "Replication studies in economics—How many and which papers are chosen for replication, and why?," Research Policy, Elsevier, vol. 48(1), pages 62-83.
    5. Williams, Cole Randall, 2019. "How redefining statistical significance can worsen the replication crisis," Economics Letters, Elsevier, vol. 181(C), pages 65-69.
    6. Strømland, Eirik & Torsvik, Gaute, 2019. "Intuitive Prosociality: Heterogeneous Treatment Effects or False Positive?," OSF Preprints hrx2y, Center for Open Science.
    7. John A. List & Azeem M. Shaikh & Yang Xu, 2019. "Multiple hypothesis testing in experimental economics," Experimental Economics, Springer;Economic Science Association, vol. 22(4), pages 773-793, December.
    8. Kiri, Bralind & Lacetera, Nicola & Zirulia, Lorenzo, 2018. "Above a swamp: A theory of high-quality scientific production," Research Policy, Elsevier, vol. 47(5), pages 827-839.
    9. Baltussen, Guido & Swinkels, Laurens & Van Vliet, Pim, 2021. "Global factor premiums," Journal of Financial Economics, Elsevier, vol. 142(3), pages 1128-1154.
    10. Bernhard Voelkl & Lucile Vogt & Emily S Sena & Hanno Würbel, 2018. "Reproducibility of preclinical animal research improves with heterogeneity of study samples," PLOS Biology, Public Library of Science, vol. 16(2), pages 1-13, February.
    11. Felix Holzmeister & Magnus Johannesson & Robert Böhm & Anna Dreber & Jürgen Huber & Michael Kirchler, 2023. "Heterogeneity in effect size estimates: Empirical evidence and practical implications," Working Papers 2023-17, Faculty of Economics and Statistics, Universität Innsbruck.
    12. Chin, Jason & Zeiler, Kathryn, 2021. "Replicability in Empirical Legal Research," LawArXiv 2b5k4, Center for Open Science.
    13. Hannah Fraser & Tim Parker & Shinichi Nakagawa & Ashley Barnett & Fiona Fidler, 2018. "Questionable research practices in ecology and evolution," PLOS ONE, Public Library of Science, vol. 13(7), pages 1-16, July.
    14. Leonhard Held, 2020. "A new standard for the analysis and design of replication studies," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 183(2), pages 431-448, February.
    15. Strømland, Eirik, 2019. "Preregistration and reproducibility," Journal of Economic Psychology, Elsevier, vol. 75(PA).
    16. Marquardt, Philipp & Noussair, Charles N & Weber, Martin, 2019. "Rational expectations in an experimental asset market with shocks to market trends," European Economic Review, Elsevier, vol. 114(C), pages 116-140.
    17. Adler, Susanne Jana & Röseler, Lukas & Schöniger, Martina Katharina, 2023. "A toolbox to evaluate the trustworthiness of published findings," Journal of Business Research, Elsevier, vol. 167(C).
    18. Robbie C M van Aert & Marcel A L M van Assen, 2017. "Bayesian evaluation of effect size after replicating an original study," PLOS ONE, Public Library of Science, vol. 12(4), pages 1-23, April.
    19. Beau Coker & Cynthia Rudin & Gary King, 2021. "A Theory of Statistical Inference for Ensuring the Robustness of Scientific Results," Management Science, INFORMS, vol. 67(10), pages 6174-6197, October.
    20. Michael Kirchler & David Andersson & Caroline Bonn & Magnus Johannesson & Erik Ø. Sørensen & Matthias Stefan & Gustav Tinghög & Daniel Västfjäll, 2017. "The effect of fast and slow decisions on risk taking," Journal of Risk and Uncertainty, Springer, vol. 54(1), pages 37-59, February.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:osf:socarx:4hmb6. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: OSF (email available below). General contact details of provider: https://arabixiv.org.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.