Printed from https://ideas.repec.org/p/lmu/muenar/78212.html

Predicting the replicability of social science lab experiments

Author

Listed:
  • Altmejd, Adam
  • Dreber, Anna
  • Forsell, Eskil
  • Huber, Jürgen
  • Imai, Taisuke
  • Johannesson, Magnus
  • Kirchler, Michael
  • Nave, Gideon
  • Camerer, Colin

Abstract

We measure how accurately the replication of experimental results can be predicted by black-box statistical models. With data from four large-scale replication projects in experimental psychology and economics, and techniques from machine learning, we train predictive models and study which variables drive predictable replication. The models predict binary replication with a cross-validated accuracy rate of 70% (AUC of 0.77) and estimate relative effect sizes with a Spearman rho of 0.38. The accuracy level is similar to market-aggregated beliefs of peer scientists [1, 2]. The predictive power is validated in a pre-registered out-of-sample test of the outcome of [3], where 71% (AUC of 0.73) of replications are predicted correctly and effect-size correlations amount to rho = 0.25. Basic features, such as the sample and effect sizes in original papers and whether reported effects are single-variable main effects or two-variable interactions, are predictive of successful replication. The models presented in this paper are simple tools for producing cheap, prognostic replicability metrics. These models could be useful in institutionalizing the evaluation of new findings and in guiding resources to those direct replications that are likely to be most informative.
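The abstract reports three evaluation metrics for the predictive models: binary accuracy, AUC, and a Spearman rank correlation on relative effect sizes. The sketch below shows how each metric is computed, using entirely synthetic data; the features (sample size, effect size, interaction flag) mirror those the abstract names as predictive, but the data-generating process, coefficients, and threshold are invented for illustration and are not the authors' model or data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: features loosely mirroring those the paper
# reports as predictive (original sample size, effect size, interaction flag).
n = 200
effect_size = rng.uniform(0.0, 1.0, n)
is_interaction = rng.integers(0, 2, n)

# Toy replication outcome: larger effects and main effects replicate more
# often (an assumption for this sketch, consistent with the abstract's claim
# that these basic features are predictive).
logit = 3.0 * effect_size - 1.0 * is_interaction - 0.8
prob = 1.0 / (1.0 + np.exp(-logit))
replicated = (rng.uniform(size=n) < prob).astype(int)

# A hypothetical predictor score built from the same features.
score = 3.0 * effect_size - 1.0 * is_interaction


def accuracy(y, s, threshold=0.0):
    """Fraction of binary outcomes classified correctly at a score threshold."""
    return float(np.mean((s > threshold).astype(int) == y))


def auc(y, s):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity."""
    order = np.argsort(s)
    ranks = np.empty(len(s))
    ranks[order] = np.arange(1, len(s) + 1)
    n_pos = int(y.sum())
    n_neg = len(y) - n_pos
    return float((ranks[y == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg))


def spearman_rho(a, b):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    ra = np.argsort(np.argsort(a))
    rb = np.argsort(np.argsort(b))
    return float(np.corrcoef(ra, rb)[0, 1])


print(f"accuracy = {accuracy(replicated, score, threshold=0.2):.2f}")
print(f"AUC      = {auc(replicated, score):.2f}")
print(f"rho      = {spearman_rho(effect_size, score):.2f}")
```

In the paper these metrics are computed under cross-validation (and, for the out-of-sample test, on held-out data); here they are shown in-sample purely to make the definitions concrete.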

Suggested Citation

  • Altmejd, Adam & Dreber, Anna & Forsell, Eskil & Huber, Jürgen & Imai, Taisuke & Johannesson, Magnus & Kirchler, Michael & Nave, Gideon & Camerer, Colin, 2019. "Predicting the replicability of social science lab experiments," Munich Reprints in Economics 78212, University of Munich, Department of Economics.
  • Handle: RePEc:lmu:muenar:78212

    Download full text from publisher

    File URL: https://epub.ub.uni-muenchen.de/78212/1/78212.pdf
    Download Restriction: no


    Citations

Citations are extracted by the CitEc Project.


    Cited by:

    1. Xingyu Li & Jiting Liu & Weijia Gao & Geoffrey L Cohen, 2024. "Challenging the N-Heuristic: Effect size, not sample size, predicts the replicability of psychological science," PLOS ONE, Public Library of Science, vol. 19(8), pages 1-15, August.
    2. Isager, Peder Mortvedt & van 't Veer, Anna Elisabeth & Lakens, Daniel, 2021. "Replication value as a function of citation impact and sample size," MetaArXiv knjea, Center for Open Science.
    3. Alexandru Marcoci & David P. Wilkinson & Ans Vercammen & Bonnie C. Wintle & Anna Lou Abatayo & Ernest Baskin & Henk Berkman & Erin M. Buchanan & Sara Capitán & Tabaré Capitán & Ginny Chan & Kent Jason, 2025. "Predicting the replicability of social and behavioural science claims in COVID-19 preprints," Nature Human Behaviour, Nature, vol. 9(2), pages 287-304, February.
    4. Felix Holzmeister & Magnus Johannesson & Colin F. Camerer & Yiling Chen & Teck-Hua Ho & Suzanne Hoogeveen & Juergen Huber & Noriko Imai & Taisuke Imai & Lawrence Jin & Michael Kirchler & Alexander Ly , 2025. "Examining the replicability of online experiments selected by a decision market," Nature Human Behaviour, Nature, vol. 9(2), pages 316-330, February.
    5. Vu, Patrick, 2024. "Why are replication rates so low?," Journal of Econometrics, Elsevier, vol. 245(1).
    6. Duncan Ermini Leaf, 2023. "Risk management in the use of published statistical results for policy decisions," Papers 2305.03205, arXiv.org, revised Aug 2024.
    7. Jindrich Matousek & Tomas Havranek & Zuzana Irsova, 2022. "Individual discount rates: a meta-analysis of experimental evidence," Experimental Economics, Springer;Economic Science Association, vol. 25(1), pages 318-358, February.
    8. Heyard, Rachel & Held, Leonhard, 2024. "Meta-regression to explain shrinkage and heterogeneity in large-scale replication projects," MetaArXiv e9nw2, Center for Open Science.
    9. Alipourfard, Nazanin & Arendt, Beatrix & Benjamin, Daniel Jacob & Benkler, Noam & Bishop, Michael Metcalf & Burstein, Mark & Bush, Martin & Caverlee, James & Chen, Yiling & Clark, Chae, 2021. "Systematizing Confidence in Open Research and Evidence (SCORE)," SocArXiv 46mnb, Center for Open Science.
    10. Adler, Susanne Jana & Röseler, Lukas & Schöniger, Martina Katharina, 2023. "A toolbox to evaluate the trustworthiness of published findings," Journal of Business Research, Elsevier, vol. 167(C).
    11. Diya Dou & Daniel T. L. Shek & Xiaoqin Zhu & Li Zhao, 2021. "Dimensionality of the Chinese CES-D: Is It Stable across Gender, Time, and Samples?," IJERPH, MDPI, vol. 18(22), pages 1-11, November.
    12. Jens Rommel & Meike Weltin, 2021. "Is There a Cult of Statistical Significance in Agricultural Economics?," Applied Economic Perspectives and Policy, John Wiley & Sons, vol. 43(3), pages 1176-1191, September.

    More about this item

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:lmu:muenar:78212. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

We have no bibliographic references for this item. You can help add them by using this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Tamilla Benkelberg (email available below). General contact details of provider: https://edirc.repec.org/data/vfmunde.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.