
A Primer on the ‘Reproducibility Crisis’ and Ways to Fix It

Author

Listed:
  • W. Robert Reed

Abstract

This article uses the framework of Ioannidis (2005) to organise a discussion of issues related to the ‘reproducibility crisis’. It then uses that framework to evaluate various proposals to fix the problem. Of particular interest is the ‘post-study probability’, the probability that a reported research finding represents a true relationship. This probability is inherently unknowable. However, a number of insightful results emerge if we are willing to make some conjectures about reasonable parameter values. Among other things, this analysis demonstrates the important role that replication can play in improving the signal value of empirical research.
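For readers who want the mechanics behind the abstract: the post-study probability is obtained by applying Bayes' rule to a conjectured prior probability that the tested relationship is true, together with the significance level and statistical power of the test. The Python sketch below is not taken from the article; the parameter values (prior = 0.10, alpha = 0.05, power = 0.80) are illustrative conjectures, and treating each significant replication as a Bayesian update of the prior is one simple way to model the replication effect the abstract describes.

    # Sketch of an Ioannidis-style post-study probability (PSP) calculation.
    # All parameter values are illustrative conjectures, not figures from Reed (2018).

    def post_study_probability(prior, alpha=0.05, power=0.80):
        """P(relationship is true | a statistically significant finding), by Bayes' rule."""
        true_positives = power * prior            # true relationships found significant
        false_positives = alpha * (1.0 - prior)   # false relationships found significant
        return true_positives / (true_positives + false_positives)

    def psp_after_replications(prior, k, alpha=0.05, power=0.80):
        """PSP after the original study plus k further significant replications."""
        p = prior
        for _ in range(k + 1):                    # each significant result updates the prior
            p = post_study_probability(p, alpha, power)
        return p

    if __name__ == "__main__":
        prior = 0.10  # conjectured share of tested relationships that are true
        print(f"PSP after original study:   {post_study_probability(prior):.3f}")
        for k in (1, 2):
            print(f"PSP after {k} replication(s): {psp_after_replications(prior, k):.3f}")

Under these conjectured values the post-study probability rises from roughly 0.64 after the original study to about 0.97 after one successful replication, which illustrates the signal-value role of replication highlighted in the abstract.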

Suggested Citation

  • W. Robert Reed, 2018. "A Primer on the ‘Reproducibility Crisis’ and Ways to Fix It," Australian Economic Review, The University of Melbourne, Melbourne Institute of Applied Economic and Social Research, vol. 51(2), pages 286-300, June.
  • Handle: RePEc:bla:ausecr:v:51:y:2018:i:2:p:286-300
    DOI: 10.1111/1467-8462.12262

    Download full text from publisher

    File URL: https://doi.org/10.1111/1467-8462.12262
    Download Restriction: no

    File URL: https://libkey.io/10.1111/1467-8462.12262?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. John P. A. Ioannidis, 2005. "Why Most Published Research Findings Are False," PLOS Medicine, Public Library of Science, vol. 2(8), page e124, August.
    2. Eric W. Djimeu & Deo-Gracias Houndolo, 2016. "Power calculation for causal inference in social science: sample size and minimum detectable effect determination," Journal of Development Effectiveness, Taylor & Francis Journals, vol. 8(4), pages 508-527, October.
    3. Jennifer L. Castle & Xiaochuan Qin & W. Robert Reed, 2013. "Using Model Selection Algorithms To Obtain Reliable Coefficient Estimates," Journal of Economic Surveys, Wiley Blackwell, vol. 27(2), pages 269-296, April.
    4. W. Robert Reed, 2017. "Replication in labor economics," IZA World of Labor, Institute of Labor Economics (IZA), article 413, December.
    5. Zacharias Maniadis & Fabio Tufano & John A. List, 2017. "To Replicate or Not To Replicate? Exploring Reproducibility in Economics through the Lens of a Model and a Pilot Study," Economic Journal, Royal Economic Society, vol. 127(605), pages 209-235, October.
    6. Lucas C. Coffman & Muriel Niederle, 2015. "Pre-analysis Plans Have Limited Upside, Especially Where Replications Are Feasible," Journal of Economic Perspectives, American Economic Association, vol. 29(3), pages 81-98, Summer.
    7. Maren Duvendack & Richard W. Palmer-Jones & W. Robert Reed, 2015. "Replications in Economics: A Progress Report," Econ Journal Watch, Econ Journal Watch, vol. 12(2), pages 164-191, May.
    8. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jürgen & Johannesson, Magnus & Kirchler, Michael & Almenberg, Johan & Altmejd, Adam & Chan, Taizan & Heikensten, Emma & Holzmeister, Felix, et al., 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Mueller-Langer, Frank & Fecher, Benedikt & Harhoff, Dietmar & Wagner, Gert G., 2019. "Replication studies in economics—How many and which papers are chosen for replication, and why?," EconStor Open Access Articles and Book Chapters, ZBW - Leibniz Information Centre for Economics, vol. 48(1), pages 62-83.
    2. Hensel, Przemysław G., 2021. "Reproducibility and replicability crisis: How management compares to psychology and economics – A systematic review of literature," European Management Journal, Elsevier, vol. 39(5), pages 577-594.
    3. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    4. So, Tony, 2020. "Classroom experiments as a replication device," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 86(C).
    5. Weili Ding, 2020. "Laboratory experiments can pre-design to address power and selection issues," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 6(2), pages 125-138, December.
    6. Luigi Butera & Philip Grossman & Daniel Houser & John List & Marie-Claire Villeval, 2020. "A New Mechanism to Alleviate the Crises of Confidence in Science - With an Application to the Public Goods Game," Artefactual Field Experiments 00684, The Field Experiments Website.
    7. Nicolas Vallois & Dorian Jullien, 2017. "Replication in Experimental Economics: A Historical and Quantitative Approach Focused on Public Good Game Experiments," GREDEG Working Papers 2017-21, Groupe de REcherche en Droit, Economie, Gestion (GREDEG CNRS), Université Côte d'Azur, France.
    8. Nicolas Vallois & Dorian Jullien, 2017. "Replication in experimental economics: A historical and quantitative approach focused on public good game experiments," Université Paris1 Panthéon-Sorbonne (Post-Print and Working Papers) halshs-01651080, HAL.
    9. Drazen, Allan & Dreber, Anna & Ozbay, Erkut Y. & Snowberg, Erik, 2021. "Journal-based replication of experiments: An application to “Being Chosen to Lead”," Journal of Public Economics, Elsevier, vol. 202(C).
    10. Strømland, Eirik, 2019. "Preregistration and reproducibility," Journal of Economic Psychology, Elsevier, vol. 75(PA).
    11. Lionel Page & Charles N. Noussair & Robert Slonim, 2021. "The replication crisis, the rise of new research practices and what it means for experimental economics," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 7(2), pages 210-225, December.
    12. Strømland, Eirik & Torsvik, Gaute, 2019. "Intuitive Prosociality: Heterogeneous Treatment Effects or False Positive?," OSF Preprints hrx2y, Center for Open Science.
    13. Kathryn N. Vasilaky & J. Michelle Brock, 2020. "Power(ful) guidelines for experimental economists," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 6(2), pages 189-212, December.
    14. Alexander Frankel & Maximilian Kasy, 2022. "Which Findings Should Be Published?," American Economic Journal: Microeconomics, American Economic Association, vol. 14(1), pages 1-38, February.
    15. Neyse, Levent & Fossen, Frank M. & Johannesson, Magnus & Dreber, Anna, 2023. "Cognitive reflection and 2D:4D: Evidence from a large population sample," Journal of Economic Behavior & Organization, Elsevier, vol. 209(C), pages 288-307.
    16. Stanley, T. D. & Doucouliagos, Chris, 2019. "Practical Significance, Meta-Analysis and the Credibility of Economics," IZA Discussion Papers 12458, Institute of Labor Economics (IZA).
    17. Burlig, Fiona, 2018. "Improving transparency in observational social science research: A pre-analysis plan approach," Economics Letters, Elsevier, vol. 168(C), pages 56-60.
    18. Lucas C. Coffman & Muriel Niederle & Alistair J. Wilson, 2017. "A Proposal to Organize and Promote Replications," American Economic Review, American Economic Association, vol. 107(5), pages 41-45, May.
    19. Roy Chen & Yan Chen & Yohanes E. Riyanto, 2021. "Best practices in replication: a case study of common information in coordination games," Experimental Economics, Springer;Economic Science Association, vol. 24(1), pages 2-30, March.
    20. Colin F. Camerer & Anna Dreber & Felix Holzmeister & Teck-Hua Ho & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Gideon Nave & Brian A. Nosek & Thomas Pfeiffer & Adam Altmejd & Nick Buttrick, et al., 2018. "Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015," Nature Human Behaviour, Nature, vol. 2(9), pages 637-644, September.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:bla:ausecr:v:51:y:2018:i:2:p:286-300. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Wiley Content Delivery (email available below). General contact details of provider: https://edirc.repec.org/data/mimelau.html.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.