
Does preregistration improve the credibility of research findings?

Author

  • Rubin, Mark (The University of Newcastle, Australia)

Abstract

Preregistration entails researchers registering their planned research hypotheses, methods, and analyses in a time-stamped document before they undertake their data collection and analyses. This document is then made available with the published research report to allow readers to identify discrepancies between what the researchers originally planned to do and what they actually ended up doing. This historical transparency is supposed to facilitate judgments about the credibility of the research findings. The present article provides a critical review of 17 of the reasons behind this argument. The article covers issues such as HARKing, multiple testing, p-hacking, forking paths, optional stopping, researchers’ biases, selective reporting, test severity, publication bias, and replication rates. It is concluded that preregistration’s historical transparency does not facilitate judgments about the credibility of research findings when researchers provide contemporary transparency in the form of (a) clear rationales for current hypotheses and analytical approaches, (b) public access to research data, materials, and code, and (c) demonstrations of the robustness of research conclusions to alternative interpretations and analytical approaches.

Suggested Citation

  • Rubin, Mark, 2020. "Does preregistration improve the credibility of research findings?," MetaArXiv vgr89, Center for Open Science.
  • Handle: RePEc:osf:metaar:vgr89
    DOI: 10.31219/osf.io/vgr89

    Download full text from publisher

    File URL: https://osf.io/download/5f8e94188fc43a00638d71c8/
    Download Restriction: no

    File URL: https://libkey.io/10.31219/osf.io/vgr89?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a location where you can use your library subscription to access this item

    References listed on IDEAS

    1. Spanos, Aris, 2010. "Akaike-type criteria and the reliability of inference: Model selection versus statistical model specification," Journal of Econometrics, Elsevier, vol. 158(2), pages 204-220, October.
    2. Vancouver, Jeffrey B., 2018. "In Defense of HARKing," Industrial and Organizational Psychology, Cambridge University Press, vol. 11(1), pages 73-80, March.
    3. Nosek, Brian A. & Ebersole, Charles R. & DeHaven, Alexander Carl & Mellor, David Thomas, 2018. "The Preregistration Revolution," OSF Preprints 2dxu5, Center for Open Science.
    4. Leung, Kwok, 2011. "Presenting Post Hoc Hypotheses as A Priori: Ethical and Theoretical Issues," Management and Organization Review, Cambridge University Press, vol. 7(3), pages 471-479, November.
    5. Nancy Reid & David R. Cox, 2015. "On Some Principles of Statistical Inference," International Statistical Review, International Statistical Institute, vol. 83(2), pages 293-308, August.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Fanelli, Daniele, 2022. "The "Tau" of Science - How to Measure, Study, and Integrate Quantitative and Qualitative Knowledge," MetaArXiv 67sak, Center for Open Science.
    2. Aguilera-Cobos, Lorena & Rosario-Lozano, María Piedad & Ponce-Polo, Angela & Blasco-Amaro, Juan Antonio & Epstein, David, 2022. "Barriers for the evaluation of advanced therapy medicines and their translation to clinical practice: Umbrella review," Health Policy, Elsevier, vol. 126(12), pages 1248-1255.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Mattia Prosperi & Jiang Bian & Iain E. Buchan & James S. Koopman & Matthew Sperrin & Mo Wang, 2019. "Raiders of the lost HARK: a reproducible inference framework for big data science," Palgrave Communications, Palgrave Macmillan, vol. 5(1), pages 1-12, December.
    2. Bao, Te & Diks, Cees & Li, Hao, 2018. "A generalized CAPM model with asymmetric power distributed errors with an application to portfolio construction," Economic Modelling, Elsevier, vol. 68(C), pages 611-621.
    3. Vigren, Andreas & Pyddoke, Roger, 2020. "The impact on bus ridership of passenger incentive contracts in public transport," Transportation Research Part A: Policy and Practice, Elsevier, vol. 135(C), pages 144-159.
    4. Jasper Brinkerink, 2023. "When Shooting for the Stars Becomes Aiming for Asterisks: P-Hacking in Family Business Research," Entrepreneurship Theory and Practice, , vol. 47(2), pages 304-343, March.
    5. Hensel, Przemysław G., 2019. "Supporting replication research in management journals: Qualitative analysis of editorials published between 1970 and 2015," European Management Journal, Elsevier, vol. 37(1), pages 45-57.
    6. Elbæk, Christian T. & Lystbæk, Martin Nørhede & Mitkidis, Panagiotis, 2022. "On the psychology of bonuses: The effects of loss aversion and Yerkes-Dodson law on performance in cognitively and mechanically demanding tasks," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 98(C).
    7. Colin F. Camerer & Anna Dreber & Felix Holzmeister & Teck-Hua Ho & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Gideon Nave & Brian A. Nosek & Thomas Pfeiffer & Adam Altmejd & Nick Buttrick , 2018. "Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015," Nature Human Behaviour, Nature, vol. 2(9), pages 637-644, September.
    8. Bettina Bert & Céline Heinl & Justyna Chmielewska & Franziska Schwarz & Barbara Grune & Andreas Hensel & Matthias Greiner & Gilbert Schönfelder, 2019. "Refining animal research: The Animal Study Registry," PLOS Biology, Public Library of Science, vol. 17(10), pages 1-12, October.
    9. Spanos, Aris, 2010. "Statistical adequacy and the trustworthiness of empirical evidence: Statistical vs. substantive information," Economic Modelling, Elsevier, vol. 27(6), pages 1436-1452, November.
    10. Nathalie Percie du Sert & Viki Hurst & Amrita Ahluwalia & Sabina Alam & Marc T Avey & Monya Baker & William J Browne & Alejandra Clark & Innes C Cuthill & Ulrich Dirnagl & Michael Emerson & Paul Garne, 2020. "The ARRIVE guidelines 2.0: Updated guidelines for reporting animal research," PLOS Biology, Public Library of Science, vol. 18(7), pages 1-12, July.
    11. Benson Honig & Joseph Lampel & Donald Siegel & Paul Drnevich, 2014. "Ethics in the Production and Dissemination of Management Research: Institutional Failure or Individual Fallibility?," Journal of Management Studies, Wiley Blackwell, vol. 51(1), pages 118-142, January.
    12. Aris Spanos, 2016. "Transforming structural econometrics: substantive vs. statistical premises of inference," Review of Political Economy, Taylor & Francis Journals, vol. 28(3), pages 426-437, July.
    13. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    14. Reed, W. Robert, 2019. "Takeaways from the special issue on The Practice of Replication," Economics - The Open-Access, Open-Assessment E-Journal (2007-2020), Kiel Institute for the World Economy (IfW Kiel), vol. 13, pages 1-11.
    15. Anna Conte & Peter Moffatt, 2014. "The econometric modelling of social preferences," Theory and Decision, Springer, vol. 76(1), pages 119-145, January.
    16. Shuaijun Guo & Xiaoming Yu & Orkan Okan, 2020. "Moving Health Literacy Research and Practice towards a Vision of Equity, Precision and Transparency," IJERPH, MDPI, vol. 17(20), pages 1-14, October.
    17. Ori Katz & Eyal Zamir, 2021. "Do People Like Mandatory Rules? The Choice Between Disclosures, Defaults, and Mandatory Rules in Supplier‐Customer Relationships," Journal of Empirical Legal Studies, John Wiley & Sons, vol. 18(2), pages 421-460, June.
    18. Michaelides, Michael & Spanos, Aris, 2020. "On modeling heterogeneity in linear models using trend polynomials," Economic Modelling, Elsevier, vol. 85(C), pages 74-86.
    19. Andreas Fügener & Jörn Grahl & Alok Gupta & Wolfgang Ketter, 2022. "Cognitive Challenges in Human–Artificial Intelligence Collaboration: Investigating the Path Toward Productive Delegation," Information Systems Research, INFORMS, vol. 33(2), pages 678-696, June.
    20. P. Dorian Owen, 2017. "Evaluating Ingenious Instruments for Fundamental Determinants of Long-Run Economic Growth and Development," Econometrics, MDPI, vol. 5(3), pages 1-33, September.

    More about this item

    NEP fields

    This paper has been announced in the following NEP Reports:

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:osf:metaar:vgr89. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: OSF (email available below). General contact details of provider: https://osf.io/preprints/metaarxiv.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.