
Research Registries: Facts, Myths, and Possible Improvements

Authors

  • Eliot Abrams
  • Jonathan Libgober
  • John A. List

Abstract

The past few decades have ushered in an experimental revolution in economics whereby scholars are now much more likely to generate their own data. While there are virtues associated with this movement, there are concomitant difficulties. Several scientific disciplines, including economics, have launched research registries in an effort to attenuate key inferential issues. This study assesses registries both empirically and theoretically, with a special focus on the AEA registry. We find that over 90% of randomized controlled trials (RCTs) in economics are never registered, that only about 50% of registered RCTs are registered before the intervention begins, and that the majority of these preregistrations are not detailed enough to significantly aid inference. Taken together, these figures imply that fewer than 5% of RCTs in economics are registered before they begin, and fewer still in enough detail to aid inference. Our empirical analysis further shows that using other scientific registries as aspirational examples is misguided, as their perceived success in tackling the main issues is largely a myth. In light of these facts, we advance a simple economic model to explore potential improvements. A key insight from the model is that removing the (current) option to register completed RCTs could increase the fraction of trials that register. We also argue that linking IRB applications to registrations could further increase registry effectiveness.

Suggested Citation

  • Eliot Abrams & Jonathan Libgober & John A. List, 2020. "Research Registries: Facts, Myths, and Possible Improvements," NBER Working Papers 27250, National Bureau of Economic Research, Inc.
  • Handle: RePEc:nbr:nberwo:27250
    Note: DEV ED PE

    Download full text from publisher

    File URL: http://www.nber.org/papers/w27250.pdf
    Download Restriction: no


    References listed on IDEAS

    1. Robinson, Joan, 1977. "What Are the Questions?," Journal of Economic Literature, American Economic Association, vol. 15(4), pages 1318-1339, December.
    2. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    3. Abhijit Banerjee & Esther Duflo & Amy Finkelstein & Lawrence F. Katz & Benjamin A. Olken & Anja Sautmann, 2020. "In Praise of Moderation: Suggestions for the Scope and Use of Pre-Analysis Plans for RCTs in Economics," NBER Working Papers 26993, National Bureau of Economic Research, Inc.
    4. Omar Al-Ubaydli & John List & Dana Suskind, 2019. "The science of using science: Towards an understanding of the threats to scaling experiments," Artefactual Field Experiments 00670, The Field Experiments Website.
    5. Rachael Meager, 2019. "Understanding the Average Impact of Microcredit Expansions: A Bayesian Hierarchical Analysis of Seven Randomized Experiments," American Economic Journal: Applied Economics, American Economic Association, vol. 11(1), pages 57-91, January.
    6. Eva Vivalt, 2019. "Specification Searching and Significance Inflation Across Time, Methods and Disciplines," Oxford Bulletin of Economics and Statistics, Department of Economics, University of Oxford, vol. 81(4), pages 797-816, August.
    7. Richard A. Bettis, 2012. "The search for asterisks: Compromised statistical tests and flawed theories," Strategic Management Journal, Wiley Blackwell, vol. 33(1), pages 108-113, January.
    8. Shuo Liu & Harry Pei, 2017. "Monotone equilibria in signalling games," ECON - Working Papers 252, Department of Economics - University of Zurich.
    9. Eva Vivalt, 2020. "How Much Can We Generalize From Impact Evaluations?," Journal of the European Economic Association, European Economic Association, vol. 18(6), pages 3045-3089.
    10. Michael D. Jennions & Anders Pape Møller, 2003. "A survey of the statistical power of research in behavioral ecology and animal behavior," Behavioral Ecology, International Society for Behavioral Ecology, vol. 14(3), pages 438-445, May.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Alec Brandon & Justin E. Holz & Andrew Simon & Haruka Uchida, 2023. "Minimum Wages and Racial Discrimination in Hiring: Evidence from a Field Experiment," Upjohn Working Papers 23-389, W.E. Upjohn Institute for Employment Research.
    2. Thibaut Arpinon & Romain Espinosa, 2023. "A practical guide to Registered Reports for economists," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 9(1), pages 90-122, June.
    3. John A. List & Ragan Petrie & Anya Samek, 2023. "How Experiments with Children Inform Economics," Journal of Economic Literature, American Economic Association, vol. 61(2), pages 504-564, June.
    4. Thibaut Arpinon & Romain Espinosa, 2023. "A Practical Guide to Registered Reports for Economists," Post-Print halshs-03897719, HAL.
    5. Brodeur, Abel & Cook, Nikolai M. & Hartley, Jonathan S. & Heyes, Anthony, 2023. "Do Pre-Registration and Pre-Analysis Plans Reduce p-Hacking and Publication Bias?: Evidence from 15,992 Test Statistics and Suggestions for Improvement," GLO Discussion Paper Series 1147 [pre.], Global Labor Organization (GLO).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    2. Stefano DellaVigna & Elizabeth Linos, 2022. "RCTs to Scale: Comprehensive Evidence From Two Nudge Units," Econometrica, Econometric Society, vol. 90(1), pages 81-116, January.
    3. Bruns, Stephan B. & Asanov, Igor & Bode, Rasmus & Dunger, Melanie & Funk, Christoph & Hassan, Sherif M. & Hauschildt, Julia & Heinisch, Dominik & Kempa, Karol & König, Johannes & Lips, Johannes & Verb, 2019. "Reporting errors and biases in published empirical findings: Evidence from innovation research," Research Policy, Elsevier, vol. 48(9), pages 1-1.
    4. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
    5. Brodeur, Abel & Cook, Nikolai M. & Hartley, Jonathan S. & Heyes, Anthony, 2023. "Do Pre-Registration and Pre-Analysis Plans Reduce p-Hacking and Publication Bias?: Evidence from 15,992 Test Statistics and Suggestions for Improvement," GLO Discussion Paper Series 1147 [pre.], Global Labor Organization (GLO).
    6. Hensel, Przemysław G., 2019. "Supporting replication research in management journals: Qualitative analysis of editorials published between 1970 and 2015," European Management Journal, Elsevier, vol. 37(1), pages 45-57.
    7. Graham Elliott & Nikolay Kudrin & Kaspar Wüthrich, 2022. "Detecting p‐Hacking," Econometrica, Econometric Society, vol. 90(2), pages 887-906, March.
    8. Marcel Fafchamps & Julien Labonne, 2016. "Using Split Samples to Improve Inference about Causal Effects," NBER Working Papers 21842, National Bureau of Economic Research, Inc.
    9. Fernando Hoces de la Guardia & Sean Grant & Edward Miguel, 2021. "A framework for open policy analysis," Science and Public Policy, Oxford University Press, vol. 48(2), pages 154-163.
    10. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2022. "We Need to Talk about Mechanical Turk: What 22,989 Hypothesis Tests Tell us about p-Hacking and Publication Bias in Online Experiments," GLO Discussion Paper Series 1157, Global Labor Organization (GLO).
    11. Nathan Fiala & Ana Garcia-Hernandez & Kritika Narula & Nishith Prakash, 2022. "Wheels of Change: Transforming Girls’ Lives with Bicycles," Working papers 2022-04, University of Connecticut, Department of Economics.
    12. Graham Elliott & Nikolay Kudrin & Kaspar Wuthrich, 2022. "The Power of Tests for Detecting p-Hacking," Papers 2205.07950, arXiv.org, revised Jun 2023.
    13. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    14. Josephson, Anna & Michler, Jeffrey D., 2018. "Viewpoint: Beasts of the field? Ethics in agricultural and applied economics," Food Policy, Elsevier, vol. 79(C), pages 1-11.
    15. Kaiser, Tim & Lusardi, Annamaria & Menkhoff, Lukas & Urban, Carly, 2022. "Financial education affects financial knowledge and downstream behaviors," Journal of Financial Economics, Elsevier, vol. 145(2), pages 255-272.
    16. Brodeur, Abel & Cook, Nikolai & Hartley, Jonathan & Heyes, Anthony, 2022. "Do Pre-Registration and Pre-analysis Plans Reduce p-Hacking and Publication Bias?," MetaArXiv uxf39, Center for Open Science.
    17. Cristina Blanco-Perez & Abel Brodeur, 2020. "Publication Bias and Editorial Statement on Negative Findings," The Economic Journal, Royal Economic Society, vol. 130(629), pages 1226-1247.
    18. Abel Brodeur & Nikolai Cook & Carina Neisser, 2022. "P-Hacking, Data Type and Data-Sharing Policy," ECONtribute Discussion Papers Series 200, University of Bonn and University of Cologne, Germany.
    19. Annie Duflo & Jessica Kiessel & Adrienne Lucas, 2020. "Experimental Evidence on Alternative Policies to Increase Learning at Scale," NBER Working Papers 27298, National Bureau of Economic Research, Inc.
    20. Sarah A. Janzen & Jeffrey D. Michler, 2021. "Ulysses' pact or Ulysses' raft: Using pre‐analysis plans in experimental and nonexperimental research," Applied Economic Perspectives and Policy, John Wiley & Sons, vol. 43(4), pages 1286-1304, December.

    More about this item

    JEL classification:

    • B41 - Schools of Economic Thought and Methodology - - Economic Methodology - - - Economic Methodology
    • C9 - Mathematical and Quantitative Methods - - Design of Experiments
    • C91 - Mathematical and Quantitative Methods - - Design of Experiments - - - Laboratory, Individual Behavior
    • C92 - Mathematical and Quantitative Methods - - Design of Experiments - - - Laboratory, Group Behavior
    • C93 - Mathematical and Quantitative Methods - - Design of Experiments - - - Field Experiments


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions; when requesting a correction, please mention this item's handle: RePEc:nbr:nberwo:27250. See the general information about how to correct material in RePEc.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact the maintainer; general contact details of the provider: https://edirc.repec.org/data/nberrus.html. Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.