
Research Registries: Facts, Myths, and Possible Improvements

Author

Listed:
  • Eliot Abrams
  • Jonathan Libgober
  • John List

Abstract

The past few decades have ushered in an experimental revolution in economics whereby scholars are now much more likely to generate their own data. While there are virtues associated with this movement, there are concomitant difficulties. Several scientific disciplines, including economics, have launched research registries in an effort to attenuate key inferential issues. This study assesses registries both empirically and theoretically, with a special focus on the AEA registry. We find that over 90% of randomized controlled trials (RCTs) in economics do not register, only 50% of the RCTs that register do so before the intervention begins, and the majority of these preregistrations are not detailed enough to significantly aid inference. Our empirical analysis further shows that using other scientific registries as aspirational examples is misguided, as their perceived success in tackling the main issues is largely a myth. In light of these facts, we advance a simple economic model to explore potential improvements. A key insight from the model is that removal of the (current) option to register completed RCTs could increase the fraction of trials that register. We also argue that linking IRB applications to registrations could further increase registry effectiveness.

Suggested Citation

  • Eliot Abrams & Jonathan Libgober & John List, 2020. "Research Registries: Facts, Myths, and Possible Improvements," Artefactual Field Experiments 00703, The Field Experiments Website.
  • Handle: RePEc:feb:artefa:00703

    Download full text from publisher

    File URL: http://s3.amazonaws.com/fieldexperiments-papers2/papers/00703.pdf
    Download Restriction: no

    References listed on IDEAS

    1. Robinson, Joan, 1977. "What Are the Questions?," Journal of Economic Literature, American Economic Association, vol. 15(4), pages 1318-1339, December.
    2. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    3. Abhijit Banerjee & Esther Duflo & Amy Finkelstein & Lawrence F. Katz & Benjamin A. Olken & Anja Sautmann, 2020. "In Praise of Moderation: Suggestions for the Scope and Use of Pre-Analysis Plans for RCTs in Economics," NBER Working Papers 26993, National Bureau of Economic Research, Inc.
    4. Omar Al-Ubaydli & John List & Dana Suskind, 2019. "The science of using science: Towards an understanding of the threats to scaling experiments," Artefactual Field Experiments 00670, The Field Experiments Website.
    5. Rachael Meager, 2019. "Understanding the Average Impact of Microcredit Expansions: A Bayesian Hierarchical Analysis of Seven Randomized Experiments," American Economic Journal: Applied Economics, American Economic Association, vol. 11(1), pages 57-91, January.
    6. Eva Vivalt, 2019. "Specification Searching and Significance Inflation Across Time, Methods and Disciplines," Oxford Bulletin of Economics and Statistics, Department of Economics, University of Oxford, vol. 81(4), pages 797-816, August.
    7. Richard A. Bettis, 2012. "The search for asterisks: Compromised statistical tests and flawed theories," Strategic Management Journal, Wiley Blackwell, vol. 33(1), pages 108-113, January.
    8. Shuo Liu & Harry Pei, 2017. "Monotone equilibria in signalling games," ECON - Working Papers 252, Department of Economics - University of Zurich.
    9. Eva Vivalt, 2020. "How Much Can We Generalize From Impact Evaluations?," Journal of the European Economic Association, European Economic Association, vol. 18(6), pages 3045-3089.
    10. Michael D. Jennions & Anders Pape Møller, 2003. "A survey of the statistical power of research in behavioral ecology and animal behavior," Behavioral Ecology, International Society for Behavioral Ecology, vol. 14(3), pages 438-445, May.

    Citations

    Citations are extracted by the CitEc Project.

    Cited by:

    1. John A. List & Ragan Petrie & Anya Samek, 2023. "How Experiments with Children Inform Economics," Journal of Economic Literature, American Economic Association, vol. 61(2), pages 504-564, June.
    2. Brodeur, Abel & Cook, Nikolai M. & Hartley, Jonathan S. & Heyes, Anthony, 2023. "Do Pre-Registration and Pre-Analysis Plans Reduce p-Hacking and Publication Bias?: Evidence from 15,992 Test Statistics and Suggestions for Improvement," GLO Discussion Paper Series 1147 [pre.], Global Labor Organization (GLO).
    3. Alec Brandon & Justin E. Holz & Andrew Simon & Haruka Uchida, 2023. "Minimum Wages and Racial Discrimination in Hiring: Evidence from a Field Experiment," Upjohn Working Papers 23-389, W.E. Upjohn Institute for Employment Research.
    4. Thibaut Arpinon & Romain Espinosa, 2023. "A practical guide to Registered Reports for economists," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 9(1), pages 90-122, June.
    5. Thibaut Arpinon & Romain Espinosa, 2023. "A Practical Guide to Registered Reports for Economists," Post-Print halshs-03897719, HAL.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    2. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
    3. Stefano DellaVigna & Elizabeth Linos, 2022. "RCTs to Scale: Comprehensive Evidence From Two Nudge Units," Econometrica, Econometric Society, vol. 90(1), pages 81-116, January.
    4. Brodeur, Abel & Cook, Nikolai M. & Hartley, Jonathan S. & Heyes, Anthony, 2023. "Do Pre-Registration and Pre-Analysis Plans Reduce p-Hacking and Publication Bias?: Evidence from 15,992 Test Statistics and Suggestions for Improvement," GLO Discussion Paper Series 1147 [pre.], Global Labor Organization (GLO).
    5. Bruns, Stephan B. & Asanov, Igor & Bode, Rasmus & Dunger, Melanie & Funk, Christoph & Hassan, Sherif M. & Hauschildt, Julia & Heinisch, Dominik & Kempa, Karol & König, Johannes & Lips, Johannes & Verb, 2019. "Reporting errors and biases in published empirical findings: Evidence from innovation research," Research Policy, Elsevier, vol. 48(9), pages 1-1.
    6. Abel Brodeur & Nikolai M. Cook & Anthony Heyes, 2022. "We Need to Talk about Mechanical Turk: What 22,989 Hypothesis Tests Tell Us about Publication Bias and p-Hacking in Online Experiments," LCERPA Working Papers am0133, Laurier Centre for Economic Research and Policy Analysis.
    7. Hensel, Przemysław G., 2019. "Supporting replication research in management journals: Qualitative analysis of editorials published between 1970 and 2015," European Management Journal, Elsevier, vol. 37(1), pages 45-57.
    8. Bruns, Stephan B. & Ioannidis, John P.A., 2020. "Determinants of economic growth: Different time different answer?," Journal of Macroeconomics, Elsevier, vol. 63(C).
    9. Graham Elliott & Nikolay Kudrin & Kaspar Wüthrich, 2022. "Detecting p‐Hacking," Econometrica, Econometric Society, vol. 90(2), pages 887-906, March.
    10. Marcel Fafchamps & Julien Labonne, 2016. "Using Split Samples to Improve Inference about Causal Effects," NBER Working Papers 21842, National Bureau of Economic Research, Inc.
    11. Fernando Hoces de la Guardia & Sean Grant & Edward Miguel, 2021. "A framework for open policy analysis," Science and Public Policy, Oxford University Press, vol. 48(2), pages 154-163.
    12. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2022. "We Need to Talk about Mechanical Turk: What 22,989 Hypothesis Tests Tell us about p-Hacking and Publication Bias in Online Experiments," GLO Discussion Paper Series 1157, Global Labor Organization (GLO).
    13. Andor, Mark A. & Gerster, Andreas & Peters, Jörg, 2022. "Information campaigns for residential energy conservation," European Economic Review, Elsevier, vol. 144(C).
    14. Mueller-Langer, Frank & Fecher, Benedikt & Harhoff, Dietmar & Wagner, Gert G., 2019. "Replication studies in economics—How many and which papers are chosen for replication, and why?," Research Policy, Elsevier, vol. 48(1), pages 62-83.
    15. Nathan Fiala & Ana Garcia-Hernandez & Kritika Narula & Nishith Prakash, 2022. "Wheels of Change: Transforming Girls’ Lives with Bicycles," Working papers 2022-04, University of Connecticut, Department of Economics.
    16. Graham Elliott & Nikolay Kudrin & Kaspar Wüthrich, 2022. "The Power of Tests for Detecting p-Hacking," Papers 2205.07950, arXiv.org, revised Jun 2023.
    17. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    18. Heckelei, Thomas & Huettel, Silke & Odening, Martin & Rommel, Jens, 2021. "The replicability crisis and the p-value debate – what are the consequences for the agricultural and food economics community?," Discussion Papers 316369, University of Bonn, Institute for Food and Resource Economics.
    19. Andrew C. Chang & Trace J. Levinson, 2020. "Raiders of the Lost High-Frequency Forecasts: New Data and Evidence on the Efficiency of the Fed's Forecasting," Finance and Economics Discussion Series 2020-090, Board of Governors of the Federal Reserve System (U.S.).
    20. Stephan B. Bruns, 2016. "The Fragility of Meta-Regression Models in Observational Research," MAGKS Papers on Economics 201603, Philipps-Universität Marburg, Faculty of Business Administration and Economics, Department of Economics (Volkswirtschaftliche Abteilung).

    More about this item

    JEL classification:

    • B41 - Schools of Economic Thought and Methodology - - Economic Methodology - - - Economic Methodology
    • C9 - Mathematical and Quantitative Methods - - Design of Experiments
    • C91 - Mathematical and Quantitative Methods - - Design of Experiments - - - Laboratory, Individual Behavior
    • C92 - Mathematical and Quantitative Methods - - Design of Experiments - - - Laboratory, Group Behavior
    • C93 - Mathematical and Quantitative Methods - - Design of Experiments - - - Field Experiments

