
Evidence on Research Transparency in Economics

Author

  • Miguel, Edward

Abstract

A decade ago, the term “research transparency” was not on economists' radar screen, but in a few short years a scholarly movement has emerged to bring new open science practices, tools and norms into the mainstream of our discipline. The goal of this article is to lay out the evidence on the adoption of these approaches – in three specific areas: open data, pre-registration and pre-analysis plans, and journal policies – and, more tentatively, begin to assess their impacts on the quality and credibility of economics research. The evidence to date indicates that economics (and related quantitative social science fields) is in a period of rapid transition toward new transparency-enhancing norms. While solid data on the benefits of these practices in economics is still limited, in part due to their relatively recent adoption, there is growing reason to believe that critics' worst fears regarding onerous adoption costs have not been realized. Finally, the article presents a set of frontier questions and potential innovations.

Suggested Citation

  • Miguel, Edward, 2021. "Evidence on Research Transparency in Economics," Department of Economics, Working Paper Series qt7fc7s8cd, Department of Economics, Institute for Business and Economic Research, UC Berkeley.
  • Handle: RePEc:cdl:econwp:qt7fc7s8cd

    Download full text from publisher

    File URL: https://www.escholarship.org/uc/item/7fc7s8cd.pdf;origin=repeccitec
    Download Restriction: no

    References listed on IDEAS

    1. Stefano DellaVigna & Devin Pope, 2018. "Predicting Experimental Results: Who Knows What?," Journal of Political Economy, University of Chicago Press, vol. 126(6), pages 2410-2456.
    2. Burlig, Fiona, 2018. "Improving transparency in observational social science research: A pre-analysis plan approach," Economics Letters, Elsevier, vol. 168(C), pages 56-60.
    3. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    4. Marcel Fafchamps & Julien Labonne, 2016. "Using Split Samples to Improve Inference about Causal Effects," NBER Working Papers 21842, National Bureau of Economic Research, Inc.
    5. Lars Vilhuber, 2021. "AEA Data and Code Availability Policy," AEA Papers and Proceedings, American Economic Association, vol. 111, pages 818-823, May.
    6. Leamer, Edward E, 1983. "Let's Take the Con Out of Econometrics," American Economic Review, American Economic Association, vol. 73(1), pages 31-43, March.
    7. George K. Ofosu & Daniel N. Posner, 2020. "Do Pre-analysis Plans Hamper Publication?," AEA Papers and Proceedings, American Economic Association, vol. 110, pages 70-74, May.
    8. Abel Brodeur & Nikolai Cook & Anthony Heyes, 2020. "Methods Matter: p-Hacking and Publication Bias in Causal Analysis in Economics," American Economic Review, American Economic Association, vol. 110(11), pages 3634-3660, November.
    9. Paul Gertler & Sebastian Galiani & Mauricio Romero, 2018. "How to make replication the norm," Nature, Nature, vol. 554(7693), pages 417-419, February.
    10. Garret Christensen & Allan Dafoe & Edward Miguel & Don A Moore & Andrew K Rose, 2019. "A study of the impact of data sharing on article citations using journal policies as a natural experiment," PLOS ONE, Public Library of Science, vol. 14(12), pages 1-13, December.
    11. Ofosu, George K. & Posner, Daniel N., 2020. "Do pre-analysis plans hamper publication?," LSE Research Online Documents on Economics 112748, London School of Economics and Political Science, LSE Library.
    12. Nicholas Swanson & Garret Christensen & Rebecca Littman & David Birke & Edward Miguel & Elizabeth Levy Paluck & Zenan Wang, 2020. "Research Transparency Is on the Rise in Economics," AEA Papers and Proceedings, American Economic Association, vol. 110, pages 61-65, May.
    13. Jens Ludwig & Sendhil Mullainathan & Jann Spiess, 2019. "Augmenting Pre-Analysis Plans with Machine Learning," AEA Papers and Proceedings, American Economic Association, vol. 109, pages 71-76, May.
    14. Tom E. Hardwicke & John P. A. Ioannidis, 2018. "Mapping the universe of registered reports," Nature Human Behaviour, Nature, vol. 2(11), pages 793-796, November.
    15. Heather A Piwowar & Roger S Day & Douglas B Fridsma, 2007. "Sharing Detailed Research Data Is Associated with Increased Citation Rate," PLOS ONE, Public Library of Science, vol. 2(3), pages 1-5, March.
    16. Katherine Casey & Rachel Glennerster & Edward Miguel, 2012. "Reshaping Institutions: Evidence on Aid Impacts Using a Preanalysis Plan," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 127(4), pages 1755-1812.
    17. Ryan Hill & Carolyn Stein & Heidi Williams, 2020. "Internalizing Externalities: Designing Effective Data Policies," AEA Papers and Proceedings, American Economic Association, vol. 110, pages 49-54, May.
    18. Fafchamps, Marcel & Labonne, Julien, 2017. "Using Split Samples to Improve Inference on Causal Effects," Political Analysis, Cambridge University Press, vol. 25(4), pages 465-482, October.
    19. Cristina Blanco-Perez & Abel Brodeur, 2020. "Publication Bias and Editorial Statement on Negative Findings," The Economic Journal, Royal Economic Society, vol. 130(629), pages 1226-1247.
    20. Michael L. Anderson & Jeremy Magruder, 2017. "Split-Sample Strategies for Avoiding False Discoveries," NBER Working Papers 23544, National Bureau of Economic Research, Inc.
    21. Fernando Hoces de la Guardia & Sean Grant & Edward Miguel, 2021. "A framework for open policy analysis," Science and Public Policy, Oxford University Press, vol. 48(2), pages 154-163.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Brodeur, Abel & Cook, Nikolai & Hartley, Jonathan & Heyes, Anthony, 2022. "Do Pre-Registration and Pre-analysis Plans Reduce p-Hacking and Publication Bias?," MetaArXiv uxf39, Center for Open Science.
    2. Josephson, Anna & Michler, Jeffrey D., 2018. "Viewpoint: Beasts of the field? Ethics in agricultural and applied economics," Food Policy, Elsevier, vol. 79(C), pages 1-11.
    3. Abel Brodeur & Scott Carrell & David Figlio & Lester Lusher, 2023. "Unpacking P-hacking and Publication Bias," American Economic Review, American Economic Association, vol. 113(11), pages 2974-3002, November.
    4. Sarah A. Janzen & Jeffrey D. Michler, 2021. "Ulysses' pact or Ulysses' raft: Using pre‐analysis plans in experimental and nonexperimental research," Applied Economic Perspectives and Policy, John Wiley & Sons, vol. 43(4), pages 1286-1304, December.
    5. Abel Brodeur & Nikolai M. Cook & Jonathan S. Hartley & Anthony Heyes, 2024. "Do Preregistration and Preanalysis Plans Reduce p-Hacking and Publication Bias? Evidence from 15,992 Test Statistics and Suggestions for Improvement," Journal of Political Economy Microeconomics, University of Chicago Press, vol. 2(3), pages 527-561.
    6. Abel Brodeur & Nikolai Cook & Carina Neisser, 2024. "p-Hacking, Data type and Data-Sharing Policy," The Economic Journal, Royal Economic Society, vol. 134(659), pages 985-1018.
    7. Zohid Askarov & Anthony Doucouliagos & Hristos Doucouliagos & T. D. Stanley, 2023. "The Significance of Data-Sharing Policy," Journal of the European Economic Association, European Economic Association, vol. 21(3), pages 1191-1226.
    8. Maurizio Canavari & Andreas C. Drichoutis & Jayson L. Lusk & Rodolfo M. Nayga, Jr., 2018. "How to run an experimental auction: A review of recent advances," Working Papers 2018-5, Agricultural University of Athens, Department Of Agricultural Economics.
    9. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2018. "Methods Matter: P-Hacking and Causal Inference in Economics," IZA Discussion Papers 11796, Institute of Labor Economics (IZA).
    10. Anna Dreber & Magnus Johannesson & Yifan Yang, 2024. "Selective reporting of placebo tests in top economics journals," Economic Inquiry, Western Economic Association International, vol. 62(3), pages 921-932, July.
    11. Anthony Doucouliagos & Hristos Doucouliagos & T. D. Stanley, 2024. "Power and bias in industrial relations research," British Journal of Industrial Relations, London School of Economics, vol. 62(1), pages 3-27, March.
    12. Ankel-Peters, Jörg & Fiala, Nathan & Neubauer, Florian, 2023. "Do economists replicate?," Journal of Economic Behavior & Organization, Elsevier, vol. 212(C), pages 219-232.
    13. Campbell, Douglas & Brodeur, Abel & Dreber, Anna & Johannesson, Magnus & Kopecky, Joseph & Lusher, Lester & Tsoy, Nikita, 2024. "The Robustness Reproducibility of the American Economic Review," I4R Discussion Paper Series 124, The Institute for Replication (I4R).
    14. Muhammad Haseeb & Kate Vyborny, 2016. "Imposing institutions: Evidence from cash transfer reform in Pakistan," CSAE Working Paper Series 2016-36, Centre for the Study of African Economies, University of Oxford.
    15. Balafoutas, Loukas & Celse, Jeremy & Karakostas, Alexandros & Umashev, Nicholas, 2025. "Incentives and the replication crisis in social sciences: A critical review of open science practices," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 114(C).
    16. Furukawa, Chishio, 2019. "Publication Bias under Aggregation Frictions: Theory, Evidence, and a New Correction Method," EconStor Preprints 194798, ZBW - Leibniz Information Centre for Economics.
    17. Anna Dreber & Magnus Johannesson, 2025. "A framework for evaluating reproducibility and replicability in economics," Economic Inquiry, Western Economic Association International, vol. 63(2), pages 338-356, April.
    18. Ferman, Bruno & Finamor, Lucas, 2025. "There must be an error here! Experimental evidence on coding errors' biases," I4R Discussion Paper Series 266, The Institute for Replication (I4R).
    19. Costanza Naguib, 2025. "Does single-blind review encourage or discourage p-hacking?," Diskussionsschriften dp2504, Universitaet Bern, Departement Volkswirtschaft.
    20. Castaing, Pauline & Gazeaud, Jules, 2025. "Do index insurance programs live up to their promises? Aggregating evidence from multiple experiments," Journal of Development Economics, Elsevier, vol. 175(C).

    More about this item


    JEL classification:

    • A11 - General Economics and Teaching - - General Economics - - - Role of Economics; Role of Economists
    • C12 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Hypothesis Testing: General
    • C80 - Mathematical and Quantitative Methods - - Data Collection and Data Estimation Methodology; Computer Programs - - - General
    • I23 - Health, Education, and Welfare - - Education - - - Higher Education; Research Institutions


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:cdl:econwp:qt7fc7s8cd. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Lisa Schiff (email available below). General contact details of provider: https://edirc.repec.org/data/ibbrkus.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.