
Evidence on Research Transparency in Economics

Author

  • Miguel, Edward

Abstract

A decade ago, the term “research transparency” was not on economists' radar screen, but in a few short years a scholarly movement has emerged to bring new open science practices, tools and norms into the mainstream of our discipline. The goal of this article is to lay out the evidence on the adoption of these approaches – in three specific areas: open data, pre-registration and pre-analysis plans, and journal policies – and, more tentatively, begin to assess their impacts on the quality and credibility of economics research. The evidence to date indicates that economics (and related quantitative social science fields) is in a period of rapid transition toward new transparency-enhancing norms. While solid data on the benefits of these practices in economics is still limited, in part due to their relatively recent adoption, there is growing reason to believe that critics' worst fears regarding onerous adoption costs have not been realized. Finally, the article presents a set of frontier questions and potential innovations.

Suggested Citation

  • Miguel, Edward, 2021. "Evidence on Research Transparency in Economics," Department of Economics, Working Paper Series qt7fc7s8cd, Department of Economics, Institute for Business and Economic Research, UC Berkeley.
  • Handle: RePEc:cdl:econwp:qt7fc7s8cd

    Download full text from publisher

    File URL: https://www.escholarship.org/uc/item/7fc7s8cd.pdf;origin=repeccitec
    Download Restriction: no

    References listed on IDEAS

    1. Stefano DellaVigna & Devin Pope, 2018. "Predicting Experimental Results: Who Knows What?," Journal of Political Economy, University of Chicago Press, vol. 126(6), pages 2410-2456.
    2. Abel Brodeur & Nikolai Cook & Anthony Heyes, 2020. "Methods Matter: p-Hacking and Publication Bias in Causal Analysis in Economics," American Economic Review, American Economic Association, vol. 110(11), pages 3634-3660, November.
    3. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    4. Tom E. Hardwicke & John P. A. Ioannidis, 2018. "Mapping the universe of registered reports," Nature Human Behaviour, Nature, vol. 2(11), pages 793-796, November.
    5. Cristina Blanco-Perez & Abel Brodeur, 2020. "Publication Bias and Editorial Statement on Negative Findings," The Economic Journal, Royal Economic Society, vol. 130(629), pages 1226-1247.
    6. Burlig, Fiona, 2018. "Improving transparency in observational social science research: A pre-analysis plan approach," Economics Letters, Elsevier, vol. 168(C), pages 56-60.
    7. Nicholas Swanson & Garret Christensen & Rebecca Littman & David Birke & Edward Miguel & Elizabeth Levy Paluck & Zenan Wang, 2020. "Research Transparency Is on the Rise in Economics," AEA Papers and Proceedings, American Economic Association, vol. 110, pages 61-65, May.
    8. Lars Vilhuber, 2021. "AEA Data and Code Availability Policy," AEA Papers and Proceedings, American Economic Association, vol. 111, pages 818-823, May.
    9. Fafchamps, Marcel & Labonne, Julien, 2017. "Using Split Samples to Improve Inference on Causal Effects," Political Analysis, Cambridge University Press, vol. 25(4), pages 465-482, October.
    10. Michael L. Anderson & Jeremy Magruder, 2017. "Split-Sample Strategies for Avoiding False Discoveries," NBER Working Papers 23544, National Bureau of Economic Research, Inc.
    11. Marcel Fafchamps & Julien Labonne, 2016. "Using Split Samples to Improve Inference about Causal Effects," NBER Working Papers 21842, National Bureau of Economic Research, Inc.
    12. Katherine Casey & Rachel Glennerster & Edward Miguel, 2012. "Reshaping Institutions: Evidence on Aid Impacts Using a Preanalysis Plan," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 127(4), pages 1755-1812.
    13. Leamer, Edward E, 1983. "Let's Take the Con Out of Econometrics," American Economic Review, American Economic Association, vol. 73(1), pages 31-43, March.
    14. George K. Ofosu & Daniel N. Posner, 2020. "Do Pre-analysis Plans Hamper Publication?," AEA Papers and Proceedings, American Economic Association, vol. 110, pages 70-74, May.
    15. Paul Gertler & Sebastian Galiani & Mauricio Romero, 2018. "How to make replication the norm," Nature, Nature, vol. 554(7693), pages 417-419, February.
    16. Garret Christensen & Allan Dafoe & Edward Miguel & Don A Moore & Andrew K Rose, 2019. "A study of the impact of data sharing on article citations using journal policies as a natural experiment," PLOS ONE, Public Library of Science, vol. 14(12), pages 1-13, December.
    17. Fernando Hoces de la Guardia & Sean Grant & Edward Miguel, 2021. "A framework for open policy analysis," Science and Public Policy, Oxford University Press, vol. 48(2), pages 154-163.
    18. Jens Ludwig & Sendhil Mullainathan & Jann Spiess, 2019. "Augmenting Pre-Analysis Plans with Machine Learning," AEA Papers and Proceedings, American Economic Association, vol. 109, pages 71-76, May.
    19. Ofosu, George K. & Posner, Daniel N., 2020. "Do pre-analysis plans hamper publication?," LSE Research Online Documents on Economics 112748, London School of Economics and Political Science, LSE Library.
    20. Ryan Hill & Carolyn Stein & Heidi Williams, 2020. "Internalizing Externalities: Designing Effective Data Policies," AEA Papers and Proceedings, American Economic Association, vol. 110, pages 49-54, May.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Timothy B. Armstrong & Patrick Kline & Liyang Sun, 2023. "Adapting to Misspecification," Papers 2305.14265, arXiv.org, revised Aug 2024.
    2. Pippenger, John, 2022. "The Law Of One Price, Borders And Purchasing Power Parity," University of California at Santa Barbara, Economics Working Paper Series qt5b17d1dr, Department of Economics, UC Santa Barbara.
    3. Rose, Julian & Neubauer, Florian & Ankel-Peters, Jörg, 2024. "Long-Term Effects of the Targeting the Ultra-Poor Program - A Reproducibility and Replicability Assessment of Banerjee et al. (2021)," I4R Discussion Paper Series 142, The Institute for Replication (I4R).
    4. Ankel-Peters, Jörg & Fiala, Nathan & Neubauer, Florian, 2023. "Do economists replicate?," Journal of Economic Behavior & Organization, Elsevier, vol. 212(C), pages 219-232.
    5. Brodeur, Abel & Cook, Nikolai M. & Hartley, Jonathan S. & Heyes, Anthony, 2022. "Do Pre-Registration and Pre-analysis Plans Reduce p-Hacking and Publication Bias?," GLO Discussion Paper Series 1147, Global Labor Organization (GLO).
    6. Maximilian Kasy & Jann Spiess, 2022. "Rationalizing Pre-Analysis Plans: Statistical Decisions Subject to Implementability," Economics Series Working Papers 975, University of Oxford, Department of Economics.
    7. Aldy, Joseph E., 2022. "Learning How to Build Back Better through Clean Energy Policy Evaluation," RFF Working Paper Series 22-15, Resources for the Future.
    8. Prashant Garg & Thiemo Fetzer, 2024. "Causal Claims in Economics," CESifo Working Paper Series 11462, CESifo.
    9. Emilio Depetris-Chauvin & Felipe González, 2023. "The Political Consequences of Vaccines: Quasi-experimental Evidence from Eligibility Rules," Documentos de Trabajo 572, Instituto de Economia, Pontificia Universidad Católica de Chile.
    10. Maximilian Kasy & Jann Spiess, 2022. "Optimal Pre-Analysis Plans: Statistical Decisions Subject to Implementability," Papers 2208.09638, arXiv.org, revised Jul 2024.
    11. Felix Chopra & Ingar Haaland & Christopher Roth & Andreas Stegmann, 2024. "The Null Result Penalty," The Economic Journal, Royal Economic Society, vol. 134(657), pages 193-219.
    12. Thibaut Arpinon & Romain Espinosa, 2023. "A practical guide to Registered Reports for economists," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 9(1), pages 90-122, June.
    13. Christophe Pérignon & Olivier Akmansoy & Christophe Hurlin & Anna Dreber & Felix Holzmeister & Juergen Huber & Magnus Johannesson & Michael Kirchler & Albert Menkveld & Michael Razen & Utz Weitzel, 2022. "Reproducibility of Empirical Results: Evidence from 1,000 Tests in Finance," Working Papers hal-03810013, HAL.
    14. Robert Kirkby, 2023. "Quantitative Macroeconomics: Lessons Learned from Fourteen Replications," Computational Economics, Springer;Society for Computational Economics, vol. 61(2), pages 875-896, February.
    15. Emilio Depetris-Chauvin & Felipe González, 2023. "The Political Consequences of Vaccines: Quasi-experimental Evidence from Eligibility Rules," Working Papers 953, Queen Mary University of London, School of Economics and Finance.
    16. Thibaut Arpinon & Marianne Lefebvre, 2024. "Registered Reports and Associated Benefits for Agricultural Economics," Post-Print hal-04635986, HAL.
    17. Thibaut Arpinon & Romain Espinosa, 2023. "A Practical Guide to Registered Reports for Economists," Post-Print halshs-03897719, HAL.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Brodeur, Abel & Cook, Nikolai & Hartley, Jonathan & Heyes, Anthony, 2022. "Do Pre-Registration and Pre-analysis Plans Reduce p-Hacking and Publication Bias?," MetaArXiv uxf39, Center for Open Science.
    2. Abel Brodeur & Scott Carrell & David Figlio & Lester Lusher, 2023. "Unpacking P-hacking and Publication Bias," American Economic Review, American Economic Association, vol. 113(11), pages 2974-3002, November.
    3. Josephson, Anna & Michler, Jeffrey D., 2018. "Viewpoint: Beasts of the field? Ethics in agricultural and applied economics," Food Policy, Elsevier, vol. 79(C), pages 1-11.
    4. Sarah A. Janzen & Jeffrey D. Michler, 2021. "Ulysses' pact or Ulysses' raft: Using pre‐analysis plans in experimental and nonexperimental research," Applied Economic Perspectives and Policy, John Wiley & Sons, vol. 43(4), pages 1286-1304, December.
    5. Abel Brodeur & Nikolai Cook & Carina Neisser, 2024. "p-Hacking, Data type and Data-Sharing Policy," The Economic Journal, Royal Economic Society, vol. 134(659), pages 985-1018.
    6. Dreber, Anna & Johannesson, Magnus, 2023. "A framework for evaluating reproducibility and replicability in economics," Ruhr Economic Papers 1055, RWI - Leibniz-Institut für Wirtschaftsforschung, Ruhr-University Bochum, TU Dortmund University, University of Duisburg-Essen.
    7. Maurizio Canavari & Andreas C. Drichoutis & Jayson L. Lusk & Rodolfo M. Nayga, Jr., 2018. "How to run an experimental auction: A review of recent advances," Working Papers 2018-5, Agricultural University of Athens, Department Of Agricultural Economics.
    8. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2018. "Methods Matter: P-Hacking and Causal Inference in Economics," IZA Discussion Papers 11796, Institute of Labor Economics (IZA).
    9. Abel Brodeur & Nikolai M. Cook & Jonathan S. Hartley & Anthony Heyes, 2024. "Do Preregistration and Preanalysis Plans Reduce p-Hacking and Publication Bias? Evidence from 15,992 Test Statistics and Suggestions for Improvement," Journal of Political Economy Microeconomics, University of Chicago Press, vol. 2(3), pages 527-561.
    10. Ankel-Peters, Jörg & Fiala, Nathan & Neubauer, Florian, 2023. "Do economists replicate?," Journal of Economic Behavior & Organization, Elsevier, vol. 212(C), pages 219-232.
    11. Campbell, Douglas & Brodeur, Abel & Dreber, Anna & Johannesson, Magnus & Kopecky, Joseph & Lusher, Lester & Tsoy, Nikita, 2024. "The Robustness Reproducibility of the American Economic Review," I4R Discussion Paper Series 124, The Institute for Replication (I4R).
    12. Muhammad Haseeb & Kate Vyborny, 2016. "Imposing institutions: Evidence from cash transfer reform in Pakistan," CSAE Working Paper Series 2016-36, Centre for the Study of African Economies, University of Oxford.
    13. Furukawa, Chishio, 2019. "Publication Bias under Aggregation Frictions: Theory, Evidence, and a New Correction Method," EconStor Preprints 194798, ZBW - Leibniz Information Centre for Economics.
    14. Katherine Casey & Rachel Glennerster & Edward Miguel & Maarten Voors, 2023. "Skill Versus Voice in Local Development," The Review of Economics and Statistics, MIT Press, vol. 105(2), pages 311-326, March.
    15. Abel Brodeur & Nikolai M. Cook & Anthony Heyes, 2022. "We Need to Talk about Mechanical Turk: What 22,989 Hypothesis Tests Tell Us about Publication Bias and p-Hacking in Online Experiments," LCERPA Working Papers am0133, Laurier Centre for Economic Research and Policy Analysis.
    16. Adam Gorajek & Joel Bank & Andrew Staib & Benjamin Malin & Hamish Fitchett, 2021. "Star Wars at Central Banks," RBA Research Discussion Papers rdp2021-02, Reserve Bank of Australia.
    17. Burlig, Fiona, 2018. "Improving transparency in observational social science research: A pre-analysis plan approach," Economics Letters, Elsevier, vol. 168(C), pages 56-60.
    18. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2022. "We Need to Talk about Mechanical Turk: What 22,989 Hypothesis Tests Tell us about p-Hacking and Publication Bias in Online Experiments," GLO Discussion Paper Series 1157, Global Labor Organization (GLO).
    19. Brodeur, Abel & Esterling, Kevin & Ankel-Peters, Jörg & Bueno, Natália S & Desposato, Scott & Dreber, Anna & Genovese, Federica & Green, Donald P & Hepplewhite, Matthew & de la Guardia, Fernando Hoces, 2024. "Promoting Reproducibility and Replicability in Political Science," Department of Economics, Working Paper Series qt23n3n3dg, Department of Economics, Institute for Business and Economic Research, UC Berkeley.
    20. Anna Dreber & Magnus Johannesson & Yifan Yang, 2024. "Selective reporting of placebo tests in top economics journals," Economic Inquiry, Western Economic Association International, vol. 62(3), pages 921-932, July.

    More about this item

    Keywords

    Economics; Applied Economics; Clinical Research; Behavioral and Social Science; Econometrics; Economic Theory;

    JEL classification:

    • A11 - General Economics and Teaching - - General Economics - - - Role of Economics; Role of Economists
    • C12 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Hypothesis Testing: General
    • C80 - Mathematical and Quantitative Methods - - Data Collection and Data Estimation Methodology; Computer Programs - - - General
    • I23 - Health, Education, and Welfare - - Education - - - Higher Education; Research Institutions


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:cdl:econwp:qt7fc7s8cd. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Lisa Schiff. General contact details of provider: https://edirc.repec.org/data/ibbrkus.html.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.