Printed from https://ideas.repec.org/p/cdl/econwp/qt7fc7s8cd.html

Evidence on Research Transparency in Economics

Author

  • Miguel, Edward

Abstract

A decade ago, the term “research transparency” was not on economists' radar screen, but in a few short years a scholarly movement has emerged to bring new open science practices, tools and norms into the mainstream of our discipline. The goal of this article is to lay out the evidence on the adoption of these approaches – in three specific areas: open data, pre-registration and pre-analysis plans, and journal policies – and, more tentatively, begin to assess their impacts on the quality and credibility of economics research. The evidence to date indicates that economics (and related quantitative social science fields) is in a period of rapid transition toward new transparency-enhancing norms. While solid data on the benefits of these practices in economics is still limited, in part due to their relatively recent adoption, there is growing reason to believe that critics' worst fears regarding onerous adoption costs have not been realized. Finally, the article presents a set of frontier questions and potential innovations.

Suggested Citation

  • Miguel, Edward, 2021. "Evidence on Research Transparency in Economics," Department of Economics, Working Paper Series qt7fc7s8cd, Department of Economics, Institute for Business and Economic Research, UC Berkeley.
  • Handle: RePEc:cdl:econwp:qt7fc7s8cd

    Download full text from publisher

    File URL: https://www.escholarship.org/uc/item/7fc7s8cd.pdf;origin=repeccitec
    Download Restriction: no
    ---><---

    References listed on IDEAS

    1. Stefano DellaVigna & Devin Pope, 2018. "Predicting Experimental Results: Who Knows What?," Journal of Political Economy, University of Chicago Press, vol. 126(6), pages 2410-2456.
    2. Abel Brodeur & Nikolai Cook & Anthony Heyes, 2020. "Methods Matter: p-Hacking and Publication Bias in Causal Analysis in Economics," American Economic Review, American Economic Association, vol. 110(11), pages 3634-3660, November.
    3. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    4. Tom E. Hardwicke & John P. A. Ioannidis, 2018. "Mapping the universe of registered reports," Nature Human Behaviour, Nature, vol. 2(11), pages 793-796, November.
    5. Cristina Blanco-Perez & Abel Brodeur, 2020. "Publication Bias and Editorial Statement on Negative Findings," The Economic Journal, Royal Economic Society, vol. 130(629), pages 1226-1247.
    6. Burlig, Fiona, 2018. "Improving transparency in observational social science research: A pre-analysis plan approach," Economics Letters, Elsevier, vol. 168(C), pages 56-60.
    7. Nicholas Swanson & Garret Christensen & Rebecca Littman & David Birke & Edward Miguel & Elizabeth Levy Paluck & Zenan Wang, 2020. "Research Transparency Is on the Rise in Economics," AEA Papers and Proceedings, American Economic Association, vol. 110, pages 61-65, May.
    8. Lars Vilhuber, 2021. "AEA Data and Code Availability Policy," AEA Papers and Proceedings, American Economic Association, vol. 111, pages 818-823, May.
    9. Fafchamps, Marcel & Labonne, Julien, 2017. "Using Split Samples to Improve Inference on Causal Effects," Political Analysis, Cambridge University Press, vol. 25(4), pages 465-482, October.
    10. Michael L. Anderson & Jeremy Magruder, 2017. "Split-Sample Strategies for Avoiding False Discoveries," NBER Working Papers 23544, National Bureau of Economic Research, Inc.
    11. Marcel Fafchamps & Julien Labonne, 2016. "Using Split Samples to Improve Inference about Causal Effects," NBER Working Papers 21842, National Bureau of Economic Research, Inc.
    12. Katherine Casey & Rachel Glennerster & Edward Miguel, 2012. "Reshaping Institutions: Evidence on Aid Impacts Using a Preanalysis Plan," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 127(4), pages 1755-1812.
    13. Leamer, Edward E, 1983. "Let's Take the Con Out of Econometrics," American Economic Review, American Economic Association, vol. 73(1), pages 31-43, March.
    14. George K. Ofosu & Daniel N. Posner, 2020. "Do Pre-analysis Plans Hamper Publication?," AEA Papers and Proceedings, American Economic Association, vol. 110, pages 70-74, May.
    15. Paul Gertler & Sebastian Galiani & Mauricio Romero, 2018. "How to make replication the norm," Nature, Nature, vol. 554(7693), pages 417-419, February.
    16. Garret Christensen & Allan Dafoe & Edward Miguel & Don A Moore & Andrew K Rose, 2019. "A study of the impact of data sharing on article citations using journal policies as a natural experiment," PLOS ONE, Public Library of Science, vol. 14(12), pages 1-13, December.
    17. Fernando Hoces de la Guardia & Sean Grant & Edward Miguel, 2021. "A framework for open policy analysis," Science and Public Policy, Oxford University Press, vol. 48(2), pages 154-163.
    18. Jens Ludwig & Sendhil Mullainathan & Jann Spiess, 2019. "Augmenting Pre-Analysis Plans with Machine Learning," AEA Papers and Proceedings, American Economic Association, vol. 109, pages 71-76, May.
    19. Ofosu, George K. & Posner, Daniel N., 2020. "Do pre-analysis plans hamper publication?," LSE Research Online Documents on Economics 112748, London School of Economics and Political Science, LSE Library.
    20. Ryan Hill & Carolyn Stein & Heidi Williams, 2020. "Internalizing Externalities: Designing Effective Data Policies," AEA Papers and Proceedings, American Economic Association, vol. 110, pages 49-54, May.

Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.

    Cited by:

    1. Emilio Depetris-Chauvin & Felipe González, 2023. "The Political Consequences of Vaccines: Quasi-experimental Evidence from Eligibility Rules," Documentos de Trabajo 572, Instituto de Economia, Pontificia Universidad Católica de Chile.
    2. Brodeur, Abel & Cook, Nikolai & Hartley, Jonathan & Heyes, Anthony, 2022. "Do Pre-Registration and Pre-analysis Plans Reduce p-Hacking and Publication Bias?," MetaArXiv uxf39, Center for Open Science.
    3. Timothy B. Armstrong & Patrick Kline & Liyang Sun, 2023. "Adapting to Misspecification," Papers 2305.14265, arXiv.org, revised Jul 2023.
    4. Felix Chopra & Ingar Haaland & Christopher Roth & Andreas Stegmann, 2023. "The Null Result Penalty," The Economic Journal, Royal Economic Society, vol. 134(657), pages 193-219.
    5. Pippenger, John, 2022. "The Law Of One Price, Borders And Purchasing Power Parity," University of California at Santa Barbara, Economics Working Paper Series qt5b17d1dr, Department of Economics, UC Santa Barbara.
    6. Thibaut Arpinon & Romain Espinosa, 2023. "A practical guide to Registered Reports for economists," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 9(1), pages 90-122, June.
    7. , 2023. "The Political Consequences of Vaccines: Quasi-experimental Evidence from Eligibility Rules," Working Papers 953, Queen Mary University of London, School of Economics and Finance.
    8. Ankel-Peters, Jörg & Fiala, Nathan & Neubauer, Florian, 2023. "Do economists replicate?," Journal of Economic Behavior & Organization, Elsevier, vol. 212(C), pages 219-232.
    9. Christophe Pérignon & Olivier Akmansoy & Christophe Hurlin & Anna Dreber & Felix Holzmeister & Juergen Huber & Magnus Johannesson & Michael Kirchler & Albert Menkveld & Michael Razen & Utz Weitzel, 2022. "Reproducibility of Empirical Results: Evidence from 1,000 Tests in Finance," Working Papers hal-03810013, HAL.
    10. Maximilian Kasy & Jann Spiess, 2022. "Optimal Pre-Analysis Plans: Statistical Decisions Subject to Implementability," Papers 2208.09638, arXiv.org, revised Oct 2023.
    11. Maximilian Kasy & Jann Spiess, 2022. "Rationalizing Pre-Analysis Plans: Statistical Decisions Subject to Implementability," Economics Series Working Papers 975, University of Oxford, Department of Economics.
    12. Robert Kirkby, 2023. "Quantitative Macroeconomics: Lessons Learned from Fourteen Replications," Computational Economics, Springer;Society for Computational Economics, vol. 61(2), pages 875-896, February.
    13. Aldy, Joseph E., 2022. "Learning How to Build Back Better through Clean Energy Policy Evaluation," RFF Working Paper Series 22-15, Resources for the Future.
    14. Thibaut Arpinon & Romain Espinosa, 2023. "A Practical Guide to Registered Reports for Economists," Post-Print halshs-03897719, HAL.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Brodeur, Abel & Cook, Nikolai & Hartley, Jonathan & Heyes, Anthony, 2022. "Do Pre-Registration and Pre-analysis Plans Reduce p-Hacking and Publication Bias?," MetaArXiv uxf39, Center for Open Science.
    2. Josephson, Anna & Michler, Jeffrey D., 2018. "Viewpoint: Beasts of the field? Ethics in agricultural and applied economics," Food Policy, Elsevier, vol. 79(C), pages 1-11.
    3. Sarah A. Janzen & Jeffrey D. Michler, 2021. "Ulysses' pact or Ulysses' raft: Using pre‐analysis plans in experimental and nonexperimental research," Applied Economic Perspectives and Policy, John Wiley & Sons, vol. 43(4), pages 1286-1304, December.
    4. Brodeur, Abel & Cook, Nikolai & Neisser, Carina, 2022. "P-Hacking, Data Type and Data-Sharing Policy," IZA Discussion Papers 15586, Institute of Labor Economics (IZA).
    5. Dreber, Anna & Johannesson, Magnus, 2023. "A framework for evaluating reproducibility and replicability in economics," Ruhr Economic Papers 1055, RWI - Leibniz-Institut für Wirtschaftsforschung, Ruhr-University Bochum, TU Dortmund University, University of Duisburg-Essen.
    6. Maurizio Canavari & Andreas C. Drichoutis & Jayson L. Lusk & Rodolfo M. Nayga, Jr., 2018. "How to run an experimental auction: A review of recent advances," Working Papers 2018-5, Agricultural University of Athens, Department Of Agricultural Economics.
    7. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2018. "Methods Matter: P-Hacking and Causal Inference in Economics," IZA Discussion Papers 11796, Institute of Labor Economics (IZA).
    8. Brodeur, Abel & Cook, Nikolai M. & Hartley, Jonathan S. & Heyes, Anthony, 2023. "Do Pre-Registration and Pre-Analysis Plans Reduce p-Hacking and Publication Bias?: Evidence from 15,992 Test Statistics and Suggestions for Improvement," GLO Discussion Paper Series 1147 [pre.], Global Labor Organization (GLO).
    9. Ankel-Peters, Jörg & Fiala, Nathan & Neubauer, Florian, 2023. "Do economists replicate?," Journal of Economic Behavior & Organization, Elsevier, vol. 212(C), pages 219-232.
    10. Muhammad Haseeb & Kate Vyborny, 2016. "Imposing institutions: Evidence from cash transfer reform in Pakistan," CSAE Working Paper Series 2016-36, Centre for the Study of African Economies, University of Oxford.
    11. Furukawa, Chishio, 2019. "Publication Bias under Aggregation Frictions: Theory, Evidence, and a New Correction Method," EconStor Preprints 194798, ZBW - Leibniz Information Centre for Economics.
    12. Katherine Casey & Rachel Glennerster & Edward Miguel & Maarten Voors, 2023. "Skill Versus Voice in Local Development," The Review of Economics and Statistics, MIT Press, vol. 105(2), pages 311-326, March.
    13. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2022. "We Need to Talk about Mechanical Turk: What 22,989 Hypothesis Tests Tell Us about Publication Bias and p-Hacking in Online Experiments," IZA Discussion Papers 15478, Institute of Labor Economics (IZA).
    14. Burlig, Fiona, 2018. "Improving transparency in observational social science research: A pre-analysis plan approach," Economics Letters, Elsevier, vol. 168(C), pages 56-60.
    15. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2022. "We Need to Talk about Mechanical Turk: What 22,989 Hypothesis Tests Tell us about p-Hacking and Publication Bias in Online Experiments," GLO Discussion Paper Series 1157, Global Labor Organization (GLO).
    16. Abel Brodeur & Scott Carrell & David Figlio & Lester Lusher, 2023. "Unpacking P-hacking and Publication Bias," American Economic Review, American Economic Association, vol. 113(11), pages 2974-3002, November.
    17. Brodeur, Abel & Esterling, Kevin & Ankel-Peters, Jörg & Bueno, Natália S. & Desposato, Scott & Dreber, Anna & Genovese, Federica & Green, Donald P. & Hepplewhite, Matthew & Hoces de la Guardia, Fernan, 2024. "Promoting Reproducibility and Replicability in Political Science," I4R Discussion Paper Series 100, The Institute for Replication (I4R).
    18. Felix Chopra & Ingar Haaland & Christopher Roth & Andreas Stegmann, 2023. "The Null Result Penalty," The Economic Journal, Royal Economic Society, vol. 134(657), pages 193-219.
    19. Graham Elliott & Nikolay Kudrin & Kaspar Wuthrich, 2022. "The Power of Tests for Detecting p-Hacking," Papers 2205.07950, arXiv.org, revised Apr 2024.
    20. Benjamin A. Olken, 2015. "Promises and Perils of Pre-analysis Plans," Journal of Economic Perspectives, American Economic Association, vol. 29(3), pages 61-80, Summer.

    More about this item

    Keywords

    Economics; Applied Economics; Clinical Research; Behavioral and Social Science; Applied economics; Econometrics; Economic theory

    JEL classification:

    • A11 - General Economics and Teaching - - General Economics - - - Role of Economics; Role of Economists
    • C12 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Hypothesis Testing: General
    • C80 - Mathematical and Quantitative Methods - - Data Collection and Data Estimation Methodology; Computer Programs - - - General
    • I23 - Health, Education, and Welfare - - Education - - - Higher Education; Research Institutions



    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.