Printed from https://ideas.repec.org/p/osf/metaar/ba7tr.html

How Transparency and Reproducibility Can Increase Credibility in Policy Analysis: A Case Study of the Minimum Wage Policy Estimate

Author

  • Hoces de la Guardia, Fernando

Abstract

The analysis of public policies, even when performed by the best non-partisan agencies, often lacks credibility (Manski, 2013). This allows policy makers to cherry-pick between reports, or within a specific report, to select estimates that better match their beliefs. For example, in 2014 the Congressional Budget Office (CBO) produced a report on the effects of raising the minimum wage that was cited both by opponents and supporters of the policy, with each side accepting as credible only partial elements of the report. Lack of transparency and reproducibility (TR) in a policy report implies that its credibility rests on the reputation of the authors, and their organizations, rather than on a critical appraisal of the analysis. This dissertation translates to policy analysis solutions developed to address a lack of credibility in a different setting: the reproducibility crisis in science. I adapt the Transparency and Openness Promotion (TOP) guidelines (Nosek et al., 2015) to the policy analysis setting. The highest standards from the adapted guidelines involve the use of two key tools: dynamic documents that combine all elements of an analysis in one place, and open-source version control (git). I then implement these high standards in a case study of the CBO report mentioned above, and present the complete analysis in the form of an open-source dynamic document. In addition to increasing the credibility of the case study analysis, this methodology brings attention to several components of the policy analysis that have traditionally been overlooked in academic research, for example the distribution of the losses used to pay for the increase in wages. Increasing our knowledge in these overlooked areas may prove most valuable to an evidence-based policy debate.
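The two tools the abstract names, a dynamic document and git version control, can be combined in a workflow like the following minimal sketch. This is an illustration only, not the dissertation's actual repository: the directory name, the file `analysis.Rmd`, and its contents are hypothetical stand-ins for a policy analysis kept in a single dynamic document and tracked with git so that every change is recorded and reviewable.

```shell
set -e

# Hypothetical project directory for a transparent, reproducible analysis.
mkdir -p minwage-analysis && cd minwage-analysis
git init -q

# A dynamic document keeps narrative, assumptions, and computations in one
# file; here a placeholder R Markdown skeleton stands in for the analysis.
cat > analysis.Rmd <<'EOF'
---
title: "Minimum Wage Policy Estimate (illustrative skeleton)"
---
All assumptions, data steps, and results live in this one document.
EOF

# Version control makes every revision of the analysis auditable.
git add analysis.Rmd
git -c user.name=demo -c user.email=demo@example.com \
    commit -q -m "Add dynamic document skeleton"
git log --oneline
```

Anyone inspecting the repository can then replay the history with `git log` and re-render the document, which is what lets credibility rest on the analysis itself rather than on the authors' reputation.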

Suggested Citation

  • Hoces de la Guardia, Fernando, 2017. "How Transparency and Reproducibility Can Increase Credibility in Policy Analysis: A Case Study of the Minimum Wage Policy Estimate," MetaArXiv ba7tr, Center for Open Science.
  • Handle: RePEc:osf:metaar:ba7tr
    DOI: 10.31219/osf.io/ba7tr

    Download full text from publisher

    File URL: https://osf.io/download/59ee1f3b6c613b02622f983b/
    Download Restriction: no

    File URL: https://libkey.io/10.31219/osf.io/ba7tr?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item
    ---><---

    References listed on IDEAS

    1. John P A Ioannidis, 2005. "Why Most Published Research Findings Are False," PLOS Medicine, Public Library of Science, vol. 2(8), pages 1-1, August.
    2. Jesse Rothstein, 2015. "Teacher Quality Policy When Supply Matters," American Economic Review, American Economic Association, vol. 105(1), pages 100-130, January.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Fernando Hoces de la Guardia & Sean Grant & Edward Miguel, 2021. "A framework for open policy analysis," Science and Public Policy, Oxford University Press, vol. 48(2), pages 154-163.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Pierre J C Chuard & Milan Vrtílek & Megan L Head & Michael D Jennions, 2019. "Evidence that nonsignificant results are sometimes preferred: Reverse P-hacking or selective reporting?," PLOS Biology, Public Library of Science, vol. 17(1), pages 1-7, January.
    2. Murphy, Richard & Weinhardt, Felix & Wyness, Gill, 2021. "Who teaches the teachers? A RCT of peer-to-peer observation and feedback in 181 schools," Economics of Education Review, Elsevier, vol. 82(C).
    3. Gillian L Currie & Helena N Angel-Scott & Lesley Colvin & Fala Cramond & Kaitlyn Hair & Laila Khandoker & Jing Liao & Malcolm Macleod & Sarah K McCann & Rosie Morland & Nicki Sherratt & Robert Stewart, 2019. "Animal models of chemotherapy-induced peripheral neuropathy: A machine-assisted systematic review and meta-analysis," PLOS Biology, Public Library of Science, vol. 17(5), pages 1-34, May.
    4. Mueller-Langer, Frank & Fecher, Benedikt & Harhoff, Dietmar & Wagner, Gert G., 2019. "Replication studies in economics—How many and which papers are chosen for replication, and why?," EconStor Open Access Articles and Book Chapters, ZBW - Leibniz Information Centre for Economics, vol. 48(1), pages 62-83.
    5. Laverde, Mariana & Mykerezi, Elton & Sojourner, Aaron & Sood, Aradhya, 2026. "Match Effects and the Gains from Alternative Job Assignments: Evidence from a Teacher Labor Market," IZA Discussion Papers 18397, IZA Network @ LISER.
    6. Stefan Stieglitz & Christian Meske & Björn Ross & Milad Mirbabaie, 2020. "Going Back in Time to Predict the Future - The Complex Role of the Data Collection Period in Social Media Analytics," Information Systems Frontiers, Springer, vol. 22(2), pages 395-409, April.
    7. Bernhard Voelkl & Lucile Vogt & Emily S Sena & Hanno Würbel, 2018. "Reproducibility of preclinical animal research improves with heterogeneity of study samples," PLOS Biology, Public Library of Science, vol. 16(2), pages 1-13, February.
    8. Keith R Lohse & Kristin L Sainani & J Andrew Taylor & Michael L Butson & Emma J Knight & Andrew J Vickers, 2020. "Systematic review of the use of “magnitude-based inference” in sports science and medicine," PLOS ONE, Public Library of Science, vol. 15(6), pages 1-22, June.
    9. Charles F. Manski, 2017. "Improving Clinical Guidelines and Decisions under Uncertainty," NBER Working Papers 23915, National Bureau of Economic Research, Inc.
    10. Erik Snowberg & Leeat Yariv, 2018. "Testing the Waters: Behavior across Participant Pools," CESifo Working Paper Series 7136, CESifo.
    11. Brewer, Mike & Crossley, Thomas F. & Zilio, Federico, 2019. "What Do We Really Know about the Employment Effects of the UK's National Minimum Wage?," IZA Discussion Papers 12369, IZA Network @ LISER.
    12. Daniele Fanelli & Wolfgang Glänzel, 2013. "Bibliometric Evidence for a Hierarchy of the Sciences," PLOS ONE, Public Library of Science, vol. 8(6), pages 1-11, June.
    13. Thierry Poynard & Dominique Thabut & Mona Munteanu & Vlad Ratziu & Yves Benhamou & Olivier Deckmyn, 2010. "Hirsch Index and Truth Survival in Clinical Research," PLOS ONE, Public Library of Science, vol. 5(8), pages 1-10, August.
    14. Alexander Frankel & Maximilian Kasy, 2022. "Which Findings Should Be Published?," American Economic Journal: Microeconomics, American Economic Association, vol. 14(1), pages 1-38, February.
    15. Sean Corcoran & Dan Goldhaber, 2013. "Value Added and Its Uses: Where You Stand Depends on Where You Sit," Education Finance and Policy, MIT Press, vol. 8(3), pages 418-434, July.
    16. Jyotirmoy Sarkar, 2018. "Will P-Value Triumph over Abuses and Attacks?," Biostatistics and Biometrics Open Access Journal, Juniper Publishers Inc., vol. 7(4), pages 66-71, July.
    17. Valérie Orozco & Christophe Bontemps & Élise Maigné & Virginie Piguet & Annie Hofstetter & Anne Marie Lacroix & Fabrice Levert & Jean-Marc Rousselle, 2017. "How to make a pie? Reproducible Research for Empirical Economics & Econometrics," Post-Print hal-01939942, HAL.
    18. Hinrichs, Peter, 2021. "What kind of teachers are schools looking for? Evidence from a randomized field experiment," Journal of Economic Behavior & Organization, Elsevier, vol. 186(C), pages 395-411.
    19. Stephen Fox, 2016. "Dismantling The Box — Applying Principles For Reducing Preconceptions During Ideation," International Journal of Innovation Management (ijim), World Scientific Publishing Co. Pte. Ltd., vol. 20(06), pages 1-27, August.
    20. Andrew D Higginson & Marcus R Munafò, 2016. "Current Incentives for Scientists Lead to Underpowered Studies with Erroneous Conclusions," PLOS Biology, Public Library of Science, vol. 14(11), pages 1-14, November.

    More about this item

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:osf:metaar:ba7tr. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: OSF (email available below). General contact details of provider: https://osf.io/preprints/metaarxiv .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.