
How Transparency and Reproducibility Can Increase Credibility in Policy Analysis: A Case Study of the Minimum Wage Policy Estimate

Author

  • Hoces de la Guardia, Fernando

Abstract

The analysis of public policies, even when performed by the best non-partisan agencies, often lacks credibility (Manski, 2013). This allows policymakers to cherry-pick across reports, or within a specific report, the estimates that best match their prior beliefs. For example, in 2014 the Congressional Budget Office (CBO) produced a report on the effects of raising the minimum wage that was cited by both opponents and supporters of the policy, with each side accepting as credible only partial elements of the report. A lack of transparency and reproducibility (TR) in a policy report means that its credibility rests on the reputation of its authors and their organizations rather than on a critical appraisal of the analysis itself. This dissertation translates to policy analysis solutions developed to address a lack of credibility in a different setting: the reproducibility crisis in science. I adapt the Transparency and Openness Promotion (TOP) guidelines (Nosek et al., 2015) to the policy analysis setting. The highest standards in the adapted guidelines involve two key tools: dynamic documents, which combine all elements of an analysis in one place, and open-source version control (git). I then implement these high standards in a case study of the CBO report mentioned above and present the complete analysis as an open-source dynamic document. In addition to increasing the credibility of the case study analysis, this methodology draws attention to several components of policy analysis that have traditionally been overlooked in academic research, for example the distribution of the losses used to pay for the increase in wages. Increasing our knowledge in these overlooked areas may prove most valuable to an evidence-based policy debate.
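To make the abstract's two key tools concrete, the sketch below shows the dynamic-document pattern in miniature, in Python: the inputs, the computation, and the rendered prose live in one script, so every number in the write-up is regenerated on each run rather than pasted in by hand. All figures and variable names here are invented for illustration; the dissertation's actual case study is a full open-source dynamic document whose implementation may differ.

```python
"""Minimal dynamic-document sketch (illustrative only).

The inputs, the analysis, and the rendered report live in a single
file, so re-running the script regenerates the report from scratch.
None of the numbers below come from the CBO report or the paper's
case study; they are placeholders.
"""

# Illustrative inputs (assumptions, not data from the paper).
WORKERS_AFFECTED = 16_500_000   # hypothetical count of workers receiving a raise
AVG_HOURLY_GAIN = 0.40          # hypothetical average hourly wage gain, in USD
JOBS_LOST = 500_000             # hypothetical employment effect


def render_report() -> str:
    """Render the write-up with every figure computed, never hard-coded."""
    total_hourly_gain = WORKERS_AFFECTED * AVG_HOURLY_GAIN
    return (
        "Policy estimate (illustrative)\n"
        f"- Workers with higher wages: {WORKERS_AFFECTED:,}\n"
        f"- Average hourly wage gain: ${AVG_HOURLY_GAIN:.2f}\n"
        f"- Implied total hourly earnings gain: ${total_hourly_gain:,.0f}\n"
        f"- Estimated jobs lost: {JOBS_LOST:,}\n"
    )


if __name__ == "__main__":
    # Committing this script and its output together under version
    # control (git) yields the transparent, re-runnable audit trail
    # that the adapted TOP guidelines call for.
    with open("report.md", "w", encoding="utf-8") as fh:
        fh.write(render_report())
    print(render_report())
```

Pairing such a script with a public git history lets a reader trace every published figure back to the code and assumptions that produced it, which is the kind of critical appraisal the abstract argues reputation-based credibility cannot substitute for.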

Suggested Citation

  • Hoces de la Guardia, Fernando, 2017. "How Transparency and Reproducibility Can Increase Credibility in Policy Analysis: A Case Study of the Minimum Wage Policy Estimate," MetaArXiv ba7tr, Center for Open Science.
  • Handle: RePEc:osf:metaar:ba7tr
    DOI: 10.31219/osf.io/ba7tr

    Download full text from publisher

    File URL: https://osf.io/download/59ee1f3b6c613b02622f983b/
    Download Restriction: no



    Citations

Citations are extracted by the CitEc Project.


    Cited by:

    1. Fernando Hoces de la Guardia & Sean Grant & Edward Miguel, 2021. "A framework for open policy analysis," Science and Public Policy, Oxford University Press, vol. 48(2), pages 154-163.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
1. Jyotirmoy Sarkar, 2018. "Will P-Value Triumph over Abuses and Attacks?," Biostatistics and Biometrics Open Access Journal, Juniper Publishers Inc., vol. 7(4), pages 66-71, July.
    2. Michael Bates & Michael Dinerstein & Andrew C. Johnston & Isaac Sorkin, 2022. "Teacher Labor Market Equilibrium and Student Achievement," CESifo Working Paper Series 9551, CESifo.
    3. Kevin J. Boyle & Mark Morrison & Darla Hatton MacDonald & Roderick Duncan & John Rose, 2016. "Investigating Internet and Mail Implementation of Stated-Preference Surveys While Controlling for Differences in Sample Frames," Environmental & Resource Economics, Springer;European Association of Environmental and Resource Economists, vol. 64(3), pages 401-419, July.
    4. Jelte M Wicherts & Marjan Bakker & Dylan Molenaar, 2011. "Willingness to Share Research Data Is Related to the Strength of the Evidence and the Quality of Reporting of Statistical Results," PLOS ONE, Public Library of Science, vol. 6(11), pages 1-7, November.
    5. Michela M. Tincani, 2021. "Teacher labor markets, school vouchers, and student cognitive achievement: Evidence from Chile," Quantitative Economics, Econometric Society, vol. 12(1), pages 173-216, January.
    6. Frederique Bordignon, 2020. "Self-correction of science: a comparative study of negative citations and post-publication peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 124(2), pages 1225-1239, August.
    7. Omar Al-Ubaydli & John A. List, 2015. "Do Natural Field Experiments Afford Researchers More or Less Control than Laboratory Experiments? A Simple Model," NBER Working Papers 20877, National Bureau of Economic Research, Inc.
    8. Aurelie Seguin & Wolfgang Forstmeier, 2012. "No Band Color Effects on Male Courtship Rate or Body Mass in the Zebra Finch: Four Experiments and a Meta-Analysis," PLOS ONE, Public Library of Science, vol. 7(6), pages 1-11, June.
9. Dragana Radicic & Geoffrey Pugh & Hugo Hollanders & René Wintjes & Jon Fairburn, 2016. "The impact of innovation support programs on small and medium enterprises innovation in traditional manufacturing industries: An evaluation for seven European Union regions," Environment and Planning C, vol. 34(8), pages 1425-1452, December.
10. Colin F. Camerer & Anna Dreber & Felix Holzmeister & Teck-Hua Ho & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Gideon Nave & Brian A. Nosek & Thomas Pfeiffer & Adam Altmejd & Nick Buttrick, 2018. "Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015," Nature Human Behaviour, Nature, vol. 2(9), pages 637-644, September.
    11. Li, Lunzheng & Maniadis, Zacharias & Sedikides, Constantine, 2021. "Anchoring in Economics: A Meta-Analysis of Studies on Willingness-To-Pay and Willingness-To-Accept," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 90(C).
    12. Diekmann Andreas, 2011. "Are Most Published Research Findings False?," Journal of Economics and Statistics (Jahrbuecher fuer Nationaloekonomie und Statistik), De Gruyter, vol. 231(5-6), pages 628-635, October.
    13. Daniele Fanelli, 2012. "Negative results are disappearing from most disciplines and countries," Scientometrics, Springer;Akadémiai Kiadó, vol. 90(3), pages 891-904, March.
    14. Kirthi Kalyanam & John McAteer & Jonathan Marek & James Hodges & Lifeng Lin, 2018. "Cross channel effects of search engine advertising on brick & mortar retail sales: Meta analysis of large scale field experiments on Google.com," Quantitative Marketing and Economics (QME), Springer, vol. 16(1), pages 1-42, March.
15. Nazila Alinaghi & W. Robert Reed, 2021. "Taxes and Economic Growth in OECD Countries: A Meta-analysis," Public Finance Review, vol. 49(1), pages 3-40, January.
    16. Isaiah Andrews & Maximilian Kasy, 2019. "Identification of and Correction for Publication Bias," American Economic Review, American Economic Association, vol. 109(8), pages 2766-2794, August.
    17. Michaelides, Michael, 2021. "Large sample size bias in empirical finance," Finance Research Letters, Elsevier, vol. 41(C).
    18. Michael L. Marlow, 2008. "Honestly, Who Else Would Fund Such Research? Reflections of a Non-Smoking Scholar," Econ Journal Watch, Econ Journal Watch, vol. 5(2), pages 240-268, May.
    19. Murphy, Richard & Weinhardt, Felix & Wyness, Gill, 2021. "Who teaches the teachers? A RCT of peer-to-peer observation and feedback in 181 schools," Economics of Education Review, Elsevier, vol. 82(C).
    20. Alexander Frankel & Maximilian Kasy, 2022. "Which Findings Should Be Published?," American Economic Journal: Microeconomics, American Economic Association, vol. 14(1), pages 1-38, February.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:osf:metaar:ba7tr. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: OSF (email available below). General contact details of provider: https://osf.io/preprints/metaarxiv.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.