Printed from https://ideas.repec.org/p/osf/metaar/jnyqh.html

A Framework for Open Policy Analysis

Authors

  • Hoces de la Guardia, Fernando
  • Grant, Sean (Indiana University)
  • Miguel, Edward

Abstract

The evidence-based policy movement promotes the use of empirical evidence to inform policy decision-making. While this movement has gained traction over the last two decades, concerns about the credibility of empirical research have been identified in scientific disciplines that use research methods and practices that are commonplace in policy analysis. As a solution, we argue that policy analysis should adopt the transparent, open, and reproducible research practices increasingly espoused in related disciplines. We first discuss the importance of evidence-based policy in an era of increasing disagreement about facts, analysis, and expertise. We review recent credibility crises of empirical research, and their relevance to the credibility of evidence-based policy. We then make the case for “open” policy analysis (OPA) and how to achieve it, focusing on examples of recent policy analyses that have incorporated open research practices such as transparent reporting, open data, and code sharing. We conclude with recommendations on how key stakeholders in evidence-based policy can make OPA the norm and thus safeguard trust in using empirical evidence to inform important policy decisions.

Suggested Citation

  • Hoces de la Guardia, Fernando & Grant, Sean & Miguel, Edward, 2018. "A Framework for Open Policy Analysis," MetaArXiv jnyqh, Center for Open Science.
  • Handle: RePEc:osf:metaar:jnyqh
    DOI: 10.31219/osf.io/jnyqh

    Download full text from publisher

    File URL: https://osf.io/download/5ab8ef7e76e58c000dfab8b2/
    Download Restriction: no

    File URL: https://libkey.io/10.31219/osf.io/jnyqh?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item


    References listed on IDEAS

    1. Charles F. Manski, 2013. "Response to the Review of ‘Public Policy in an Uncertain World’," Economic Journal, Royal Economic Society, vol. 0, pages 412-415, August.
    2. Joshua D. Angrist & Jörn-Steffen Pischke, 2010. "The Credibility Revolution in Empirical Economics: How Better Research Design Is Taking the Con out of Econometrics," Journal of Economic Perspectives, American Economic Association, vol. 24(2), pages 3-30, Spring.
    3. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    4. Gerber, Alan & Malhotra, Neil, 2008. "Do Statistical Reporting Standards Affect What Is Published? Publication Bias in Two Leading Political Science Journals," Quarterly Journal of Political Science, now publishers, vol. 3(3), pages 313-326, October.
    5. Don Husereau & Michael Drummond & Stavros Petrou & Chris Carswell & David Moher & Dan Greenberg & Federico Augustovski & Andrew Briggs & Josephine Mauskopf & Elizabeth Loder, 2013. "Consolidated Health Economic Evaluation Reporting Standards (CHEERS) Statement," PharmacoEconomics, Springer, vol. 31(5), pages 361-367, May.
    6. Hoces de la Guardia, Fernando, 2017. "How Transparency and Reproducibility Can Increase Credibility in Policy Analysis: A Case Study of the Minimum Wage Policy Estimate," MetaArXiv ba7tr, Center for Open Science.
    7. Eva Vivalt, 2020. "How Much Can We Generalize From Impact Evaluations?," Journal of the European Economic Association, European Economic Association, vol. 18(6), pages 3045-3089.
    8. Manski, Charles F., 2013. "Public Policy in an Uncertain World: Analysis and Decisions," Economics Books, Harvard University Press, number 9780674066892, March.
    9. Pfenninger, Stefan & DeCarolis, Joseph & Hirth, Lion & Quoilin, Sylvain & Staffell, Iain, 2017. "The importance of open data and software: Is energy research lagging behind?," Energy Policy, Elsevier, vol. 101(C), pages 211-215.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. repec:cdl:econwp:qt7fc7s8cd is not listed on IDEAS
    2. Khaja Kamaluddin, 2022. "Security Policy Enforcement and Behavioral Threat Detection in DevSecOps Pipelines," European Journal of Technology, AJPO Journals Limited, vol. 6(4), pages 10-30.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Alex Eble & Peter Boone & Diana Elbourne, 2017. "On Minimizing the Risk of Bias in Randomized Controlled Trials in Economics," The World Bank Economic Review, World Bank, vol. 31(3), pages 687-707.
    2. repec:cdl:econwp:qt05r470xk is not listed on IDEAS
    3. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    4. Stefano DellaVigna & Elizabeth Linos, 2022. "RCTs to Scale: Comprehensive Evidence From Two Nudge Units," Econometrica, Econometric Society, vol. 90(1), pages 81-116, January.
    5. Susan Athey & Raj Chetty & Guido Imbens, 2020. "Using Experiments to Correct for Selection in Observational Studies," Papers 2006.09676, arXiv.org, revised May 2025.
    6. Marcel Fafchamps & Julien Labonne, 2016. "Using Split Samples to Improve Inference about Causal Effects," NBER Working Papers 21842, National Bureau of Economic Research, Inc.
    7. Guido W. Imbens, 2020. "Potential Outcome and Directed Acyclic Graph Approaches to Causality: Relevance for Empirical Practice in Economics," Journal of Economic Literature, American Economic Association, vol. 58(4), pages 1129-1179, December.
    8. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    9. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2018. "Methods Matter: P-Hacking and Causal Inference in Economics," IZA Discussion Papers 11796, Institute of Labor Economics (IZA).
    10. Stephan B. Bruns, 2016. "The Fragility of Meta-Regression Models in Observational Research," MAGKS Papers on Economics 201603, Philipps-Universität Marburg, Faculty of Business Administration and Economics, Department of Economics (Volkswirtschaftliche Abteilung).
    11. Esterling, Kevin & Brady, David & Schwitzgebel, Eric, 2021. "The Necessity of Construct and External Validity for Generalized Causal Claims," OSF Preprints 2s8w5, Center for Open Science.
    12. Ankel-Peters, Jörg & Fiala, Nathan & Neubauer, Florian, 2023. "Do economists replicate?," Journal of Economic Behavior & Organization, Elsevier, vol. 212(C), pages 219-232.
    13. Garret Christensen & Edward Miguel, 2018. "Transparency, Reproducibility, and the Credibility of Economics Research," Journal of Economic Literature, American Economic Association, vol. 56(3), pages 920-980, September.
    14. Dionissi Aliprantis, 2017. "Assessing the evidence on neighborhood effects from Moving to Opportunity," Empirical Economics, Springer, vol. 52(3), pages 925-954, May.
    15. Benno Torgler, 2022. "The power of public choice in law and economics," Journal of Economic Surveys, Wiley Blackwell, vol. 36(5), pages 1410-1453, December.
    16. Deaton, Angus & Cartwright, Nancy, 2018. "Understanding and misunderstanding randomized controlled trials," Social Science & Medicine, Elsevier, vol. 210(C), pages 2-21.
    17. Abel Brodeur & Nikolai Cook & Anthony Heyes, 2020. "Methods Matter: p-Hacking and Publication Bias in Causal Analysis in Economics," American Economic Review, American Economic Association, vol. 110(11), pages 3634-3660, November.
    18. Schmidt Christoph M., 2014. "Wirkungstreffer erzielen – Die Rolle der evidenzbasierten Politikberatung in einer aufgeklärten Gesellschaft," Perspektiven der Wirtschaftspolitik, De Gruyter, vol. 15(3), pages 219-233, October.
    19. Burt S. Barnow & Jeffrey Smith, 2015. "Employment and Training Programs," NBER Chapters, in: Economics of Means-Tested Transfer Programs in the United States, Volume 2, pages 127-234, National Bureau of Economic Research, Inc.
    20. Alberto Abadie & Susan Athey & Guido W. Imbens & Jeffrey M. Wooldridge, 2020. "Sampling‐Based versus Design‐Based Uncertainty in Regression Analysis," Econometrica, Econometric Society, vol. 88(1), pages 265-296, January.
    21. Charles F. Manski & John V. Pepper, 2018. "How Do Right-to-Carry Laws Affect Crime Rates? Coping with Ambiguity Using Bounded-Variation Assumptions," The Review of Economics and Statistics, MIT Press, vol. 100(2), pages 232-244, May.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:osf:metaar:jnyqh. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. Registration allows you to link your profile to this item, and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: OSF (email available below). General contact details of provider: https://osf.io/preprints/metaarxiv .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.