Printed from https://ideas.repec.org/p/feb/artefa/00679.html

How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling

Author

Listed:
  • Omar Al-Ubaydli
  • John List
  • Claire Mackevicius
  • Min Sok Lee
  • Dana Suskind

Abstract

Policymakers are increasingly turning to insights gained from the experimental method as a means to inform large-scale public policies. Critics view this increased usage as premature, pointing to the fact that many experimentally tested programs fail to deliver on their promise at scale. Under this view, the experimental approach drives too much public policy. Yet if policymakers could be more confident that the original research findings would be delivered at scale, even the staunchest critics would carve out a larger role for experiments to inform policy. Leveraging the economic framework of Al-Ubaydli et al. (2019), we put forward 12 simple proposals, spanning researchers, policymakers, funders, and stakeholders, that together tackle the most vexing scalability threats. The framework highlights that only after we deepen our understanding of the scale-up problem will we be on solid ground to argue that scientific experiments should hold a more prominent place in the policymaker's quiver.

Suggested Citation

  • Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
  • Handle: RePEc:feb:artefa:00679

    Download full text from publisher

    File URL: http://s3.amazonaws.com/fieldexperiments-papers2/papers/00679.pdf
    Download Restriction: no

    References listed on IDEAS

1. Bruno Crépon & Esther Duflo & Marc Gurgand & Roland Rathelot & Philippe Zamora, 2013. "Do Labor Market Policies have Displacement Effects? Evidence from a Clustered Randomized Experiment," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 128(2), pages 531-580.
2. Abhijit Banerjee & Sharon Barnhardt & Esther Duflo, 2015. "Movies, Margins, and Marketing: Encouraging the Adoption of Iron-Fortified Salt," NBER Chapters, in: Insights in the Economics of Aging, pages 285-306, National Bureau of Economic Research, Inc.
3. John A. List & Azeem M. Shaikh & Yang Xu, 2019. "Multiple hypothesis testing in experimental economics," Experimental Economics, Springer;Economic Science Association, vol. 22(4), pages 773-793, December.
4. Abhijit Banerjee & Rukmini Banerji & James Berry & Esther Duflo & Harini Kannan & Shobhini Mukerji & Marc Shotland & Michael Walton, 2017. "From Proof of Concept to Scalable Policies: Challenges and Solutions, with an Application," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 73-102, Fall.
5. James J. Heckman, 2010. "Building Bridges between Structural and Program Evaluation Approaches to Evaluating Policy," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 356-398, June.
6. John A. List, 2007. "On the Interpretation of Giving in Dictator Games," Journal of Political Economy, University of Chicago Press, vol. 115(3), pages 482-493.
7. Jason T. Kerwin & Rebecca L. Thornton, 2021. "Making the Grade: The Sensitivity of Education Program Effectiveness to Input Choices and Outcome Measures," The Review of Economics and Statistics, MIT Press, vol. 103(2), pages 251-264, May.
8. John Horton & David Rand & Richard Zeckhauser, 2011. "The online laboratory: conducting experiments in a real labor market," Experimental Economics, Springer;Economic Science Association, vol. 14(3), pages 399-425, September.
9. Mike Gilraine & Hugh Macartney & Rob McMillan, 2018. "Education Reform in General Equilibrium: Evidence from California's Class Size Reduction," Working Papers tecipa-594, University of Toronto, Department of Economics.
10. John A. List, 2006. "The Behavioralist Meets the Market: Measuring Social Preferences and Reputation Effects in Actual Transactions," Journal of Political Economy, University of Chicago Press, vol. 114(1), pages 1-37, February.
11. Hunt Allcott, 2015. "Site Selection Bias in Program Evaluation," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 130(3), pages 1117-1165.
12. Patrick Kline & Christopher R. Walters, 2016. "Evaluating Public Programs with Close Substitutes: The Case of Head Start," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 131(4), pages 1795-1848.
13. Omar Al-Ubaydli & John A. List & Dana Suskind, 2019. "The Science of Using Science: Towards an Understanding of the Threats to Scaling Experiments," NBER Working Papers 25848, National Bureau of Economic Research, Inc.
14. Dean Karlan & John A. List, 2007. "Does Price Matter in Charitable Giving? Evidence from a Large-Scale Natural Field Experiment," American Economic Review, American Economic Association, vol. 97(5), pages 1774-1793, December.
15. Vernon L. Smith, 1962. "An Experimental Study of Competitive Market Behavior," Journal of Political Economy, University of Chicago Press, vol. 70(2), pages 111-137.
16. Anonymous, 2013. "Introduction to the Issue," Journal of Wine Economics, Cambridge University Press, vol. 8(2), pages 129-130, November.
17. John A. List, 2011. "The Market for Charitable Giving," Journal of Economic Perspectives, American Economic Association, vol. 25(2), pages 157-180, Spring.
18. Agha Ali Akram & Shyamal Chowdhury & Ahmed Mushfiq Mobarak, 2017. "Effects of Emigration on Rural Labor Markets," NBER Working Papers 23929, National Bureau of Economic Research, Inc.
19. Deaton, Angus & Cartwright, Nancy, 2018. "Understanding and misunderstanding randomized controlled trials," Social Science & Medicine, Elsevier, vol. 210(C), pages 2-21.
20. Joshua D. Angrist & Susan M. Dynarski & Thomas J. Kane & Parag A. Pathak & Christopher R. Walters, 2012. "Who Benefits from KIPP?," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 31(4), pages 837-860, September.
21. Luigi Butera & John List, 2017. "An Economic Approach to Alleviate the Crisis of Confidence in Science: With an Application to the Public Goods Game," Artefactual Field Experiments 00608, The Field Experiments Website.
22. Roland G. Fryer, Jr. & Steven D. Levitt & John A. List, 2015. "Parental Incentives and Early Childhood Achievement: A Field Experiment in Chicago Heights," NBER Working Papers 21477, National Bureau of Economic Research, Inc.
23. Karthik Muralidharan & Paul Niehaus, 2017. "Experimentation at Scale," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 103-124, Fall.
24. Abhijit Banerjee & Esther Duflo & Clément Imbert & Santhosh Mathew & Rohini Pande, 2020. "E-governance, Accountability, and Leakage in Public Programs: Experimental Evidence from a Financial Management Reform in India," American Economic Journal: Applied Economics, American Economic Association, vol. 12(4), pages 39-72, October.
25. Heckman, James J. & Lalonde, Robert J. & Smith, Jeffrey A., 1999. "The economics and econometrics of active labor market programs," Handbook of Labor Economics, in: O. Ashenfelter & D. Card (ed.), Handbook of Labor Economics, edition 1, volume 3, chapter 31, pages 1865-2097, Elsevier.
26. Bernhard Voelkl & Lucile Vogt & Emily S Sena & Hanno Würbel, 2018. "Reproducibility of preclinical animal research improves with heterogeneity of study samples," PLOS Biology, Public Library of Science, vol. 16(2), pages 1-13, February.
27. Gani Aldashev & Georg Kirchsteiger & Alexander Sebald, 2017. "Assignment Procedure Biases in Randomised Policy Experiments," Economic Journal, Royal Economic Society, vol. 127(602), pages 873-895, June.
28. Christopher Jepsen & Steven Rivkin, 2009. "Class Size Reduction and Student Achievement: The Potential Tradeoff between Teacher Quality and Class Size," Journal of Human Resources, University of Wisconsin Press, vol. 44(1).
29. Isaiah Andrews & Maximilian Kasy, 2019. "Identification of and Correction for Publication Bias," American Economic Review, American Economic Association, vol. 109(8), pages 2766-2794, August.
30. Michael Gilraine & Hugh Macartney & Robert McMillan, 2018. "Estimating the Direct and Indirect Effects of Major Education Reforms," NBER Working Papers 24191, National Bureau of Economic Research, Inc.
31. Nava Ashraf & Oriana Bandiera & Edward Davenport & Scott S. Lee, 2020. "Losing Prosociality in the Quest for Talent? Sorting, Selection, and Productivity in the Delivery of Public Services," American Economic Review, American Economic Association, vol. 110(5), pages 1355-1394, May.
32. John A. List, 2004. "Neoclassical Theory Versus Prospect Theory: Evidence from the Marketplace," Econometrica, Econometric Society, vol. 72(2), pages 615-625, March.
33. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jurgen & Johannesson, Magnus & Kirchler, Michael & et al., 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.
34. James Heckman & Lance Lochner & Christopher Taber, 1998. "Explaining Rising Wage Inequality: Explanations With A Dynamic General Equilibrium Model of Labor Earnings With Heterogeneous Agents," Review of Economic Dynamics, Elsevier for the Society for Economic Dynamics, vol. 1(1), pages 1-58, January.
35. Eva Vivalt, 2020. "How Much Can We Generalize From Impact Evaluations?," Journal of the European Economic Association, European Economic Association, vol. 18(6), pages 3045-3089.
36. Anonymous, 2013. "Introduction to the Issue," Journal of Wine Economics, Cambridge University Press, vol. 8(3), pages 243-243, December.
37. Yuyu Chen & David Y. Yang, 2019. "The Impact of Media Censorship: 1984 or Brave New World?," American Economic Review, American Economic Association, vol. 109(6), pages 2294-2332, June.
38. John A. List, 2011. "Why Economists Should Conduct Field Experiments and 14 Tips for Pulling One Off," Journal of Economic Perspectives, American Economic Association, vol. 25(3), pages 3-16, Summer.
39. Edward Miguel & Michael Kremer, 2004. "Worms: Identifying Impacts on Education and Health in the Presence of Treatment Externalities," Econometrica, Econometric Society, vol. 72(1), pages 159-217, January.
40. Omar Al-Ubaydli & John A. List & Danielle LoRe & Dana Suskind, 2017. "Scaling for Economists: Lessons from the Non-Adherence Problem in the Medical Literature," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 125-144, Fall.
41. Omar Al-Ubaydli & John A. List & Dana L. Suskind, 2017. "What Can We Learn from Experiments? Understanding the Threats to the Scalability of Experimental Results," American Economic Review, American Economic Association, vol. 107(5), pages 282-286, May.
42. Philip Davies, 2012. "The State of Evidence-Based Policy Evaluation and its Role in Policy Formation," National Institute Economic Review, National Institute of Economic and Social Research, vol. 219(1), pages 41-52, January.
43. James Heckman & Hidehiko Ichimura & Jeffrey Smith & Petra Todd, 1998. "Characterizing Selection Bias Using Experimental Data," Econometrica, Econometric Society, vol. 66(5), pages 1017-1098, September.
44. Karthik Muralidharan & Venkatesh Sundararaman, 2015. "The Aggregate Effect of School Choice: Evidence from a Two-Stage Experiment in India," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 130(3), pages 1011-1066.
45. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
46. Abhijit Banerjee & Dean Karlan & Jonathan Zinman, 2015. "Six Randomized Evaluations of Microcredit: Introduction and Further Steps," American Economic Journal: Applied Economics, American Economic Association, vol. 7(1), pages 1-21, January.
47. Fiorina, Morris P. & Plott, Charles R., 1978. "Committee Decisions under Majority Rule: An Experimental Study," American Political Science Review, Cambridge University Press, vol. 72(2), pages 575-598, June.
48. Neal S Young & John P A Ioannidis & Omar Al-Ubaydli, 2008. "Why Current Publication Practices May Distort Science," PLOS Medicine, Public Library of Science, vol. 5(10), pages 1-5, October.
49. Steven D. Levitt & John A. List, 2007. "What Do Laboratory Experiments Measuring Social Preferences Reveal About the Real World?," Journal of Economic Perspectives, American Economic Association, vol. 21(2), pages 153-174, Spring.
50. Zacharias Maniadis & Fabio Tufano & John A. List, 2014. "One Swallow Doesn't Make a Summer: New Evidence on Anchoring Effects," American Economic Review, American Economic Association, vol. 104(1), pages 277-290, January.
51. Christina Clark Tuttle & Philip Gleason & Virginia Knechtel & Ira Nichols-Barrer & Kevin Booker & Gregory Chojnacki & Thomas Coen & Lisbeth Goble, undated. "Understanding the Effect of KIPP as it Scales: Volume I, Impacts on Achievement and Other Outcomes (Executive Summary)," Mathematica Policy Research Reports 29bd80d4c7bf491f84e9e7d37, Mathematica Policy Research.
52. Jonathan M.V. Davis & Jonathan Guryan & Kelly Hallberg & Jens Ludwig, 2017. "The Economics of Scale-Up," NBER Working Papers 23925, National Bureau of Economic Research, Inc.
53. Richard A. Bettis, 2012. "The search for asterisks: Compromised statistical tests and flawed theories," Strategic Management Journal, Wiley Blackwell, vol. 33(1), pages 108-113, January.
54. Hunter, John E, 2001. "The Desperate Need for Replications," Journal of Consumer Research, Journal of Consumer Research Inc., vol. 28(1), pages 149-158, June.

    Citations

    Blog mentions

    As found by EconAcademics.org, the blog aggregator for Economics research:
    1. Policymaking Is Not a Science (Yet) (Ep. 405 Rebroadcast)
      by Stephen J. Dubner in Freakonomics on 2021-03-25 03:00:27
    2. Policymaking Is Not a Science (Yet) (Ep. 405)
      by Stephen J. Dubner in Freakonomics on 2020-02-13 04:00:34

Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Aars, Ole Kristian & Godager, Geir & Kaarboe, Oddvar & Moger, Tron Anders, 2022. "Sending emails to reduce medical costs? The effect of feedback on general practitioners’ claiming of fees," HERO Online Working Paper Series 2022:1, University of Oslo, Health Economics Research Programme.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    2. Omar Al-Ubaydli & John List & Dana Suskind, 2019. "The science of using science: Towards an understanding of the threats to scaling experiments," Artefactual Field Experiments 00670, The Field Experiments Website.
    3. John A. List, 2024. "Optimally generate policy-based evidence before scaling," Nature, Nature, vol. 626(7999), pages 491-499, February.
    4. Abhijit Banerjee & Rukmini Banerji & James Berry & Esther Duflo & Harini Kannan & Shobhini Mukerji & Marc Shotland & Michael Walton, 2017. "From Proof of Concept to Scalable Policies: Challenges and Solutions, with an Application," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 73-102, Fall.
    5. John List, 2021. "2021 Summary Data of Artefactual Field Experiments Published on Fieldexperiments.com," Artefactual Field Experiments 00749, The Field Experiments Website.
    6. John List, 2022. "2021 Summary Data of Natural Field Experiments Published on Fieldexperiments.com," Natural Field Experiments 00747, The Field Experiments Website.
    7. John List, 2022. "Framed Field Experiments: 2021 Summary on Fieldexperiments.com," Framed Field Experiments 00752, The Field Experiments Website.
    8. Omar Al-Ubaydli & John List, 2013. "On the Generalizability of Experimental Results in Economics: With A Response To Camerer," Artefactual Field Experiments j0001, The Field Experiments Website.
    9. Jörg Peters & Jörg Langbein & Gareth Roberts, 2018. "Generalization in the Tropics – Development Policy, Randomized Controlled Trials, and External Validity," The World Bank Research Observer, World Bank, vol. 33(1), pages 34-64.
    10. Omar Al-Ubaydli & John A. List, 2013. "On the Generalizability of Experimental Results in Economics: With a Response to Commentors," CESifo Working Paper Series 4543, CESifo.
    11. Cristina Corduneanu-Huci & Michael T. Dorsch & Paul Maarek, 2017. "Learning to constrain: Political competition and randomized controlled trials in development," THEMA Working Papers 2017-24, THEMA (THéorie Economique, Modélisation et Applications), Université de Cergy-Pontoise.
    12. John A. List & Fatemeh Momeni & Yves Zenou, 2020. "The Social Side of Early Human Capital Formation: Using a Field Experiment to Estimate the Causal Impact of Neighborhoods," Working Papers 2020-187, Becker Friedman Institute for Research In Economics.
    13. Hallsworth, Michael & List, John A. & Metcalfe, Robert D. & Vlaev, Ivo, 2017. "The behavioralist as tax collector: Using natural field experiments to enhance tax compliance," Journal of Public Economics, Elsevier, vol. 148(C), pages 14-31.
    14. Justman, Moshe, 2018. "Randomized controlled trials informing public policy: Lessons from project STAR and class size reduction," European Journal of Political Economy, Elsevier, vol. 54(C), pages 167-174.
    15. Omar Al-Ubaydli & John List, 2016. "Field Experiments in Markets," Artefactual Field Experiments j0002, The Field Experiments Website.
    16. Agostinelli, Francesco & Avitabile, Ciro & Bobba, Matteo, 2021. "Enhancing Human Capital in Children: A Case Study on Scaling," TSE Working Papers 21-1196, Toulouse School of Economics (TSE), revised Oct 2023.
    17. Matteo M. Galizzi & Daniel Navarro-Martinez, 2019. "On the External Validity of Social Preference Games: A Systematic Lab-Field Study," Management Science, INFORMS, vol. 65(3), pages 976-1002, March.
    18. Mariella Gonzales & Gianmarco León-Ciliotta & Luis R. Martínez, 2022. "How Effective Are Monetary Incentives to Vote? Evidence from a Nationwide Policy," American Economic Journal: Applied Economics, American Economic Association, vol. 14(1), pages 293-326, January.
    19. Jason T. Kerwin & Rebecca L. Thornton, 2021. "Making the Grade: The Sensitivity of Education Program Effectiveness to Input Choices and Outcome Measures," The Review of Economics and Statistics, MIT Press, vol. 103(2), pages 251-264, May.
    20. Omar Al-Ubaydli & John A. List & Dana L. Suskind, 2017. "What Can We Learn from Experiments? Understanding the Threats to the Scalability of Experimental Results," American Economic Review, American Economic Association, vol. 107(5), pages 282-286, May.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:feb:artefa:00679. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to register here. Registration allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link it to an item in RePEc, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: David Franks (email available below). General contact details of provider: http://www.fieldexperiments.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.