
How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling

Authors

  • Omar Al-Ubaydli
  • John List
  • Claire Mackevicius
  • Min Sok Lee
  • Dana Suskind

Abstract

Policymakers increasingly turn to insights gained from the experimental method to inform large-scale public policies. Critics view this increased usage as premature, pointing out that many experimentally tested programs fail to deliver on their promise at scale. Under this view, the experimental approach already drives too much public policy. Yet if policymakers could be more confident that the original research findings would be delivered at scale, even the staunchest critics would carve out a larger role for experiments to inform policy. Leveraging the economic framework of Al-Ubaydli et al. (2019), we put forward 12 simple proposals, spanning researchers, policymakers, funders, and stakeholders, that together tackle the most vexing scalability threats. The framework highlights that only after we deepen our understanding of the scale-up problem will we be on solid ground to argue that scientific experiments should hold a more prominent place in the policymaker's quiver.
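
To see why critics worry, it helps to quantify how much confidence a single significant finding warrants before scaling. The sketch below is a minimal Python illustration of the post-study probability formula discussed by Maniadis, Tufano & List (2014), cited in the references below; the prior, power, and significance level are assumed values chosen for exposition, not numbers from this paper.

    # Post-study probability (PSP): the chance that a statistically
    # significant result reflects a true effect rather than a false
    # positive. The formula is the standard one discussed in Maniadis,
    # Tufano & List (2014); the inputs below are illustrative assumptions.

    def post_study_probability(prior: float, power: float, alpha: float) -> float:
        # PSP = power * prior / (power * prior + alpha * (1 - prior))
        true_positives = power * prior
        false_positives = alpha * (1 - prior)
        return true_positives / (true_positives + false_positives)

    # A long-shot intervention (10% prior of a real effect) tested once
    # at conventional power (0.80) and significance (0.05):
    print(round(post_study_probability(prior=0.10, power=0.80, alpha=0.05), 2))  # 0.64

On these assumed numbers, a single significant study leaves roughly a one-in-three chance that the effect is not real, which is one concrete sense in which experimentally tested programs can fail to deliver their promise at scale absent replication.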

Suggested Citation

  • Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
  • Handle: RePEc:feb:artefa:00679

    Download full text from publisher

    File URL: http://s3.amazonaws.com/fieldexperiments-papers2/papers/00679.pdf
    Download Restriction: no

    References listed on IDEAS

    1. Abhijit Banerjee & Sharon Barnhardt & Esther Duflo, 2015. "Movies, Margins, and Marketing: Encouraging the Adoption of Iron-Fortified Salt," NBER Chapters, in: Insights in the Economics of Aging, pages 285-306, National Bureau of Economic Research, Inc.
    2. Abhijit Banerjee & Rukmini Banerji & James Berry & Esther Duflo & Harini Kannan & Shobhini Mukerji & Marc Shotland & Michael Walton, 2017. "From Proof of Concept to Scalable Policies: Challenges and Solutions, with an Application," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 73-102, Fall.
3. John Horton & David Rand & Richard Zeckhauser, 2011. "The online laboratory: conducting experiments in a real labor market," Experimental Economics, Springer; Economic Science Association, vol. 14(3), pages 399-425, September.
    4. Bruno Crépon & Esther Duflo & Marc Gurgand & Roland Rathelot & Philippe Zamora, 2013. "Do Labor Market Policies have Displacement Effects? Evidence from a Clustered Randomized Experiment," The Quarterly Journal of Economics, Oxford University Press, vol. 128(2), pages 531-580.
5. Patrick Kline & Christopher R. Walters, 2016. "Evaluating Public Programs with Close Substitutes: The Case of Head Start," The Quarterly Journal of Economics, Oxford University Press, vol. 131(4), pages 1795-1848.
    6. Dean Karlan & John A. List, 2007. "Does Price Matter in Charitable Giving? Evidence from a Large-Scale Natural Field Experiment," American Economic Review, American Economic Association, vol. 97(5), pages 1774-1793, December.
    7. Joshua D. Angrist & Susan M. Dynarski & Thomas J. Kane & Parag A. Pathak & Christopher R. Walters, 2012. "Who Benefits from KIPP?," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 31(4), pages 837-860, September.
    8. Michael Gilraine & Hugh Macartney & Robert McMillan, 2018. "Education Reform in General Equilibrium: Evidence from California's Class Size Reduction," NBER Working Papers 24191, National Bureau of Economic Research, Inc.
    9. Isaiah Andrews & Maximilian Kasy, 2019. "Identification of and Correction for Publication Bias," American Economic Review, American Economic Association, vol. 109(8), pages 2766-2794, August.
    10. James Heckman & Lance Lochner & Christopher Taber, 1998. "Explaining Rising Wage Inequality: Explanations With A Dynamic General Equilibrium Model of Labor Earnings With Heterogeneous Agents," Review of Economic Dynamics, Elsevier for the Society for Economic Dynamics, vol. 1(1), pages 1-58, January.
    11. Edward Miguel & Michael Kremer, 2004. "Worms: Identifying Impacts on Education and Health in the Presence of Treatment Externalities," Econometrica, Econometric Society, vol. 72(1), pages 159-217, January.
    12. Omar Al-Ubaydli & John A. List & Dana L. Suskind, 2017. "What Can We Learn from Experiments? Understanding the Threats to the Scalability of Experimental Results," American Economic Review, American Economic Association, vol. 107(5), pages 282-286, May.
    13. Philip Davies, 2012. "The State of Evidence-Based Policy Evaluation and its Role in Policy Formation," National Institute Economic Review, National Institute of Economic and Social Research, vol. 219(1), pages 41-52, January.
    14. James Heckman & Hidehiko Ichimura & Jeffrey Smith & Petra Todd, 1998. "Characterizing Selection Bias Using Experimental Data," Econometrica, Econometric Society, vol. 66(5), pages 1017-1098, September.
    15. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
16. Angus Deaton & Nancy Cartwright, 2018. "Understanding and misunderstanding randomized controlled trials," Social Science & Medicine, Elsevier, vol. 210(C), pages 2-21.
    17. James J. Heckman, 2010. "Building Bridges between Structural and Program Evaluation Approaches to Evaluating Policy," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 356-398, June.
    18. Abhijit Banerjee & Dean Karlan & Jonathan Zinman, 2015. "Six Randomized Evaluations of Microcredit: Introduction and Further Steps," American Economic Journal: Applied Economics, American Economic Association, vol. 7(1), pages 1-21, January.
19. John A. List & Azeem M. Shaikh & Yang Xu, 2019. "Multiple hypothesis testing in experimental economics," Experimental Economics, Springer; Economic Science Association, vol. 22(4), pages 773-793, December.
    20. repec:feb:artefa:0110 is not listed on IDEAS
    21. John A. List, 2011. "Why Economists Should Conduct Field Experiments and 14 Tips for Pulling One Off," Journal of Economic Perspectives, American Economic Association, vol. 25(3), pages 3-16, Summer.
    22. repec:cup:apsrev:v:72:y:1978:i:02:p:575-598_15 is not listed on IDEAS
    23. Steven D. Levitt & John A. List, 2007. "What Do Laboratory Experiments Measuring Social Preferences Reveal About the Real World?," Journal of Economic Perspectives, American Economic Association, vol. 21(2), pages 153-174, Spring.
    24. Zacharias Maniadis & Fabio Tufano & John A. List, 2014. "One Swallow Doesn't Make a Summer: New Evidence on Anchoring Effects," American Economic Review, American Economic Association, vol. 104(1), pages 277-290, January.
    25. John A. List, 2004. "Neoclassical Theory Versus Prospect Theory: Evidence from the Marketplace," Econometrica, Econometric Society, vol. 72(2), pages 615-625, March.
    26. repec:bla:stratm:v:33:y:2012:i:1:p:108-113 is not listed on IDEAS
27. John A. List, 2006. "The Behavioralist Meets the Market: Measuring Social Preferences and Reputation Effects in Actual Transactions," Journal of Political Economy, University of Chicago Press, vol. 114(1), pages 1-37, February.
28. Christopher Jepsen & Steven Rivkin, 2009. "Class Size Reduction and Student Achievement: The Potential Tradeoff between Teacher Quality and Class Size," Journal of Human Resources, University of Wisconsin Press, vol. 44(1).
29. Gani Aldashev & Georg Kirchsteiger & Alexander Sebald, 2017. "Assignment Procedure Biases in Randomised Policy Experiments," Economic Journal, Royal Economic Society, vol. 127(602), pages 873-895, June.
30. Colin Camerer & Anna Dreber & Eskil Forsell & Teck-Hua Ho & Jurgen Huber & Magnus Johannesson & Michael Kirchler & Johan Almenberg & Adam Altmejd & Taizan Chan & Emma Heikensten & Felix Holzmeister & others, 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.
31. Karthik Muralidharan & Paul Niehaus, 2017. "Experimentation at Scale," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 103-124, Fall.
32. repec:aea:aecrev:v:109:y:2019:i:6:p:2294-2332 is not listed on IDEAS
33. Omar Al-Ubaydli & John A. List & Danielle LoRe & Dana Suskind, 2017. "Scaling for Economists: Lessons from the Non-Adherence Problem in the Medical Literature," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 125-144, Fall.
34. Karthik Muralidharan & Venkatesh Sundararaman, 2015. "The Aggregate Effect of School Choice: Evidence from a Two-Stage Experiment in India," The Quarterly Journal of Economics, Oxford University Press, vol. 130(3), pages 1011-1066.
35. Nava Ashraf & Oriana Bandiera & Scott Lee, 2014. "Losing Prosociality in the Quest for Talent? Sorting, Selection, and Productivity in the Delivery of Public Services," STICERD - Economic Organisation and Public Policy Discussion Papers Series 065, Suntory and Toyota International Centres for Economics and Related Disciplines, LSE.
36. John E. Hunter, 2001. "The Desperate Need for Replications," Journal of Consumer Research, Oxford University Press, vol. 28(1), pages 149-158, June.

