
How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling

Author

Listed:
  • Omar Al-Ubaydli
  • John List
  • Claire Mackevicius
  • Min Sok Lee
  • Dana Suskind

Abstract

Policymakers are increasingly turning to insights gained from the experimental method as a means to inform large-scale public policies. Critics view this increased usage as premature, pointing to the fact that many experimentally tested programs fail to deliver on their promise at scale. Under this view, the experimental approach drives too much public policy. Yet, if policymakers could be more confident that the original research findings would be delivered at scale, even the staunchest critics would carve out a larger role for experiments to inform policy. Leveraging the economic framework of Al-Ubaydli et al. (2019), we put forward 12 simple proposals, spanning researchers, policymakers, funders, and stakeholders, which together tackle the most vexing scalability threats. The framework highlights that only after we deepen our understanding of the scale-up problem will we be on solid ground to argue that scientific experiments should hold a more prominent place in the policymaker's quiver.

Suggested Citation

  • Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
  • Handle: RePEc:feb:artefa:00679

    Download full text from publisher

    File URL: http://s3.amazonaws.com/fieldexperiments-papers2/papers/00679.pdf
    Download Restriction: no

    References listed on IDEAS

    1. Bruno Crépon & Esther Duflo & Marc Gurgand & Roland Rathelot & Philippe Zamora, 2013. "Do Labor Market Policies have Displacement Effects? Evidence from a Clustered Randomized Experiment," The Quarterly Journal of Economics, Oxford University Press, vol. 128(2), pages 531-580.
    2. John Horton & David Rand & Richard Zeckhauser, 2011. "The online laboratory: conducting experiments in a real labor market," Experimental Economics, Springer;Economic Science Association, vol. 14(3), pages 399-425, September.
    3. Abhijit Banerjee & Sharon Barnhardt & Esther Duflo, 2015. "Movies, Margins, and Marketing: Encouraging the Adoption of Iron-Fortified Salt," NBER Chapters, in: Insights in the Economics of Aging, pages 285-306, National Bureau of Economic Research, Inc.
    4. John A. List & Azeem M. Shaikh & Yang Xu, 2019. "Multiple hypothesis testing in experimental economics," Experimental Economics, Springer;Economic Science Association, vol. 22(4), pages 773-793, December.
    5. Abhijit Banerjee & Rukmini Banerji & James Berry & Esther Duflo & Harini Kannan & Shobhini Mukerji & Marc Shotland & Michael Walton, 2017. "From Proof of Concept to Scalable Policies: Challenges and Solutions, with an Application," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 73-102, Fall.
    6. James J. Heckman, 2010. "Building Bridges between Structural and Program Evaluation Approaches to Evaluating Policy," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 356-398, June.
    7. Patrick Kline & Christopher R. Walters, 2016. "Evaluating Public Programs with Close Substitutes: The Case of Head Start," The Quarterly Journal of Economics, Oxford University Press, vol. 131(4), pages 1795-1848.
    8. James Heckman & Lance Lochner & Christopher Taber, 1998. "Explaining Rising Wage Inequality: Explanations With A Dynamic General Equilibrium Model of Labor Earnings With Heterogeneous Agents," Review of Economic Dynamics, Elsevier for the Society for Economic Dynamics, vol. 1(1), pages 1-58, January.
    9. Mike Gilraine & Hugh Macartney & Rob McMillan, 2018. "Education Reform in General Equilibrium: Evidence from California’s Class Size Reduction," Working Papers tecipa-594, University of Toronto, Department of Economics.
    10. John A. List, 2006. "The Behavioralist Meets the Market: Measuring Social Preferences and Reputation Effects in Actual Transactions," Journal of Political Economy, University of Chicago Press, vol. 114(1), pages 1-37, February.
    11. Karthik Muralidharan & Paul Niehaus, 2017. "Experimentation at Scale," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 103-124, Fall.
    12. Yuyu Chen & David Y. Yang, 2019. "The Impact of Media Censorship: 1984 or Brave New World?," American Economic Review, American Economic Association, vol. 109(6), pages 2294-2332, June.
    13. Dean Karlan & John A. List, 2007. "Does Price Matter in Charitable Giving? Evidence from a Large-Scale Natural Field Experiment," American Economic Review, American Economic Association, vol. 97(5), pages 1774-1793, December.
    14. John A. List, 2011. "Why Economists Should Conduct Field Experiments and 14 Tips for Pulling One Off," Journal of Economic Perspectives, American Economic Association, vol. 25(3), pages 3-16, Summer.
    15. Joshua D. Angrist & Susan M. Dynarski & Thomas J. Kane & Parag A. Pathak & Christopher R. Walters, 2012. "Who Benefits from KIPP?," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 31(4), pages 837-860, September.
    16. Edward Miguel & Michael Kremer, 2004. "Worms: Identifying Impacts on Education and Health in the Presence of Treatment Externalities," Econometrica, Econometric Society, vol. 72(1), pages 159-217, January.
    17. Kerwin, Jason Theodore & Thornton, Rebecca, 2020. "Making the Grade: The Sensitivity of Education Program Effectiveness to Input Choices and Outcome Measures," SocArXiv ct9sj, Center for Open Science.
    18. Omar Al-Ubaydli & John A. List & Danielle LoRe & Dana Suskind, 2017. "Scaling for Economists: Lessons from the Non-Adherence Problem in the Medical Literature," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 125-144, Fall.
    19. Omar Al-Ubaydli & John A. List & Dana L. Suskind, 2017. "What Can We Learn from Experiments? Understanding the Threats to the Scalability of Experimental Results," American Economic Review, American Economic Association, vol. 107(5), pages 282-286, May.
    20. Isaiah Andrews & Maximilian Kasy, 2019. "Identification of and Correction for Publication Bias," American Economic Review, American Economic Association, vol. 109(8), pages 2766-2794, August.
    21. Philip Davies, 2012. "The State of Evidence-Based Policy Evaluation and its Role in Policy Formation," National Institute Economic Review, National Institute of Economic and Social Research, vol. 219(1), pages 41-52, January.
    22. James Heckman & Hidehiko Ichimura & Jeffrey Smith & Petra Todd, 1998. "Characterizing Selection Bias Using Experimental Data," Econometrica, Econometric Society, vol. 66(5), pages 1017-1098, September.
    23. Agha Ali Akram & Shyamal Chowdhury & Ahmed Mushfiq Mobarak, 2017. "Effects of Emigration on Rural Labor Markets," NBER Working Papers 23929, National Bureau of Economic Research, Inc.
    24. Luigi Butera & John A. List, 2017. "An Economic Approach to Alleviate the Crises of Confidence in Science: With an Application to the Public Goods Game," NBER Working Papers 23335, National Bureau of Economic Research, Inc.
    25. Deaton, Angus & Cartwright, Nancy, 2018. "Understanding and misunderstanding randomized controlled trials," Social Science & Medicine, Elsevier, vol. 210(C), pages 2-21.
    26. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    27. Karthik Muralidharan & Venkatesh Sundararaman, 2015. "The Aggregate Effect of School Choice: Evidence from a Two-Stage Experiment in India," The Quarterly Journal of Economics, Oxford University Press, vol. 130(3), pages 1011-1066.
    28. Abhijit Banerjee & Dean Karlan & Jonathan Zinman, 2015. "Six Randomized Evaluations of Microcredit: Introduction and Further Steps," American Economic Journal: Applied Economics, American Economic Association, vol. 7(1), pages 1-21, January.
    29. repec:feb:artefa:0110 is not listed on IDEAS
    30. Fiorina, Morris P. & Plott, Charles R., 1978. "Committee Decisions under Majority Rule: An Experimental Study," American Political Science Review, Cambridge University Press, vol. 72(2), pages 575-598, June.
    31. Gani Aldashev & Georg Kirchsteiger & Alexander Sebald, 2017. "Assignment Procedure Biases in Randomised Policy Experiments," Economic Journal, Royal Economic Society, vol. 127(602), pages 873-895, June.
    32. Christopher Jepsen & Steven Rivkin, 2009. "Class Size Reduction and Student Achievement: The Potential Tradeoff between Teacher Quality and Class Size," Journal of Human Resources, University of Wisconsin Press, vol. 44(1).
    33. Nava Ashraf & Oriana Bandiera & Scott Lee, 2014. "Losing Prosociality in the Quest for Talent? Sorting, Selection, and Productivity in the Delivery of Public Services," STICERD - Economic Organisation and Public Policy Discussion Papers Series 065, Suntory and Toyota International Centres for Economics and Related Disciplines, LSE.
    34. Michael Gilraine & Hugh Macartney & Robert McMillan, 2018. "Estimating the Direct and Indirect Effects of Major Education Reforms," NBER Working Papers 24191, National Bureau of Economic Research, Inc.
    35. Steven D. Levitt & John A. List, 2007. "What Do Laboratory Experiments Measuring Social Preferences Reveal About the Real World?," Journal of Economic Perspectives, American Economic Association, vol. 21(2), pages 153-174, Spring.
    36. Zacharias Maniadis & Fabio Tufano & John A. List, 2014. "One Swallow Doesn't Make a Summer: New Evidence on Anchoring Effects," American Economic Review, American Economic Association, vol. 104(1), pages 277-290, January.
    37. John A. List, 2004. "Neoclassical Theory Versus Prospect Theory: Evidence from the Marketplace," Econometrica, Econometric Society, vol. 72(2), pages 615-625, March.
    38. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jurgen & Johannesson, Magnus & Kirchler, Michael & Almenberg, Johan & Altmejd, Adam & Chan, Taizan & Heikensten, Emma & Holzmeister, Felix & others, 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.
    39. Richard A. Bettis, 2012. "The search for asterisks: Compromised statistical tests and flawed theories," Strategic Management Journal, Wiley Blackwell, vol. 33(1), pages 108-113, January.
    40. Hunter, John E, 2001. "The Desperate Need for Replications," Journal of Consumer Research, Oxford University Press, vol. 28(1), pages 149-158, June.

    Blog mentions

    As found by EconAcademics.org, the blog aggregator for Economics research:
    1. Policymaking Is Not a Science (Yet) (Ep. 405)
      by Stephen J. Dubner in Freakonomics on 2020-02-13 04:00:34

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    2. Omar Al-Ubaydli & John List & Dana Suskind, 2019. "The science of using science: Towards an understanding of the threats to scaling experiments," Artefactual Field Experiments 00670, The Field Experiments Website.
    3. Justman, Moshe, 2018. "Randomized controlled trials informing public policy: Lessons from project STAR and class size reduction," European Journal of Political Economy, Elsevier, vol. 54(C), pages 167-174.
    4. Abhijit Banerjee & Rukmini Banerji & James Berry & Esther Duflo & Harini Kannan & Shobhini Mukerji & Marc Shotland & Michael Walton, 2017. "From Proof of Concept to Scalable Policies: Challenges and Solutions, with an Application," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 73-102, Fall.
    5. Omar Al-Ubaydli & John List, 2013. "On the Generalizability of Experimental Results in Economics: With A Response To Camerer," Artefactual Field Experiments j0001, The Field Experiments Website.
    6. Omar Al-Ubaydli & John A. List, 2013. "On the Generalizability of Experimental Results in Economics: With a Response to Commentors," CESifo Working Paper Series 4543, CESifo.
    7. Omar Al-Ubaydli & John List, 2016. "Field Experiments in Markets," Artefactual Field Experiments j0002, The Field Experiments Website.
    8. Jörg Peters & Jörg Langbein & Gareth Roberts, 2018. "Generalization in the Tropics – Development Policy, Randomized Controlled Trials, and External Validity," World Bank Research Observer, World Bank Group, vol. 33(1), pages 34-64.
    9. Michel André Maréchal & Christian Thöni, 2016. "Hidden persuaders: do small gifts lubricate business negotiations?," ECON - Working Papers 227, Department of Economics - University of Zurich, revised May 2018.
    10. James J. Heckman, 1991. "Randomization and Social Policy Evaluation Revisited," NBER Technical Working Papers 0107, National Bureau of Economic Research, Inc.
    11. Matteo M. Galizzi & Daniel Navarro Martinez, 2015. "On the external validity of social-preference games: A systematic lab-field study," Economics Working Papers 1462, Department of Economics and Business, Universitat Pompeu Fabra.
    12. Maurizio Canavari & Andreas C. Drichoutis & Jayson L. Lusk & Rodolfo M. Nayga, Jr., 2018. "How to run an experimental auction: A review of recent advances," Working Papers 2018-5, Agricultural University of Athens, Department Of Agricultural Economics.
    13. Lukas Meub & Till Proeger, 2018. "Are groups ‘less behavioral’? The case of anchoring," Theory and Decision, Springer, vol. 85(2), pages 117-150, August.
    14. Jeffrey Smith & Arthur Sweetman, 2016. "Viewpoint: Estimating the causal effects of policies and programs," Canadian Journal of Economics, Canadian Economics Association, vol. 49(3), pages 871-905, August.
    15. John A. List, 2007. "Field Experiments: A Bridge between Lab and Naturally Occurring Data," The B.E. Journal of Economic Analysis & Policy, De Gruyter, vol. 5(2), pages 1-47, April.
    16. John A. List & Azeem M. Shaikh & Yang Xu, 2019. "Multiple hypothesis testing in experimental economics," Experimental Economics, Springer;Economic Science Association, vol. 22(4), pages 773-793, December.
    17. Eduard Marinov, 2019. "The 2019 Nobel Prize in Economics," Economic Thought journal, Bulgarian Academy of Sciences - Economic Research Institute, issue 6, pages 78-116.
    18. Ding, Shuze & Lugovskyy, Volodymyr & Puzzello, Daniela & Tucker, Steven & Williams, Arlington, 2018. "Cash versus extra-credit incentives in experimental asset markets," Journal of Economic Behavior & Organization, Elsevier, vol. 150(C), pages 19-27.
    19. Omar Al-Ubaydli & John A. List & Danielle LoRe & Dana Suskind, 2017. "Scaling for Economists: Lessons from the Non-Adherence Problem in the Medical Literature," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 125-144, Fall.
