
The implications of a fundamental contradiction in advocating randomized trials for policy

Author

Listed:
  • Muller, Seán M.

Abstract

Ethical concerns aside, there is nothing inherently wrong with using randomized controlled trials for intellectual inquiry in development economics. A fundamental problem arises, however, in claiming that results from experimental and quasi-experimental methods are more credible than other sources of evidence for policy. Specifically, there is a contradiction between rejecting econometric assumptions required for identifying causal relationships using non-experimental data, and accepting assumptions required for extrapolating experimental results for policy. I explain this tension and its implications, then discuss recent efforts, including the use of replication and machine learning methods, to circumvent it. Such attempts remain inadequate, and assertions in the 2019 Nobel Award are therefore either premature or misplaced. Use of pluralistic approaches negates these sharp contradictions, but requires abandoning any special status for experimental methods.
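To make the tension described in the abstract concrete, here is one common potential-outcomes formalization of the two kinds of assumption. This is an illustrative sketch, not notation taken from the article itself: the treatment indicator D, covariates X, potential outcomes Y(1) and Y(0), and the site indicator S (trial site A, policy site B) are assumed for exposition.

\[
  \text{Observational identification (selection on observables):}\quad
  \bigl(Y_i(1),\,Y_i(0)\bigr) \;\perp\; D_i \,\mid\, X_i .
\]
\[
  \text{Extrapolation from trial site } A \text{ to policy site } B:\quad
  \mathbb{E}\!\left[Y(1)-Y(0)\mid X,\,S=A\right] \;=\; \mathbb{E}\!\left[Y(1)-Y(0)\mid X,\,S=B\right].
\]

Randomization lets a trial dispense with the first assumption within site A, but carrying the estimate to site B still requires the second (together with covariate overlap across sites), which is no more testable than the first. This is consistent with the abstract's point that replication and machine-learning estimates of effect heterogeneity, which condition only on observed X, do not by themselves remove the need for such an assumption.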

Suggested Citation

  • Muller, Seán M., 2020. "The implications of a fundamental contradiction in advocating randomized trials for policy," World Development, Elsevier, vol. 127(C).
  • Handle: RePEc:eee:wdevel:v:127:y:2020:i:c:s0305750x19304802
    DOI: 10.1016/j.worlddev.2019.104831

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0305750X19304802
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.worlddev.2019.104831?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. James J. Heckman & Jeffrey A. Smith, 1995. "Assessing the Case for Social Experiments," Journal of Economic Perspectives, American Economic Association, vol. 9(2), pages 85-110, Spring.
    2. Michael A. Clemens, 2017. "The Meaning Of Failed Replications: A Review And Proposal," Journal of Economic Surveys, Wiley Blackwell, vol. 31(1), pages 326-342, February.
    3. Joshua D. Angrist & Jörn-Steffen Pischke, 2010. "The Credibility Revolution in Empirical Economics: How Better Research Design Is Taking the Con out of Econometrics," Journal of Economic Perspectives, American Economic Association, vol. 24(2), pages 3-30, Spring.
    4. Gary Solon & Steven J. Haider & Jeffrey M. Wooldridge, 2015. "What Are We Weighting For?," Journal of Human Resources, University of Wisconsin Press, vol. 50(2), pages 301-316.
    5. Seán M. Muller, 2015. "Causal Interaction and External Validity: Obstacles to the Policy Relevance of Randomized Evaluations," The World Bank Economic Review, World Bank, vol. 29(suppl_1), pages 217-225.
    6. Athey, Susan & Imbens, Guido W., 2015. "Machine Learning for Estimating Heterogeneous Causal Effects," Research Papers 3350, Stanford University, Graduate School of Business.
    7. Deaton, Angus & Cartwright, Nancy, 2018. "Understanding and misunderstanding randomized controlled trials," Social Science & Medicine, Elsevier, vol. 210(C), pages 2-21.
    8. Charles F. Manski, 2011. "Policy Analysis with Incredible Certitude," Economic Journal, Royal Economic Society, vol. 121(554), pages 261-289, August.
    9. Muller, Sean, 2014. "Randomised trials for policy: a review of the external validity of treatment effects," SALDRU Working Papers 127, Southern Africa Labour and Development Research Unit, University of Cape Town.
    10. Victor Chernozhukov & Mert Demirer & Esther Duflo & Iván Fernández-Val, 2018. "Generic Machine Learning Inference on Heterogeneous Treatment Effects in Randomized Experiments, with an Application to Immunization in India," NBER Working Papers 24678, National Bureau of Economic Research, Inc.
    11. Elizabeth A. Stuart & Stephen R. Cole & Catherine P. Bradshaw & Philip J. Leaf, 2011. "The use of propensity scores to assess the generalizability of results from randomized trials," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 174(2), pages 369-386, April.
    12. Joseph Hotz, V. & Imbens, Guido W. & Mortimer, Julie H., 2005. "Predicting the efficacy of future training programs using past experiences at other locations," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 241-270.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Denis Fougère & Nicolas Jacquemet, 2020. "Policy Evaluation Using Causal Inference Methods," SciencePo Working papers Main hal-03455978, HAL.
    2. Hunt Allcott, 2012. "Site Selection Bias in Program Evaluation," NBER Working Papers 18373, National Bureau of Economic Research, Inc.
    3. Burt S. Barnow & Jeffrey Smith, 2015. "Employment and Training Programs," NBER Chapters, in: Economics of Means-Tested Transfer Programs in the United States, Volume 2, pages 127-234, National Bureau of Economic Research, Inc.
    4. Jeffrey Smith, 2022. "Treatment Effect Heterogeneity," Evaluation Review, vol. 46(5), pages 652-677, October.
    5. Esterling, Kevin & Brady, David & Schwitzgebel, Eric, 2021. "The Necessity of Construct and External Validity for Generalized Causal Claims," OSF Preprints 2s8w5, Center for Open Science.
    6. Christopher J. Ruhm, 2019. "Shackling the Identification Police?," Southern Economic Journal, John Wiley & Sons, vol. 85(4), pages 1016-1026, April.
    7. Jörg Peters & Jörg Langbein & Gareth Roberts, 2018. "Generalization in the Tropics – Development Policy, Randomized Controlled Trials, and External Validity," The World Bank Research Observer, World Bank, vol. 33(1), pages 34-64.
    8. Adesina, Adedoyin & Akogun, Oladele & Dillon, Andrew & Friedman, Jed & Njobdi, Sani & Serneels, Pieter, 2017. "Robustness and External Validity: What do we Learn from Repeated Study Designs over Time?," 2018 Allied Social Sciences Association (ASSA) Annual Meeting, January 5-7, 2018, Philadelphia, Pennsylvania 266292, Agricultural and Applied Economics Association.
    9. Jens Ludwig & Jeffrey R. Kling & Sendhil Mullainathan, 2011. "Mechanism Experiments and Policy Evaluations," Journal of Economic Perspectives, American Economic Association, vol. 25(3), pages 17-38, Summer.
    10. Ashley L. Buchanan & Michael G. Hudgens & Stephen R. Cole & Katie R. Mollan & Paul E. Sax & Eric S. Daar & Adaora A. Adimora & Joseph J. Eron & Michael J. Mugavero, 2018. "Generalizing evidence from randomized trials using inverse probability of sampling weights," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 181(4), pages 1193-1209, October.
    11. Esterling, Kevin M. & Brady, David & Schwitzgebel, Eric, 2023. "The Necessity of Construct and External Validity for Generalized Causal Claims," I4R Discussion Paper Series 18, The Institute for Replication (I4R).
    12. W. Bentley MacLeod, 2017. "Viewpoint: The human capital approach to inference," Canadian Journal of Economics, Canadian Economics Association, vol. 50(1), pages 5-39, February.
    13. Andor, Mark A. & Gerster, Andreas & Peters, Jörg & Schmidt, Christoph M., 2020. "Social Norms and Energy Conservation Beyond the US," Journal of Environmental Economics and Management, Elsevier, vol. 103(C).
    14. Seán M. Muller, 2021. "Evidence for a YETI? A Cautionary Tale from South Africa's Youth Employment Tax Incentive," Development and Change, International Institute of Social Studies, vol. 52(6), pages 1301-1342, November.
    15. Ashis Das & Jed Friedman & Eeshani Kandpal, 2018. "Does involvement of local NGOs enhance public service delivery? Cautionary evidence from a malaria‐prevention program in India," Health Economics, John Wiley & Sons, Ltd., vol. 27(1), pages 172-188, January.
    16. Donald Moynihan, 2018. "A great schism approaching? Towards a micro and macro public administration," Journal of Behavioral Public Administration, Center for Experimental and Behavioral Public Administration, vol. 1(1).
    17. Ralitza Dimova, 2019. "A Debate that Fatigues…: To Randomise or Not to Randomise; What’s the Real Question?," The European Journal of Development Research, Palgrave Macmillan; European Association of Development Research and Training Institutes (EADI), vol. 31(2), pages 163-168, April.
    18. Ankel-Peters, Jörg & Schmidt, Christoph M., 2023. "Rural electrification, the credibility revolution, and the limits of evidence-based policy," Ruhr Economic Papers 1051, RWI - Leibniz-Institut für Wirtschaftsforschung, Ruhr-University Bochum, TU Dortmund University, University of Duisburg-Essen.
    19. Deaton, Angus & Cartwright, Nancy, 2018. "Understanding and misunderstanding randomized controlled trials," Social Science & Medicine, Elsevier, vol. 210(C), pages 2-21.
    20. Naoki Egami & Erin Hartman, 2021. "Covariate selection for generalizing experimental results: Application to a large‐scale development program in Uganda," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 184(4), pages 1524-1548, October.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:wdevel:v:127:y:2020:i:c:s0305750x19304802. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/worlddev.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.