Printed from https://ideas.repec.org/p/pri/rpdevs/deaton_instruments_randomization_learning_all_04april_2010.pdf.html

Instruments, randomization, and learning about development

Author

Listed:
  • Angus Deaton

    (Princeton University)

Abstract

There is currently much debate about the effectiveness of foreign aid and about what kind of projects can engender economic development. There is skepticism about the ability of econometric analysis to resolve these issues, or of development agencies to learn from their own experience. In response, there is increasing use in development economics of randomized controlled trials (RCTs) to accumulate credible knowledge of what works, without over-reliance on questionable theory or statistical methods. When RCTs are not possible, the proponents of these methods advocate quasi-randomization through instrumental variable (IV) techniques or natural experiments. I argue that many of these applications are unlikely to recover quantities that are useful for policy or understanding: two key issues are the misunderstanding of exogeneity, and the handling of heterogeneity. I illustrate from the literature on aid and growth. Actual randomization faces similar problems as does quasi-randomization, notwithstanding rhetoric to the contrary. I argue that experiments have no special ability to produce more credible knowledge than other methods, and that actual experiments are frequently subject to practical problems that undermine any claims to statistical or epistemic superiority. I illustrate using prominent experiments in development and elsewhere. As with IV methods, RCT-based evaluation of projects, without guidance from an understanding of underlying mechanisms, is unlikely to lead to scientific progress in the understanding of economic development. I welcome recent trends in development experimentation away from the evaluation of projects and towards the evaluation of theoretical mechanisms.

Suggested Citation

  • Angus Deaton, 2010. "Instruments, randomization, and learning about development," Working Papers 1224, Princeton University, Woodrow Wilson School of Public and International Affairs, Research Program in Development Studies.
  • Handle: RePEc:pri:rpdevs:deaton_instruments_randomization_learning_all_04april_2010.pdf

    Download full text from publisher

    File URL: https://rpds.princeton.edu/sites/rpds/files/media/deaton_instruments_randomization_and_learning_about_development_jel.pdf
    Download Restriction: no


    References listed on IDEAS

    1. Esther Duflo, 2005. "Monitoring Works: Getting Teachers to Come to School," Working Papers id:301, eSocialSciences.
    2. Joshua D. Angrist & Jörn-Steffen Pischke, 2010. "The Credibility Revolution in Empirical Economics: How Better Research Design Is Taking the Con out of Econometrics," Journal of Economic Perspectives, American Economic Association, vol. 24(2), pages 3-30, Spring.
    3. James J. Heckman & Edward J. Vytlacil, 2007. "Econometric Evaluation of Social Programs, Part II: Using the Marginal Treatment Effect to Organize Alternative Econometric Estimators to Evaluate Social Programs, and to Forecast their Effects in New," Handbook of Econometrics, in: J.J. Heckman & E.E. Leamer (ed.), Handbook of Econometrics, edition 1, volume 6, chapter 71, Elsevier.
    4. Robert J. Barro, 2013. "Inflation and Economic Growth," Annals of Economics and Finance, Society for AEF, vol. 14(1), pages 121-144, May.
    5. Robert J. Barro, 1998. "Determinants of Economic Growth: A Cross-Country Empirical Study," MIT Press Books, The MIT Press, edition 1, volume 1, number 0262522543, December.
    6. Richard Blundell & Monica Costa Dias, 2009. "Alternative Approaches to Evaluation in Empirical Microeconomics," Journal of Human Resources, University of Wisconsin Press, vol. 44(3).
    7. Caroline M. Hoxby, 2000. "Does Competition among Public Schools Benefit Students and Taxpayers?," American Economic Review, American Economic Association, vol. 90(5), pages 1209-1238, December.
    8. Lisa Sanbonmatsu & Jeffrey R. Kling & Greg J. Duncan & Jeanne Brooks-Gunn, 2006. "Neighborhoods and Academic Achievement: Results from the Moving to Opportunity Experiment," Journal of Human Resources, University of Wisconsin Press, vol. 41(4).
    9. P. Guillaumont & L. Chauvet, 2001. "Aid and Performance: A Reassessment," Journal of Development Studies, Taylor & Francis Journals, vol. 37(6), pages 66-92.
    10. James J. Heckman & Jeffrey A. Smith, 1995. "Assessing the Case for Social Experiments," Journal of Economic Perspectives, American Economic Association, vol. 9(2), pages 85-110, Spring.
    11. Joshua D. Angrist, 1990. "Lifetime Earnings and the Vietnam Era Draft Lottery: Evidence from Social Security Administrative Records," American Economic Review, American Economic Association, vol. 80(3), pages 313-336, June.
    12. John A. List, 2007. "Field Experiments: A Bridge between Lab and Naturally Occurring Data," The B.E. Journal of Economic Analysis & Policy, De Gruyter, vol. 5(2), pages 1-47, April.
    13. Xavier Giné & Dean Karlan & Jonathan Zinman, 2010. "Put Your Money Where Your Butt Is: A Commitment Contract for Smoking Cessation," American Economic Journal: Applied Economics, American Economic Association, vol. 2(4), pages 213-235, October.
    14. James J. Heckman & Sergio Urzúa, 2010. "Comparing IV with structural models: What simple IV can and cannot identify," Journal of Econometrics, Elsevier, vol. 156(1), pages 27-37, May.
    15. R. Lensink & H. White, 2001. "Are There Negative Returns to Aid?," Journal of Development Studies, Taylor & Francis Journals, vol. 37(6), pages 42-65.
    16. O. Ashenfelter & D. Card (ed.), 1999. "Handbook of Labor Economics," Handbook of Labor Economics, Elsevier, edition 1, volume 3, number 3.
    17. Kenneth I. Wolpin & Petra E. Todd, 2006. "Assessing the Impact of a School Subsidy Program in Mexico: Using a Social Experiment to Validate a Dynamic Behavioral Model of Child Schooling and Fertility," American Economic Review, American Economic Association, vol. 96(5), pages 1384-1417, December.
    18. John A. List & Imran Rasul, 2011. "Field Experiments in Labor Economics," Handbook of Labor Economics, in: O. Ashenfelter & D. Card (ed.), Handbook of Labor Economics, edition 1, volume 4, chapter 2, pages 103-228, Elsevier.
    19. Guido W. Imbens & Joshua D. Angrist, 1994. "Identification and Estimation of Local Average Treatment Effects," Econometrica, Econometric Society, vol. 62(2), pages 467-475, March.
    20. Paul Glewwe & Michael Kremer & Sylvie Moulin & Eric Zitzewitz, 2004. "Retrospective vs. prospective analyses of school inputs: the case of flip charts in Kenya," Journal of Development Economics, Elsevier, vol. 74(1), pages 251-268, June.
    21. James J. Heckman & Sergio Urzua & Edward Vytlacil, 2006. "Understanding Instrumental Variables in Models with Essential Heterogeneity," The Review of Economics and Statistics, MIT Press, vol. 88(3), pages 389-432, August.
    22. William Easterly, 2009. "Can the West Save Africa?," Journal of Economic Literature, American Economic Association, vol. 47(2), pages 373-447, June.
    23. Michael A. Clemens & Steven Radelet & Rikhil Bhavnani, 2004. "Counting chickens when they hatch: The short-term effect of aid on growth," International Finance 0407010, University Library of Munich, Germany.
    24. Raghuram G. Rajan & Arvind Subramanian, 2008. "Aid and Growth: What Does the Cross-Country Evidence Really Show?," The Review of Economics and Statistics, MIT Press, vol. 90(4), pages 643-665, November.
    25. Peter C. Reiss & Frank A. Wolak, 2007. "Structural Econometric Modeling: Rationales and Examples from Industrial Organization," Handbook of Econometrics, in: J.J. Heckman & E.E. Leamer (ed.), Handbook of Econometrics, edition 1, volume 6, chapter 64, Elsevier.
    26. Suresh de Mel & David McKenzie & Christopher Woodruff, 2009. "Returns to Capital in Microenterprises: Evidence from a Field Experiment," The Quarterly Journal of Economics, Oxford University Press, vol. 124(1), pages 423-423.
    27. Guido W. Imbens, 2010. "Better LATE Than Nothing: Some Comments on Deaton (2009) and Heckman and Urzua (2009)," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 399-423, June.
    28. William Easterly (ed.), 2008. "Reinventing Foreign Aid," MIT Press Books, The MIT Press, edition 1, volume 1, number 0262550660, December.
    29. Edward E. Leamer, 1985. "Vector autoregressions for causal inference?," Carnegie-Rochester Conference Series on Public Policy, Elsevier, vol. 22(1), pages 255-304, January.
    30. Joshua D. Angrist, 1990. "Lifetime Earnings and the Vietnam Era Draft Lottery: Evidence from Social Security Administrative Records: Errata," American Economic Review, American Economic Association, vol. 80(5), pages 1284-1286, December.
    31. repec:feb:artefa:0087 is not listed on IDEAS
    32. Justin McCrary, 2008. "Manipulation of the running variable in the regression discontinuity design: A density test," Journal of Econometrics, Elsevier, vol. 142(2), pages 698-714, February.
    33. Charles F. Manski, 1996. "Learning about Treatment Effects from Experiments with Random Assignment of Treatments," Journal of Human Resources, University of Wisconsin Press, vol. 31(4), pages 709-733.
    34. Steven D. Levitt & John A. List, 2009. "Field experiments in economics: The past, the present, and the future," European Economic Review, Elsevier, vol. 53(1), pages 1-18, January.
    35. Edward Miguel & Michael Kremer, 2004. "Worms: Identifying Impacts on Education and Health in the Presence of Treatment Externalities," Econometrica, Econometric Society, vol. 72(1), pages 159-217, January.
    36. Peter Boone, 1996. "Politics and the effectiveness of foreign aid," European Economic Review, Elsevier, vol. 40(2), pages 289-329, February.
    37. Dean S. Karlan & Jonathan Zinman, 2008. "Credit Elasticities in Less-Developed Economies: Implications for Microfinance," American Economic Review, American Economic Association, vol. 98(3), pages 1040-1068, June.
    38. James J. Heckman, 2000. "Causal Parameters and Policy Analysis in Economics: A Twentieth Century Retrospective," The Quarterly Journal of Economics, Oxford University Press, vol. 115(1), pages 45-97.
    39. James Heckman, 1997. "Instrumental Variables: A Study of Implicit Behavioral Assumptions Used in Making Program Evaluations," Journal of Human Resources, University of Wisconsin Press, vol. 32(3), pages 441-462.
    40. Christopher A. Sims, 2010. "But Economics Is Not an Experimental Science," Journal of Economic Perspectives, American Economic Association, vol. 24(2), pages 59-68, Spring.
    41. Edward Miguel & Shanker Satyanath & Ernest Sergenti, 2004. "Economic Shocks and Civil Conflict: An Instrumental Variables Approach," Journal of Political Economy, University of Chicago Press, vol. 112(4), pages 725-753, August.
    42. Joshua D. Angrist & Victor Lavy, 1999. "Using Maimonides' Rule to Estimate the Effect of Class Size on Scholastic Achievement," The Quarterly Journal of Economics, Oxford University Press, vol. 114(2), pages 533-575.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Angus S. Deaton, 2009. "Instruments of development: Randomization in the tropics, and the search for the elusive keys to economic development," NBER Working Papers 14690, National Bureau of Economic Research, Inc.
    2. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    3. Guido W. Imbens, 2010. "Better LATE Than Nothing: Some Comments on Deaton (2009) and Heckman and Urzua (2009)," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 399-423, June.
    4. James J. Heckman, 2010. "Building Bridges between Structural and Program Evaluation Approaches to Evaluating Policy," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 356-398, June.
    5. Peter Hull & Michal Kolesár & Christopher Walters, 2022. "Labor by design: contributions of David Card, Joshua Angrist, and Guido Imbens," Scandinavian Journal of Economics, Wiley Blackwell, vol. 124(3), pages 603-645, July.
    6. Joshua D. Angrist & Jörn-Steffen Pischke, 2010. "The Credibility Revolution in Empirical Economics: How Better Research Design Is Taking the Con out of Econometrics," Journal of Economic Perspectives, American Economic Association, vol. 24(2), pages 3-30, Spring.
    7. Thomas Cornelissen & Christian Dustmann & Anna Raute & Uta Schönberg, 2016. "From LATE to MTE: Alternative methods for the evaluation of policy interventions," Labour Economics, Elsevier, vol. 41(C), pages 47-60.
    8. Nobel Prize Committee, 2021. "Answering causal questions using observational data," Nobel Prize in Economics documents 2021-2, Nobel Prize Committee.
    9. Sebastian Galiani & Juan Pantano, 2021. "Structural Models: Inception and Frontier," NBER Working Papers 28698, National Bureau of Economic Research, Inc.
    10. Dionissi Aliprantis, 2013. "Covariates and causal effects: the problem of context," Working Papers (Old Series) 1310, Federal Reserve Bank of Cleveland.
    11. Martin Huber & Kaspar Wüthrich, 2019. "Local Average and Quantile Treatment Effects Under Endogeneity: A Review," Journal of Econometric Methods, De Gruyter, vol. 8(1), pages 1-27, January.
    12. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    13. Esther Duflo & Rachel Glennerster & Michael Kremer, 2008. "Using Randomization in Development Economics Research: A Toolkit," Handbook of Development Economics, in: T. Paul Schultz & John A. Strauss (ed.), Handbook of Development Economics, edition 1, volume 4, chapter 61, pages 3895-3962, Elsevier.
    14. Abhijit V. Banerjee & Esther Duflo, 2009. "The Experimental Approach to Development Economics," Annual Review of Economics, Annual Reviews, vol. 1(1), pages 151-178, May.
    15. Dionissi Aliprantis, 2011. "Assessing the evidence on neighborhood effects from moving to opportunity," Working Papers (Old Series) 1101, Federal Reserve Bank of Cleveland.
    16. Hunt Allcott, 2012. "Site Selection Bias in Program Evaluation," NBER Working Papers 18373, National Bureau of Economic Research, Inc.
    17. William Easterly, 2009. "Can the West Save Africa?," Journal of Economic Literature, American Economic Association, vol. 47(2), pages 373-447, June.
    18. T. Paul Schultz, 2010. "Population and Health Policies," Handbook of Development Economics, in: Dani Rodrik & Mark Rosenzweig (ed.), Handbook of Development Economics, edition 1, volume 5, chapter 0, pages 4785-4881, Elsevier.
    19. John DiNardo & David S. Lee, 2010. "Program Evaluation and Research Designs," Working Papers 1228, Princeton University, Department of Economics, Industrial Relations Section.
    20. Guido W. Imbens, 2020. "Potential Outcome and Directed Acyclic Graph Approaches to Causality: Relevance for Empirical Practice in Economics," Journal of Economic Literature, American Economic Association, vol. 58(4), pages 1129-1179, December.

    More about this item

    Keywords

    Randomized controlled trials; mechanisms; instrumental variables; development; foreign aid; growth; poverty reduction.

    JEL classification:

    • C01 - Mathematical and Quantitative Methods - - General - - - Econometrics
    • C80 - Mathematical and Quantitative Methods - - Data Collection and Data Estimation Methodology; Computer Programs - - - General
    • D60 - Microeconomics - - Welfare Economics - - - General
    • F35 - International Economics - - International Finance - - - Foreign Aid
    • I32 - Health, Education, and Welfare - - Welfare, Well-Being, and Poverty - - - Measurement and Analysis of Poverty


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:pri:rpdevs:deaton_instruments_randomization_learning_all_04april_2010.pdf. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Bobray Bordelon (email available below). General contact details of provider: https://edirc.repec.org/data/rpprius.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.