
Instruments, randomization, and learning about development

  • Angus Deaton

    (Princeton University)

There is currently much debate about the effectiveness of foreign aid and about what kind of projects can engender economic development. There is skepticism about the ability of econometric analysis to resolve these issues, or of development agencies to learn from their own experience. In response, there is increasing use in development economics of randomized controlled trials (RCTs) to accumulate credible knowledge of what works, without over-reliance on questionable theory or statistical methods. When RCTs are not possible, the proponents of these methods advocate quasi-randomization through instrumental variable (IV) techniques or natural experiments. I argue that many of these applications are unlikely to recover quantities that are useful for policy or understanding: two key issues are the misunderstanding of exogeneity and the handling of heterogeneity. I illustrate from the literature on aid and growth. Actual randomization faces problems similar to those of quasi-randomization, notwithstanding rhetoric to the contrary. I argue that experiments have no special ability to produce more credible knowledge than other methods, and that actual experiments are frequently subject to practical problems that undermine any claims to statistical or epistemic superiority. I illustrate using prominent experiments in development and elsewhere. As with IV methods, RCT-based evaluation of projects, without guidance from an understanding of underlying mechanisms, is unlikely to lead to scientific progress in the understanding of economic development. I welcome recent trends in development experimentation away from the evaluation of projects and towards the evaluation of theoretical mechanisms.
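To make the heterogeneity issue in the abstract concrete, here is the standard local average treatment effect (LATE) result of Imbens and Angrist (reference 2 below), offered as an illustrative sketch rather than a summary of the paper's own derivations. With a binary instrument Z, a binary treatment D, and potential outcomes Y(1), Y(0), and under instrument independence and monotonicity, the IV (Wald) estimand identifies

  \frac{E[Y \mid Z=1] - E[Y \mid Z=0]}{E[D \mid Z=1] - E[D \mid Z=0]} \;=\; E\!\left[\,Y(1) - Y(0) \;\middle|\; D(1) > D(0)\,\right],

the average effect for compliers only, that is, for those whose treatment status is shifted by the instrument. When treatment effects are heterogeneous, this quantity generally differs from the population average effect E[Y(1) - Y(0)] that a policymaker scaling up a program would need, which is one form of the paper's claim that such applications may fail to recover policy-relevant quantities.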

File URL: http://www.princeton.edu/rpds/papers/Deaton_Instruments_randomization_learning_all_04April_2010.pdf
Download Restriction: no

Paper provided by Princeton University, Woodrow Wilson School of Public and International Affairs, Research Program in Development Studies, in its series Working Papers, number 1224.

Date of creation: Mar 2010
Handle: RePEc:pri:rpdevs:deaton_instruments_randomization_learning_all_04april_2010
Contact details of provider: Postal: 208 Fisher Hall, Princeton, NJ 08544
Phone: (609) 258-6403
Fax: (609) 258-5974
Web page: http://www.princeton.edu/%7Erpds/index.html

References listed on IDEAS

  1. Caroline M. Hoxby, 2000. "Does Competition among Public Schools Benefit Students and Taxpayers?," American Economic Review, American Economic Association, vol. 90(5), pages 1209-1238, December.
  2. Joshua D. Angrist & Guido W. Imbens, 1995. "Identification and Estimation of Local Average Treatment Effects," NBER Technical Working Papers 0118, National Bureau of Economic Research, Inc.
  3. Patrick Guillaumont & Lisa Chauvet, 1999. "Aid and Performance: A Reassessment," Working Papers 199910, CERDI.
  4. James Heckman, 1997. "Instrumental Variables: A Study of Implicit Behavioral Assumptions Used in Making Program Evaluations," Journal of Human Resources, University of Wisconsin Press, vol. 32(3), pages 441-462.
  5. William Easterly, 2008. "Can the West Save Africa?," NBER Working Papers 14363, National Bureau of Economic Research, Inc.
  6. James J. Heckman, 2000. "Causal Parameters And Policy Analysis In Economics: A Twentieth Century Retrospective," The Quarterly Journal of Economics, MIT Press, vol. 115(1), pages 45-97, February.
  7. Richard Blundell & Monica Costa Dias, 2008. "Alternative approaches to evaluation in empirical microeconomics," CeMMAP working papers CWP26/08, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
  8. Lisa Sanbonmatsu & Jeffrey R. Kling & Greg J. Duncan & Jeanne Brooks-Gunn, 2006. "Neighborhoods and Academic Achievement: Results from the Moving to Opportunity Experiment," NBER Working Papers 11909, National Bureau of Economic Research, Inc.
  9. McCrary, Justin, 2008. "Manipulation of the running variable in the regression discontinuity design: A density test," Journal of Econometrics, Elsevier, vol. 142(2), pages 698-714, February.
  10. Joshua Angrist & Jörn-Steffen Pischke, 2010. "The Credibility Revolution in Empirical Economics: How Better Research Design is Taking the Con out of Econometrics," NBER Working Papers 15794, National Bureau of Economic Research, Inc.
  11. Levitt, Steven D. & List, John A., 2009. "Field experiments in economics: The past, the present, and the future," European Economic Review, Elsevier, vol. 53(1), pages 1-18, January.
  12. Angrist, Joshua D, 1990. "Lifetime Earnings and the Vietnam Era Draft Lottery: Evidence from Social Security Administrative Records," American Economic Review, American Economic Association, vol. 80(3), pages 313-336, June.
  13. Raghuram G. Rajan & Arvind Subramanian, 2008. "Aid and Growth: What Does the Cross-Country Evidence Really Show?," The Review of Economics and Statistics, MIT Press, vol. 90(4), pages 643-665, November.
  14. John A. List, 2007. "Field Experiments: A Bridge Between Lab and Naturally-Occurring Data," NBER Working Papers 12992, National Bureau of Economic Research, Inc.
  15. repec:feb:artefa:0087 is not listed on IDEAS
  16. Marianne Bertrand & Dean Karlan & Sendhil Mullainathan & Eldar Shafir & Jonathan Zinman, 2010. "What's Advertising Content Worth? Evidence from a Consumer Credit Marketing Field Experiment," The Quarterly Journal of Economics, MIT Press, vol. 125(1), pages 263-305, February.
  17. James J. Heckman & Jeffrey A. Smith, 1995. "Assessing the Case for Social Experiments," Journal of Economic Perspectives, American Economic Association, vol. 9(2), pages 85-110, Spring.
  18. Angrist, Joshua D, 1990. "Lifetime Earnings and the Vietnam Era Draft Lottery: Evidence from Social Security Administrative Records: Errata," American Economic Review, American Economic Association, vol. 80(5), pages 1284-1286, December.
  19. Christopher A. Sims, 2010. "But Economics Is Not an Experimental Science," Journal of Economic Perspectives, American Economic Association, vol. 24(2), pages 59-68, Spring.
  20. Kenneth I. Wolpin & Petra E. Todd, 2006. "Assessing the Impact of a School Subsidy Program in Mexico: Using a Social Experiment to Validate a Dynamic Behavioral Model of Child Schooling and Fertility," American Economic Review, American Economic Association, vol. 96(5), pages 1384-1417, December.
  21. Suresh de Mel & David McKenzie & Christopher Woodruff, 2008. "Returns to Capital in Microenterprises: Evidence from a Field Experiment," The Quarterly Journal of Economics, MIT Press, vol. 123(4), pages 1329-1372, November.
  22. Boone, Peter, 1996. "Politics and the effectiveness of foreign aid," European Economic Review, Elsevier, vol. 40(2), pages 289-329, February.
  23. Charles F. Manski, 1996. "Learning about Treatment Effects from Experiments with Random Assignment of Treatments," Journal of Human Resources, University of Wisconsin Press, vol. 31(4), pages 709-733.
  24. Robert J. Barro, 1995. "Inflation and Economic Growth," NBER Working Papers 5326, National Bureau of Economic Research, Inc.
  25. R. Lensink & H. White, 2001. "Are There Negative Returns to Aid?," Journal of Development Studies, Taylor & Francis Journals, vol. 37(6), pages 42-65.
  26. Edward Miguel & Shanker Satyanath & Ernest Sergenti, 2004. "Economic Shocks and Civil Conflict: An Instrumental Variables Approach," Journal of Political Economy, University of Chicago Press, vol. 112(4), pages 725-753, August.

This information is provided to you by IDEAS at the Research Division of the Federal Reserve Bank of St. Louis using RePEc data.