
Learning about social programs from experiments with random assignment of treatments

  • C. F. Manski

The importance of social programs to a diverse population creates a legitimate concern that the findings of evaluations be widely credible. The weaker are the assumptions imposed, the more widely credible are the findings. The classical argument for random assignment of treatments is viewed by many as enabling evaluation under weak assumptions, and has generated much interest in the conduct of experiments. But the classical argument does impose assumptions, and there often is good reason to doubt their realism. Some researchers, finding the classical assumptions implausible, impose other assumptions strong enough to identify treatment effects of interest. In contrast, the recent literature examined in this article explores the inferences that may be drawn from experimental data under assumptions weak enough to yield widely credible findings. This literature has two branches. One seeks out notions of treatment effect that are identified when the experimental data are combined with weak assumptions. The canonical finding is that the average treatment effect within some context-specific subpopulation is identified. The other branch specifies a population of a priori interest and seeks to learn about treatment effects in this population. Here the canonical finding is a bound on average treatment effects. The various approaches to the analysis of experiments are complementary from a mathematical perspective, but in tension as guides to evaluation practice. The reader of an evaluation reporting that some social program "works" or has "positive impact" should be careful to ascertain what treatment effect has been estimated and under what assumptions.
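
The two canonical results mentioned in the abstract can be sketched in standard potential-outcomes notation; this is a minimal illustration in my own notation, not the paper's. Suppose assignment z in {0,1} is randomized but subjects may decline treatment, d denotes the treatment actually received, and the outcome y is normalized to lie in [0,1]. Under the usual instrumental-variable monotonicity assumption, the average treatment effect within the context-specific subpopulation of "compliers" is identified by the ratio

\[ \frac{E[y \mid z=1] - E[y \mid z=0]}{P(d=1 \mid z=1) - P(d=1 \mid z=0)}. \]

By contrast, for the mean outcome under treatment in a population specified a priori, letting the unobserved outcomes of those who decline treatment take their extreme values 0 and 1 yields only the worst-case bound

\[ E[y \mid z=1, d=1]\,P(d=1 \mid z=1) \;\le\; E[y(1)] \;\le\; E[y \mid z=1, d=1]\,P(d=1 \mid z=1) + P(d=0 \mid z=1), \]

with an analogous bound for E[y(0)]. Differencing the two bounds the average treatment effect E[y(1)] - E[y(0)] to an interval of width P(d=0 | z=1) + P(d=0 | z=0), which illustrates the trade-off the abstract describes: a point-identified effect for a subpopulation defined by the data, versus a bound for the population of a priori interest.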



This paper is provided by the University of Wisconsin Institute for Research on Poverty in its series Institute for Research on Poverty Discussion Papers, number 1061-95.


Handle: RePEc:wop:wispod:1061-95
Contact details of provider:
Postal: 3412 Social Science Building, 1180 Observatory Drive, Madison, WI 53706
Phone: (608) 262-6358
Fax: (608) 265-3119


References listed on IDEAS

  1. Aaron, Henry J., 1989. "Politics and the Professors Revisited," American Economic Review, American Economic Association, vol. 79(2), pages 1-15, May.
  2. Manski, Charles F., 1993. "The Mixing Problem in Program Evaluation," NBER Technical Working Papers 0148, National Bureau of Economic Research, Inc.
  3. Munnell, Alicia H., 1987. "Lessons from the income maintenance experiments: an overview," New England Economic Review, Federal Reserve Bank of Boston, issue May, pages 32-45.
  4. Hotz, V. Joseph & Sanders, Seth, undated. "Bounding Treatment Effects in Controlled and Natural Experiments Subject to Post-Randomization Treatment Choice," University of Chicago - Population Research Center 94-2, Chicago - Population Research Center.
  5. Harris, Jeffrey E., 1985. "Macroexperiments versus Microexperiments for Health Policy," NBER Chapters, in: Social Experimentation, pages 145-186, National Bureau of Economic Research, Inc.
  6. Heckman, James J. & Robb, Richard Jr., 1985. "Alternative methods for evaluating the impact of interventions: An overview," Journal of Econometrics, Elsevier, vol. 30(1-2), pages 239-267.
  7. repec:att:wimass:9217 (not matched with an item listed on IDEAS).
  8. Manski, Charles F., 1990. "The Selection Problem," Working papers 90-12, Wisconsin Madison - Social Systems.
  9. Clements, Nancy & Heckman, James & Smith, Jeffrey, 1994. "Making the Most Out of Social Experiments: Reducing the Intrinsic Uncertainty in Evidence from Randomized Trials with an Application to the JTPA Exp," NBER Technical Working Papers 0149, National Bureau of Economic Research, Inc.
  10. Heckman, James J., 1991. "Randomization and Social Policy Evaluation," NBER Technical Working Papers 0107, National Bureau of Economic Research, Inc.
  11. Stafford, Frank, 1985. "Income-Maintenance Policy and Work Effort: Learning from Experiments and Labor-Market Studies," NBER Chapters, in: Social Experimentation, pages 95-144, National Bureau of Economic Research, Inc.


This information is provided to you by IDEAS at the Research Division of the Federal Reserve Bank of St. Louis using RePEc data.