
Learning about social programs from experiments with random assignment of treatments

Author

  • C. F. Manski

Abstract

The importance of social programs to a diverse population creates a legitimate concern that the findings of evaluations be widely credible. The weaker are the assumptions imposed, the more widely credible are the findings. The classical argument for random assignment of treatments is viewed by many as enabling evaluation under weak assumptions, and has generated much interest in the conduct of experiments. But the classical argument does impose assumptions, and there often is good reason to doubt their realism. Some researchers, finding the classical assumptions implausible, impose other assumptions strong enough to identify treatment effects of interest. In contrast, the recent literature examined in this article explores the inferences that may be drawn from experimental data under assumptions weak enough to yield widely credible findings. This literature has two branches. One seeks out notions of treatment effect that are identified when the experimental data are combined with weak assumptions. The canonical finding is that the average treatment effect within some context-specific subpopulation is identified. The other branch specifies a population of a priori interest and seeks to learn about treatment effects in this population. Here the canonical finding is a bound on average treatment effects. The various approaches to the analysis of experiments are complementary from a mathematical perspective, but in tension as guides to evaluation practice. The reader of an evaluation reporting that some social program "works" or has "positive impact" should be careful to ascertain what treatment effect has been estimated and under what assumptions.
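
The bounding branch of this literature can be made concrete with a small numerical sketch. The code below is not drawn from the paper itself; it is a minimal illustration, under assumptions chosen for the example (outcomes known to lie in the interval [0, 1], random assignment of treatments, and outcome nonresponse as the only identification problem), of how worst-case bounds on an average treatment effect are computed from observable quantities alone. All function names and the simulated data are hypothetical.

    # Illustrative sketch only: worst-case ("no-assumptions") bounds on an average
    # treatment effect when outcomes are bounded but some are unobserved.
    # Assumptions made for this example (not taken from the paper): outcomes lie in
    # [y_min, y_max], treatments are randomly assigned, and nonresponse is the only
    # identification problem. Names and simulated data are hypothetical.

    import numpy as np

    def mean_outcome_bounds(y, observed, y_min=0.0, y_max=1.0):
        # E[Y] = E[Y | observed] P(observed) + E[Y | missing] P(missing);
        # the data say nothing about E[Y | missing], so it can lie anywhere
        # in [y_min, y_max]. That is the only step taken here.
        p_obs = observed.mean()
        mean_obs = y[observed].mean() if observed.any() else 0.0
        lower = mean_obs * p_obs + y_min * (1.0 - p_obs)
        upper = mean_obs * p_obs + y_max * (1.0 - p_obs)
        return lower, upper

    def ate_bounds(y, treated, observed, y_min=0.0, y_max=1.0):
        # Bound each arm's mean outcome, then take the widest possible difference.
        lo1, hi1 = mean_outcome_bounds(y[treated], observed[treated], y_min, y_max)
        lo0, hi0 = mean_outcome_bounds(y[~treated], observed[~treated], y_min, y_max)
        return lo1 - hi0, hi1 - lo0

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        n = 10_000
        treated = rng.random(n) < 0.5                     # random assignment
        y = rng.binomial(1, np.where(treated, 0.6, 0.4)).astype(float)
        observed = rng.random(n) < 0.8                    # 20% outcome nonresponse
        print("worst-case ATE bounds:", ate_bounds(y, treated, observed))

The width of the resulting interval is roughly the nonresponse rate in each arm times the length of the outcome range, which illustrates the trade-off the abstract describes: weaker assumptions yield more widely credible but less precise conclusions, a bound rather than a point estimate.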

Suggested Citation

  • C. F. Manski, "undated". "Learning about social programs from experiments with random assignment of treatments," Institute for Research on Poverty Discussion Papers 1061-95, University of Wisconsin Institute for Research on Poverty.
  • Handle: RePEc:wop:wispod:1061-95

    Download full text from publisher

    File URL: http://www.irp.wisc.edu/publications/dps/pdfs/dp106195.pdf
    Download Restriction: no


    References listed on IDEAS

    1. Manski, C.F., 1992. "Identification Problems in the Social Sciences," Working papers 9217, Wisconsin Madison - Social Systems.
    2. Manski, C.F., 1990. "The Selection Problem," Working papers 90-12, Wisconsin Madison - Social Systems.
    3. Charles F. Manski, 1997. "The Mixing Problem in Programme Evaluation," Review of Economic Studies, Oxford University Press, vol. 64(4), pages 537-553.
    4. V. Joseph Hotz & Seth Sanders, "undated". "Bounding Treatment Effects in Controlled and Natural Experiments Subject to Post-Randomization Treatment Choice," University of Chicago - Population Research Center 94-2, Chicago - Population Research Center.
    5. Aaron, Henry J, 1989. "Politics and the Professors Revisited," American Economic Review, American Economic Association, vol. 79(2), pages 1-15, May.
    6. Nancy Clements & James Heckman & Jeffrey Smith, 1994. "Making the Most Out Of Social Experiments: Reducing the Intrinsic Uncertainty in Evidence from Randomized Trials with an Application to the JTPA Experiment," NBER Technical Working Papers 0149, National Bureau of Economic Research, Inc.
    7. Frank Stafford, 1985. "Income-Maintenance Policy and Work Effort: Learning from Experiments and Labor-Market Studies," NBER Chapters, in: Social Experimentation, pages 95-144, National Bureau of Economic Research, Inc.
    8. Alicia H. Munnell, 1987. "Lessons from the income maintenance experiments: an overview," New England Economic Review, Federal Reserve Bank of Boston, issue May, pages 32-45.
    9. Jeffrey E. Harris, 1985. "Macroexperiments versus Microexperiments for Health Policy," NBER Chapters, in: Social Experimentation, pages 145-186, National Bureau of Economic Research, Inc.
    10. James J. Heckman, 1991. "Randomization and Social Policy Evaluation," NBER Technical Working Papers 0107, National Bureau of Economic Research, Inc.
    11. Heckman, James J. & Robb, Richard Jr., 1985. "Alternative methods for evaluating the impact of interventions: An overview," Journal of Econometrics, Elsevier, vol. 30(1-2), pages 239-267.

    Citations

    Cited by:

    1. Meyer, Bruce D, 1995. "Natural and Quasi-experiments in Economics," Journal of Business & Economic Statistics, American Statistical Association, vol. 13(2), pages 151-161, April.
    2. Levitt, Steven D. & List, John A., 2009. "Field experiments in economics: The past, the present, and the future," European Economic Review, Elsevier, vol. 53(1), pages 1-18, January.
    3. Mark Schreiner, 2001. "Evaluation and Microenterprise Programs," Development and Comp Systems 0108002, EconWPA, revised 27 Dec 2001.
    4. Cristian Aedo, "undated". "The Impact of Training Policies in Latin America and the Caribbean: The Case of "Programa Joven"," ILADES-Georgetown University Working Papers inv131, ILADES-Georgetown University, Universidad Alberto Hurtado/School of Economics and Business.
