
Learning About Social Programs from Experiments with Random Assignment of Treatments

Author

Listed:
  • Manski, C.F.

Abstract

The importance of social programs to a diverse population creates a legitimate concern that the findings of evaluations be widely credible. The weaker are the assumptions imposed, the more widely credible are the findings. The classical argument for random assignment of treatments is viewed by many as enabling evaluation under weak assumptions, and has generated much interest in the conduct of experiments. But the classical argument does impose assumptions, and there often is good reason to doubt their realism. Some researchers, finding the classical assumptions implausible, impose other assumptions strong enough to identify treatment effects of interest. In contrast, the recent literature examined in this article explores the inferences that may be drawn from experimental data under assumptions weak enough to yield widely credible findings. This literature has two branches. One seeks out notions of treatment effect that are identified when the experimental data are combined with weak assumptions. The canonical finding is that the average treatment effect within some context-specific subpopulation is identified. The other branch specifies a population of a priori interest and seeks to learn about treatment effects in this population. Here the canonical finding is a bound on average treatment effects. The various approaches to the analysis of experiments are complementary from a mathematical perspective, but in tension as guides to evaluation practice. The reader of an evaluation reporting that some social program "works" or has "positive impact" should be careful to ascertain what treatment effect has been estimated and under what assumptions.
(This abstract was borrowed from another version of this item.)

Suggested Citation

  • Manski, C.F., 1995. "Learning About Social Programs from Experiments with Random Assignment of Treatments," Working papers 9505, Wisconsin Madison - Social Systems.
  • Handle: RePEc:att:wimass:9505

    Download full text from publisher

    To our knowledge, this item is not available for download from this source.

    References listed on IDEAS

    1. Nancy Clements & James Heckman & Jeffrey Smith, 1994. "Making the Most Out Of Social Experiments: Reducing the Intrinsic Uncertainty in Evidence from Randomized Trials with an Application to the JTPA Experiment," NBER Technical Working Papers 0149, National Bureau of Economic Research, Inc.
    2. Manski, Charles F., 1992. "Identification Problems In The Social Sciences," SSRI Workshop Series 292716, University of Wisconsin-Madison, Social Systems Research Institute.
    3. Sims,Christopher A. (ed.), 1994. "Advances in Econometrics," Cambridge Books, Cambridge University Press, number 9780521444606.
    4. Manski, C.F., 1990. "The Selection Problem," Working papers 90-12, Wisconsin Madison - Social Systems.
    5. Frank Stafford, 1985. "Income-Maintenance Policy and Work Effort: Learning from Experiments and Labor-Market Studies," NBER Chapters, in: Social Experimentation, pages 95-144, National Bureau of Economic Research, Inc.
    6. James J. Heckman, 1991. "Randomization and Social Policy Evaluation Revisited," NBER Technical Working Papers 0107, National Bureau of Economic Research, Inc.
    7. Alicia H. Munnell, 1987. "Lessons from the income maintenance experiments: an overview," New England Economic Review, Federal Reserve Bank of Boston, issue May, pages 32-45.
    8. Sims,Christopher A. (ed.), 1994. "Advances in Econometrics," Cambridge Books, Cambridge University Press, number 9780521444590.
    9. Jeffrey E. Harris, 1985. "Macroexperiments versus Microexperiments for Health Policy," NBER Chapters, in: Social Experimentation, pages 145-186, National Bureau of Economic Research, Inc.
    10. Heckman, James J. & Robb, Richard Jr., 1985. "Alternative methods for evaluating the impact of interventions : An overview," Journal of Econometrics, Elsevier, vol. 30(1-2), pages 239-267.
    11. Charles F. Manski, 1997. "The Mixing Problem in Programme Evaluation," Review of Economic Studies, Oxford University Press, vol. 64(4), pages 537-553.
    12. V. Joseph Hotz & Seth Sanders, "undated". "Bounding Treatment Effects in Controlled and Natural Experiments Subject to Post-Randomization Treatment Choice," University of Chicago - Population Research Center 94-2, Chicago - Population Research Center.
    13. Aaron, Henry J, 1989. "Politics and the Professors Revisited," American Economic Review, American Economic Association, vol. 79(2), pages 1-15, May.

    Citations



    Cited by:

    1. Meyer, Bruce D, 1995. "Natural and Quasi-experiments in Economics," Journal of Business & Economic Statistics, American Statistical Association, vol. 13(2), pages 151-161, April.
    2. Daniel Friedlander & Philip K. Robins, 1997. "The Distributional Impacts of Social Programs," Evaluation Review, vol. 21(5), pages 531-553, October.
    3. Levitt, Steven D. & List, John A., 2009. "Field experiments in economics: The past, the present, and the future," European Economic Review, Elsevier, vol. 53(1), pages 1-18, January.
    4. Mark Schreiner, 2001. "Evaluation and Microenterprise Programs," Development and Comp Systems 0108002, University Library of Munich, Germany, revised 27 Dec 2001.
    5. Peter Glick, 2005. "Scaling Up HIV Voluntary Counseling and Testing in Africa," Evaluation Review, vol. 29(4), pages 331-357, August.
    6. Cristian Aedo, "undated". "The Impact of Training Policies in Latin America and the Caribbean: The Case of 'Programa Joven'," ILADES-UAH Working Papers inv131, Universidad Alberto Hurtado/School of Economics and Business.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Charles F. Manski & John Newman & John V. Pepper, "undated". "Using Performance Standards to Evaluate Social Programs with Incomplete Outcome Data: General Issues and Application to a Higher Education Block Grant Program," IPR working papers 00-1, Institute for Policy Research at Northwestern University.
    2. John Fitzgerald & Peter Gottschalk & Robert Moffitt, 1998. "An Analysis of Sample Attrition in Panel Data: The Michigan Panel Study of Income Dynamics," Journal of Human Resources, University of Wisconsin Press, vol. 33(2), pages 251-299.
    3. Horowitz, Joel L. & Manski, Charles F., 1998. "Censoring of outcomes and regressors due to survey nonresponse: Identification and estimation using weights and imputations," Journal of Econometrics, Elsevier, vol. 84(1), pages 37-58, May.
    4. Claudio Lucifora & Dominique Meurs, 2006. "The Public Sector Pay Gap In France, Great Britain And Italy," Review of Income and Wealth, International Association for Research in Income and Wealth, vol. 52(1), pages 43-59, March.
    5. Markus Gangl & Thomas A. DiPrete, 2004. "Kausalanalyse durch Matchingverfahren," Discussion Papers of DIW Berlin 401, DIW Berlin, German Institute for Economic Research.
    6. Manski, Charles F., 2000. "Identification problems and decisions under ambiguity: Empirical analysis of treatment response and normative analysis of treatment choice," Journal of Econometrics, Elsevier, vol. 95(2), pages 415-442, April.
    7. Charles F. Manski & John V. Pepper, 2000. "Monotone Instrumental Variables, with an Application to the Returns to Schooling," Econometrica, Econometric Society, vol. 68(4), pages 997-1012, July.
    8. Charles F. Manski, 1999. "Statistical Treatment Rules for Heterogeneous Populations: With Application to Randomized Experiments," NBER Technical Working Papers 0242, National Bureau of Economic Research, Inc.
    9. Richard V. Burkhauser & Shuaizhang Feng & Stephen P. Jenkins, 2009. "Using The P90/P10 Index To Measure U.S. Inequality Trends With Current Population Survey Data: A View From Inside The Census Bureau Vaults," Review of Income and Wealth, International Association for Research in Income and Wealth, vol. 55(1), pages 166-185, March.
    10. Manski, Charles, 1994. "Simultaneity with Downward Sloping Demand," SFB 373 Discussion Papers 1994,29, Humboldt University of Berlin, Interdisciplinary Research Project 373: Quantification and Simulation of Economic Processes.
    11. Lechner, Michael & Vazquez-Alvarez, Rosalia, 2003. "The Effect of Disability on Labour Market Outcomes in Germany: Evidence from Matching," IZA Discussion Papers 967, Institute of Labor Economics (IZA).
    12. Claudia Olivetti & Barbara Petrongolo, 2008. "Unequal Pay or Unequal Employment? A Cross-Country Analysis of Gender Gaps," Journal of Labor Economics, University of Chicago Press, vol. 26(4), pages 621-654, October.
    13. James J. Heckman & Edward J. Vytlacil, 2000. "Instrumental Variables, Selection Models, and Tight Bounds on the Average Treatment Effect," NBER Technical Working Papers 0259, National Bureau of Economic Research, Inc.
    14. Richard Blundell & Amanda Gosling & Hidehiko Ichimura & Costas Meghir, 2007. "Changes in the Distribution of Male and Female Wages Accounting for Employment Composition Using Bounds," Econometrica, Econometric Society, vol. 75(2), pages 323-363, March.
    15. John V. Pepper, 1999. "What Do Welfare-to-Work Demonstrations Reveal to Welfare Reformers?," Virginia Economics Online Papers 317, University of Virginia, Department of Economics.
    16. Lewbel, Arthur, 2007. "Endogenous selection or treatment model estimation," Journal of Econometrics, Elsevier, vol. 141(2), pages 777-806, December.
    17. Michael Lechner & Blaise Melly, 2007. "Earnings Effects of Training Programs," University of St. Gallen Department of Economics working paper series 2007 2007-28, Department of Economics, University of St. Gallen.
    18. Charles F. Manski & John Newman & John V. Pepper, 2002. "Using Performance Standards to Evaluate Social Programs with Incomplete Outcome Data," Evaluation Review, vol. 26(4), pages 355-381, August.
    19. Mingliang Li & Dale J. Poirier & Justin L. Tobias, 2004. "Do dropouts suffer from dropping out? Estimation and prediction of outcome gains in generalized selection models," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 19(2), pages 203-225.
    20. Mirko Draca & Stephen Machin, 2015. "Crime and Economic Incentives," Annual Review of Economics, Annual Reviews, vol. 7(1), pages 389-408, August.

    More about this item

    Keywords

    SOCIAL POLICY;



    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.