
What Do Welfare-to-Work Demonstrations Reveal to Welfare Reformers?

Author

  • John V. Pepper

Abstract

Under the new welfare system, states must design and institute programs that both provide assistance and encourage work, two objectives that have thus far appeared incompatible. Will states meet these new requirements? For many innovative programs, the randomized welfare-to-work experiments conducted over the last three decades may be the only source of observed data. While these experiments yield information on the outcomes of mandated treatments, the new regime permits states and localities much discretion. Using data from four experiments conducted in the mid-1980s, this study examines what welfare-to-work demonstrations reveal about outcomes when the treatments are heterogeneous. In the absence of assumptions, these data allow us to draw only limited inferences about the labor market outcomes of welfare recipients. Combined with prior information, however, data from experimental demonstrations are informative, suggesting either that the long-run federal requirements cannot be met or that these standards will be met only under special circumstances.
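
The identification logic at issue is the "mixing problem" (Manski 1997, listed in the references below): a randomized experiment identifies the distribution of outcomes under each mandated treatment, but once states and localities may assign treatments at their discretion, the population outcome distribution is only partially identified. As a rough aid to intuition, here is a minimal sketch of the resulting no-assumptions bounds for a binary outcome; the employment rates used are hypothetical illustrations, not estimates from the four demonstrations.

    # No-assumptions "mixing problem" bounds (after Manski 1997, cited below).
    # An experiment identifies p_a = P(y(a) = 1) and p_b = P(y(b) = 1), the
    # success rates under each mandated treatment. If an unknown policy then
    # assigns each person to treatment a or b, possibly using information the
    # analyst never observes, only bounds on P(y = 1) are identified.

    def mixing_bounds(p_a: float, p_b: float) -> tuple[float, float]:
        # Worst case: y = 1 only for people who would succeed under *both*
        # treatments, the smallest overlap consistent with the marginals.
        lower = max(0.0, p_a + p_b - 1.0)
        # Best case: y = 1 for everyone who would succeed under *either*
        # treatment, the largest union consistent with the marginals.
        upper = min(1.0, p_a + p_b)
        return lower, upper

    # Hypothetical marginals: 55% employed under mandated job search,
    # 60% employed under mandated training.
    lo, hi = mixing_bounds(0.55, 0.60)
    print(f"P(employed) under a mixed policy lies in [{lo:.2f}, {hi:.2f}]")
    # -> [0.15, 1.00]: the experimental data alone cannot rule out employment
    #    rates far below those achieved under either mandated treatment.

Adding prior information about how treatments are assigned narrows these bounds, which is how the paper makes the demonstration data informative about the federal work requirements.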

Suggested Citation

  • John V. Pepper, 1999. "What Do Welfare-to-Work Demonstrations Reveal to Welfare Reformers?," JCPR Working Papers 105, Northwestern University/University of Chicago Joint Center for Poverty Research.
  • Handle: RePEc:wop:jopovw:105

    Download full text from publisher

    To our knowledge, this item is not available for download. To find out whether it is available, there are three options:
    1. Check below whether another version of this item is available online.
    2. Check on the provider's web page whether it is in fact available.
    3. Search for a similarly titled item that may be available.

    Other versions of this item:

    • John V. Pepper, 2003. "Using Experiments to Evaluate Performance Standards: What Do Welfare-to-Work Demonstrations Reveal to Welfare Reformers?," Journal of Human Resources, University of Wisconsin Press, vol. 38(4).

    References listed on IDEAS

    1. Charles F. Manski, 1992. "Identification Problems In The Social Sciences," SSRI Workshop Series 292716, University of Wisconsin-Madison, Social Systems Research Institute.
    2. James J. Heckman, 1991. "Randomization and Social Policy Evaluation Revisited," NBER Technical Working Papers 0107, National Bureau of Economic Research, Inc.
    3. Charles F. Manski & Daniel S. Nagin, 1995. "Bounding Disagreements About Treatment Effects: A Case Study of Sentencing and Recidivism," Working Papers 9526, University of Wisconsin-Madison, Social Systems Research Institute.
    4. John V. Pepper, 2000. "The Intergenerational Transmission Of Welfare Receipt: A Nonparametric Bounds Analysis," The Review of Economics and Statistics, MIT Press, vol. 82(3), pages 472-488, August.
    5. Dehejia, Rajeev H., 2005. "Program evaluation as a decision problem," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 141-173.
    6. Charles F. Manski, 1996. "Learning about Treatment Effects from Experiments with Random Assignment of Treatments," Journal of Human Resources, University of Wisconsin Press, vol. 31(4), pages 709-733.
    7. V. Joseph Hotz & Guido W. Imbens & Julie H. Mortimer, 1999. "Predicting the Efficacy of Future Training Programs Using Past Experiences," NBER Technical Working Papers 0238, National Bureau of Economic Research, Inc.
    8. Charles F. Manski, 1997. "The Mixing Problem in Programme Evaluation," Review of Economic Studies, Oxford University Press, vol. 64(4), pages 537-553.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Charles F. Manski & John Newman & John V. Pepper, "undated". "Using Performance Standards to Evaluate Social Programs with Incomplete Outcome Data: General Issues and Application to a Higher Education Block Grant Program," IPR working papers 00-1, Institute for Policy Research at Northwestern University.
    2. Oscar Mitnik, 2008. "How do Training Programs Assign Participants to Training? Characterizing the Assignment Rules of Government Agencies for Welfare-to-Work Programs in California," Working Papers 0907, University of Miami, Department of Economics.
    3. Guido W. Imbens, 2003. "Sensitivity to Exogeneity Assumptions in Program Evaluation," American Economic Review, American Economic Association, vol. 93(2), pages 126-132, May.
    4. Robert Lemke & Claus Hoerandner & Robert McMahon, 2006. "Student Assessments, Non-test-takers, and School Accountability," Education Economics, Taylor & Francis Journals, vol. 14(2), pages 235-250.
    5. Jeounghee Kim, 2012. "The Effects of Welfare-to-Work Programs on Welfare Recipients’ Employment Outcomes," Journal of Family and Economic Issues, Springer, vol. 33(1), pages 130-142, March.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Charles F. Manski, 1999. "Statistical Treatment Rules for Heterogeneous Populations: With Application to Randomized Experiments," NBER Technical Working Papers 0242, National Bureau of Economic Research, Inc.
    2. Libertad González, 2005. "Nonparametric bounds on the returns to language skills," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 20(6), pages 771-795.
    3. Guido W. Imbens, 2010. "Better LATE Than Nothing: Some Comments on Deaton (2009) and Heckman and Urzua (2009)," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 399-423, June.
    4. Charles F. Manski, 2000. "Using Studies of Treatment Response to Inform Treatment Choice in Heterogeneous Populations," NBER Technical Working Papers 0263, National Bureau of Economic Research, Inc.
    5. Craig Gundersen & Brent Kreider & John Pepper & Valerie Tarasuk, 2017. "Food assistance programs and food insecurity: implications for Canada in light of the mixing problem," Empirical Economics, Springer, vol. 52(3), pages 1065-1087, May.
    6. C. F. Manski, "undated". "Learning about social programs from experiments with random assignment of treatments," Institute for Research on Poverty Discussion Papers 1061-95, University of Wisconsin Institute for Research on Poverty.
    7. Alberto Abadie & Guido W. Imbens, 2002. "Simple and Bias-Corrected Matching Estimators for Average Treatment Effects," NBER Technical Working Papers 0283, National Bureau of Economic Research, Inc.
    8. Vishal Kamat, 2017. "Identifying the Effects of a Program Offer with an Application to Head Start," Papers 1711.02048, arXiv.org, revised Aug 2023.
    9. Timothy B. Armstrong & Shu Shen, 2013. "Inference on Optimal Treatment Assignments," Cowles Foundation Discussion Papers 1927RR, Cowles Foundation for Research in Economics, Yale University, revised Apr 2015.
    10. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    11. Susan Watkins & Ina Warriner, 2003. "How do we know we need to control for selectivity?," Demographic Research Special Collections, Max Planck Institute for Demographic Research, Rostock, Germany, vol. 1(4), pages 109-142.
    12. John V. Pepper, 2003. "Using Experiments to Evaluate Performance Standards: What Do Welfare-to-Work Demonstrations Reveal to Welfare Reformers?," Journal of Human Resources, University of Wisconsin Press, vol. 38(4).
    13. Jere R. Behrman & Hans-Peter Kohler & Susan Cotts Watkins, 2001. "How can we measure the causal effects of social networks using observational data? Evidence from the diffusion of family planning and AIDS worries in South Nyanza District, Kenya," MPIDR Working Papers WP-2001-022, Max Planck Institute for Demographic Research, Rostock, Germany.
    14. Timothy Christensen & Hyungsik Roger Moon & Frank Schorfheide, 2020. "Robust Forecasting," Papers 2011.03153, arXiv.org, revised Dec 2020.
    15. Rajeev Dehejia, 2000. "Was There a Riverside Miracle? A Framework for Evaluating Multi-Site Programs," NBER Working Papers 7844, National Bureau of Economic Research, Inc.
    16. Markus Gangl & Thomas A. DiPrete, 2004. "Kausalanalyse durch Matchingverfahren," Discussion Papers of DIW Berlin 401, DIW Berlin, German Institute for Economic Research.
    17. Charles F. Manski & John Newman & John V. Pepper, "undated". "Using Performance Standards to Evaluate Social Programs with Incomplete Outcome Data: General Issues and Application to a Higher Education Block Grant Program," IPR working papers 00-1, Institute for Policy Research at Northwestern University.
    18. Charles F. Manski & John Newman & John V. Pepper, 2002. "Using Performance Standards to Evaluate Social Programs with Incomplete Outcome Data," Evaluation Review, , vol. 26(4), pages 355-381, August.
    19. Rubén Hernández-Murillo & John Knowles, 2004. "Racial Profiling Or Racist Policing? Bounds Tests In Aggregate Data," International Economic Review, Department of Economics, University of Pennsylvania and Osaka University Institute of Social and Economic Research Association, vol. 45(3), pages 959-989, August.
    20. Dehejia, Rajeev H., 2005. "Program evaluation as a decision problem," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 141-173.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:wop:jopovw:105. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows us to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Thomas Krichel (email available below). General contact details of provider: https://edirc.repec.org/data/jcuchus.html.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.