Printed from https://ideas.repec.org/a/sae/evarev/v31y2007i2p95-120.html

Using Propensity Scoring to Estimate Program-Related Subgroup Impacts in Experimental Program Evaluations

Authors

  • Peter Z. Schochet
  • John Burghardt

    (Mathematica Policy Research, Inc.)

Abstract

This article discusses the use of propensity scoring in experimental program evaluations to estimate impacts for subgroups defined by program features and participants' program experiences. The authors discuss estimation issues and provide specification tests. They also discuss the use of an overlooked data collection design—obtaining predictions that program intake staff make about applicants' likely program assignments and experiences—that could improve the quality of matched comparison samples. They demonstrate the effectiveness of this approach in producing credible subgroup findings using data from a large-scale experimental evaluation of Job Corps, the nation's largest federal education and training program for disadvantaged youths.
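The general technique the abstract describes — fit a propensity model predicting a program-related subgroup indicator from baseline data, then use the estimated scores to build a matched comparison sample — can be sketched as follows. This is a minimal illustration on simulated data, not the authors' actual estimator: the covariates, the simple logistic fit, and the nearest-neighbor matching rule are all assumptions made for the example.

```python
import math
import random

random.seed(0)

def simulate(n=200):
    """Simulated baseline records (age, credential, s), where s flags a
    hypothetical program experience (e.g., completing vocational training)."""
    rows = []
    for _ in range(n):
        age = random.uniform(16, 24)
        cred = 1.0 if random.random() < 0.5 else 0.0
        z = -4.0 + 0.2 * age - 0.5 * cred
        s = 1 if random.random() < 1 / (1 + math.exp(-z)) else 0
        rows.append((age, cred, s))
    return rows

def fit_propensity(rows, steps=500, lr=0.005):
    """Fit P(s = 1 | age, cred) by logistic regression (batch gradient ascent)."""
    w = [0.0, 0.0, 0.0]  # intercept, age, credential
    for _ in range(steps):
        g = [0.0, 0.0, 0.0]
        for age, cred, s in rows:
            p = 1 / (1 + math.exp(-(w[0] + w[1] * age + w[2] * cred)))
            e = s - p  # residual: gradient of the Bernoulli log-likelihood
            g[0] += e
            g[1] += e * age
            g[2] += e * cred
        w = [wi + lr * gi / len(rows) for wi, gi in zip(w, g)]
    return w

def propensity(row, w):
    age, cred, _ = row
    return 1 / (1 + math.exp(-(w[0] + w[1] * age + w[2] * cred)))

def nearest_match(subgroup_scores, comparison_scores):
    """Nearest-neighbor matching (with replacement) on the estimated score."""
    return [min(range(len(comparison_scores)),
                key=lambda j: abs(comparison_scores[j] - p))
            for p in subgroup_scores]

treatment = simulate()   # program group: subgroup membership s is observed
comparison = simulate()  # control group: s is never observed, so we match
w = fit_propensity(treatment)
subgroup = [propensity(r, w) for r in treatment if r[2] == 1]
comp_scores = [propensity(r, w) for r in comparison]
pairs = nearest_match(subgroup, comp_scores)  # matched comparison sample
```

In the design the article proposes, intake staff predictions about each applicant's likely program assignment and experiences would enter the covariate set used to fit the propensity model; here the covariates are just simulated demographics.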

Suggested Citation

  • Peter Z. Schochet & John Burghardt, 2007. "Using Propensity Scoring to Estimate Program-Related Subgroup Impacts in Experimental Program Evaluations," Evaluation Review, vol. 31(2), pages 95-120, April.
  • Handle: RePEc:sae:evarev:v:31:y:2007:i:2:p:95-120
    DOI: 10.1177/0193841X06288736

    Download full text from publisher

    File URL: https://journals.sagepub.com/doi/10.1177/0193841X06288736
    Download Restriction: no

    File URL: https://libkey.io/10.1177/0193841X06288736?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a version you can access through your library subscription

    References listed on IDEAS

    1. LaLonde, Robert J, 1986. "Evaluating the Econometric Evaluations of Training Programs with Experimental Data," American Economic Review, American Economic Association, vol. 76(4), pages 604-620, September.
    2. Peter Z. Schochet & John Burghardt & Steven Glazerman, 2001. "National Job Corps Study: The Impacts of Job Corps on Participants' Employment and Related Outcomes," Mathematica Policy Research Reports db6c4204b8e1408bb0c6289ec, Mathematica Policy Research.
    3. James J. Heckman & Hidehiko Ichimura & Petra E. Todd, 1997. "Matching As An Econometric Evaluation Estimator: Evidence from Evaluating a Job Training Programme," Review of Economic Studies, Oxford University Press, vol. 64(4), pages 605-654.
    4. John Burghardt & Peter Z. Schochet, 2001. "National Job Corps Study: Impacts by Center Characteristics," Mathematica Policy Research Reports 79dc6705a88648e3881115389, Mathematica Policy Research.
    5. Terry Johnson & Mark Gritz & Russell Jackson & John Burghardt & Carol Boussy & Jan Leonard & Carlyn Orians, 1999. "National Job Corps Study: Report on the Process Analysis," Mathematica Policy Research Reports efc0cd05f0524a049779f797f, Mathematica Policy Research.
    6. James Heckman & Hidehiko Ichimura & Jeffrey Smith & Petra Todd, 1998. "Characterizing Selection Bias Using Experimental Data," Econometrica, Econometric Society, vol. 66(5), pages 1017-1098, September.
    7. Joseph L. Gastwirth & Abba M. Krieger & Paul R. Rosenbaum, 2000. "Asymptotic separability in sensitivity analysis," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 62(3), pages 545-555.
    8. Sheena McConnell & Steven Glazerman, 2001. "National Job Corps Study: The Benefits and Costs of Job Corps," Mathematica Policy Research Reports 19ff8678a108410587c5dfad0, Mathematica Policy Research.
    9. Roberto Agodini & Mark Dynarski, "undated". "Are Experiments the Only Option? A Look at Dropout Prevention Programs," Mathematica Policy Research Reports 51241adbf9fa4a26add6d54c5, Mathematica Policy Research.
    10. Roberto Agodini & Mark Dynarski, 2004. "Are Experiments the Only Option? A Look at Dropout Prevention Programs," The Review of Economics and Statistics, MIT Press, vol. 86(1), pages 180-194, February.
    11. Peter Z. Schochet, 2001. "National Job Corps Study: Methodological Appendixes on the Impact Analysis," Mathematica Policy Research Reports c3abb7b819cd4bc5a09a865d6, Mathematica Policy Research.
    12. James J. Heckman & Hidehiko Ichimura & Petra Todd, 1998. "Matching As An Econometric Evaluation Estimator," Review of Economic Studies, Oxford University Press, vol. 65(2), pages 261-294.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Charles Meth, 2011. "Employer of Last Resort? South Africa’s Expanded Public Works Programme (EPWP)," SALDRU Working Papers 58, Southern Africa Labour and Development Research Unit, University of Cape Town.
    2. Quinn Moore & Sheena McConnell & Alan Werner & Tim Kautz & Kristen Joyce & Kelley Borradaile & Bethany Boland, "undated". "Evaluation of Employment Coaching for TANF and Related Populations: Evaluation Design Report," Mathematica Policy Research Reports 3f5e6ca2b92549d1823c3bbe8, Mathematica Policy Research.
    3. Green, Beth L. & Sanders, Mary Beth & Tarte, Jerod, 2017. "Using administrative data to evaluate the effectiveness of the Healthy Families Oregon home visiting program: 2-year impacts on child maltreatment & service utilization," Children and Youth Services Review, Elsevier, vol. 75(C), pages 77-86.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Ferraro, Paul J. & Miranda, Juan José, 2014. "The performance of non-experimental designs in the evaluation of environmental programs: A design-replication study using a large-scale randomized experiment as a benchmark," Journal of Economic Behavior & Organization, Elsevier, vol. 107(PA), pages 344-365.
    2. Fatih Unlu & Douglas Lee Lauen & Sarah Crittenden Fuller & Tiffany Berglund & Elc Estrera, 2021. "Can Quasi‐Experimental Evaluations That Rely On State Longitudinal Data Systems Replicate Experimental Results?," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 40(2), pages 572-613, March.
    3. Kristen Harknett, 2006. "Does Receiving an Earnings Supplement Affect Union Formation? Estimating Effects for Program Participants Using Propensity Score Matching," Evaluation Review, , vol. 30(6), pages 741-778, December.
    4. Sudhanshu Handa & John A. Maluccio, 2010. "Matching the Gold Standard: Comparing Experimental and Nonexperimental Evaluation Techniques for a Geographically Targeted Program," Economic Development and Cultural Change, University of Chicago Press, vol. 58(3), pages 415-447, April.
    5. Ravallion, Martin, 2008. "Evaluating Anti-Poverty Programs," Handbook of Development Economics, in: T. Paul Schultz & John A. Strauss (ed.), Handbook of Development Economics, edition 1, volume 4, chapter 59, pages 3787-3846, Elsevier.
    6. Justine Burns & Malcolm Keswell & Rebecca Thornton, 2009. "Evaluating the Impact of Health Programmes," SALDRU Working Papers 40, Southern Africa Labour and Development Research Unit, University of Cape Town.
    7. Vivian C. Wong & Peter M. Steiner & Kylie L. Anglin, 2018. "What Can Be Learned From Empirical Evaluations of Nonexperimental Methods?," Evaluation Review, , vol. 42(2), pages 147-175, April.
    8. Andrew P. Jaciw, 2016. "Assessing the Accuracy of Generalized Inferences From Comparison Group Studies Using a Within-Study Comparison Approach," Evaluation Review, , vol. 40(3), pages 199-240, June.
    9. Dettmann, E. & Becker, C. & Schmeißer, C., 2011. "Distance functions for matching in small samples," Computational Statistics & Data Analysis, Elsevier, vol. 55(5), pages 1942-1960, May.
    10. Burt S. Barnow & Jeffrey Smith, 2015. "Employment and Training Programs," NBER Chapters, in: Economics of Means-Tested Transfer Programs in the United States, Volume 2, pages 127-234, National Bureau of Economic Research, Inc.
    11. Alberto Abadie & Guido W. Imbens, 2002. "Simple and Bias-Corrected Matching Estimators for Average Treatment Effects," NBER Technical Working Papers 0283, National Bureau of Economic Research, Inc.
    12. Juan Díaz & Miguel Jaramillo, 2006. "An Evaluation of the Peruvian "Youth Labor Training Program"-PROJOVEN," OVE Working Papers 1006, Inter-American Development Bank, Office of Evaluation and Oversight (OVE).
    13. Richard K. Crump & V. Joseph Hotz & Guido W. Imbens & Oscar A. Mitnik, 2006. "Moving the Goalposts: Addressing Limited Overlap in the Estimation of Average Treatment Effects by Changing the Estimand," NBER Technical Working Papers 0330, National Bureau of Economic Research, Inc.
    14. Richard K. Crump & V. Joseph Hotz & Guido W. Imbens & Oscar A. Mitnik, 2009. "Dealing with limited overlap in estimation of average treatment effects," Biometrika, Biometrika Trust, vol. 96(1), pages 187-199.
    15. Denis Conniffe & Vanessa Gash & Philip J. O'Connell, 2000. "Evaluating State Programmes - “Natural Experiments” and Propensity Scores," The Economic and Social Review, Economic and Social Studies, vol. 31(4), pages 283-308.
    16. A. Smith, Jeffrey & E. Todd, Petra, 2005. "Does matching overcome LaLonde's critique of nonexperimental estimators?," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 305-353.
    17. Dettmann, Eva & Becker, Claudia & Schmeißer, Christian, 2010. "Is there a Superior Distance Function for Matching in Small Samples?," IWH Discussion Papers 3/2010, Halle Institute for Economic Research (IWH).
    18. Carlos A. Flores & Oscar A. Mitnik, 2009. "Evaluating Nonexperimental Estimators for Multiple Treatments: Evidence from Experimental Data," Working Papers 2010-10, University of Miami, Department of Economics.
    19. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    20. V. Joseph Hotz & Guido W. Imbens & Jacob A. Klerman, 2000. "The Long-Term Gains from GAIN: A Re-Analysis of the Impacts of the California GAIN Program," NBER Working Papers 8007, National Bureau of Economic Research, Inc.

    Corrections

All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:sae:evarev:v:31:y:2007:i:2:p:95-120. See the general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to register here. Registering allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link it to an item in RePEc, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be citations waiting for confirmation.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact SAGE Publications.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.