
State Partnership Initiative: Selection of Comparison Groups for the Evaluation and Selected Impact Estimates

Authors

  • Deborah Peikes
  • Sean Orzol
  • Lorenzo Moreno
  • Nora Paxton

Abstract

This report describes how nonexperimental comparison groups were selected for each of the 11 state projects that targeted adult beneficiaries, assesses the validity of those comparison groups, and presents short-term estimates of project effects on employment and earnings from the three state projects that used randomized designs. The report finds promising effects on employment but negative or no effects on earnings.
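
The report's own selection procedure is not detailed on this page, but several of the works referenced below (for example, Smith & Todd 2005 and Wilde & Hollister 2007) study propensity-score matching, a standard way to construct a nonexperimental comparison group. As a rough illustration of that general technique only (not the SPI evaluation's actual method), a minimal Python sketch follows; the covariates, sample sizes, and data are all invented.

    # A minimal sketch of 1-to-1 propensity-score matching; illustrative
    # only, not the SPI evaluation's actual procedure. All data are
    # hypothetical.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.neighbors import NearestNeighbors
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)

    # Hypothetical baseline covariates (age, prior-year earnings) for
    # participants and a pool of nonparticipant beneficiaries.
    n_treat, n_pool = 200, 2000
    X_treat = rng.normal([40.0, 8000.0], [10.0, 4000.0], size=(n_treat, 2))
    X_pool = rng.normal([45.0, 6000.0], [12.0, 5000.0], size=(n_pool, 2))

    X = np.vstack([X_treat, X_pool])
    d = np.concatenate([np.ones(n_treat), np.zeros(n_pool)]).astype(int)  # 1 = participant

    # Step 1: estimate propensity scores P(participation | covariates).
    Xs = StandardScaler().fit_transform(X)
    ps = LogisticRegression().fit(Xs, d).predict_proba(Xs)[:, 1]

    # Step 2: match each participant to the nonparticipant with the
    # closest propensity score (with replacement, no caliper).
    nn = NearestNeighbors(n_neighbors=1).fit(ps[d == 0].reshape(-1, 1))
    _, idx = nn.kneighbors(ps[d == 1].reshape(-1, 1))
    comparison = X_pool[idx.ravel()]

    # Step 3: crude balance check -- matched covariate means should sit
    # close to the participant means if the match is adequate.
    print("participant means:", X_treat.mean(axis=0))
    print("comparison means: ", comparison.mean(axis=0))

A real evaluation would also test covariate balance formally, consider calipers or matching without replacement, and use heteroskedasticity-consistent standard errors in the impact regressions, in the spirit of White (1980).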

Suggested Citation

  • Deborah Peikes & Sean Orzol & Lorenzo Moreno & Nora Paxton, "undated". "State Partnership Initiative: Selection of Comparison Groups for the Evaluation and Selected Impact Estimates," Mathematica Policy Research Reports f8760335b9ab4a39bdf2c3533, Mathematica Policy Research.
  • Handle: RePEc:mpr:mprres:f8760335b9ab4a39bdf2c3533cac61e8

    Download full text from publisher

    File URL: https://www.mathematica.org/-/media/publications/pdfs/spiselectimpact.pdf
    Download Restriction: no

    References listed on IDEAS

    1. LaLonde, Robert J, 1986. "Evaluating the Econometric Evaluations of Training Programs with Experimental Data," American Economic Review, American Economic Association, vol. 76(4), pages 604-620, September.
    2. Smith, Jeffrey A. & Todd, Petra E., 2005. "Does matching overcome LaLonde's critique of nonexperimental estimators?," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 305-353.
    3. Fraker, Thomas & Moffitt, Robert, 1988. "The effect of food stamps on labor supply: A bivariate selection model," Journal of Public Economics, Elsevier, vol. 35(1), pages 25-56, February.
    4. Daron Acemoglu & Joshua D. Angrist, 2001. "Consequences of Employment Protection? The Case of the Americans with Disabilities Act," Journal of Political Economy, University of Chicago Press, vol. 109(5), pages 915-957, October.
    5. repec:mpr:mprres:4591 is not listed on IDEAS
    6. Steven Glazerman & Dan M. Levy & David Myers, 2003. "Nonexperimental Versus Experimental Estimates of Earnings Impacts," The ANNALS of the American Academy of Political and Social Science, vol. 589(1), pages 63-93, September.
    7. Ashenfelter, Orley C, 1978. "Estimating the Effect of Training Programs on Earnings," The Review of Economics and Statistics, MIT Press, vol. 60(1), pages 47-57, February.
    8. Roberto Agodini & Mark Dynarski, 2004. "Are Experiments the Only Option? A Look at Dropout Prevention Programs," The Review of Economics and Statistics, MIT Press, vol. 86(1), pages 180-194, February.
    9. repec:mpr:mprres:3790 is not listed on IDEAS
    10. Deborah Peikes & Nora Paxton, "undated". "State Partnership Initiative: Characteristics of Participants Enrolled Through March 2003," Mathematica Policy Research Reports 1bdfedde1f464b80bd96added, Mathematica Policy Research.
    11. Roberto Agodini & Craig Thornton & Nazmul Khan & Deborah Peikes, "undated". "Design for Estimating the Net Outcomes of the State Partnership Initiative," Mathematica Policy Research Reports 86db15d121f44173973b8a648, Mathematica Policy Research.
    12. repec:mpr:mprres:3851 is not listed on IDEAS
    13. repec:mpr:mprres:3694 is not listed on IDEAS
    14. White, Halbert, 1980. "A Heteroskedasticity-Consistent Covariance Matrix Estimator and a Direct Test for Heteroskedasticity," Econometrica, Econometric Society, vol. 48(4), pages 817-838, May.
    15. repec:mpr:mprres:3011 is not listed on IDEAS
    16. Justin S. White & William E. Black & Henry T. Ireys, "undated". "Explaining Enrollment Trends and Participant Characteristics of the Medicaid Buy-In Program, 2002-2003," Mathematica Policy Research Reports 40a07d31c5f74465ad331bd0f, Mathematica Policy Research.
    17. repec:mpr:mprres:794 is not listed on IDEAS
    18. Ashenfelter, Orley, 1987. "The case for evaluating training programs with randomized trials," Economics of Education Review, Elsevier, vol. 6(4), pages 333-338, August.
    19. repec:mpr:mprres:4379 is not listed on IDEAS
    20. Henry T. Ireys & Justin S. White & Craig Thornton, "undated". "The Medicaid Buy-In Program: Quantitative Measures of Enrollment Trends and Participant Characteristics in 2002," Mathematica Policy Research Reports 7152c36db4da43a39c1fb840d, Mathematica Policy Research.
    21. Thomas Fraker & Rebecca Maynard, 1987. "The Adequacy of Comparison Group Designs for Evaluations of Employment-Related Programs," Journal of Human Resources, University of Wisconsin Press, vol. 22(2), pages 194-227.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. repec:mpr:mprres:7829 is not listed on IDEAS
    2. repec:mpr:mprres:6217 is not listed on IDEAS
    3. repec:mpr:mprres:7005 is not listed on IDEAS
    4. repec:mpr:mprres:6523 is not listed on IDEAS

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. repec:mpr:mprres:4778 is not listed on IDEAS
    2. Katherine Baicker & Theodore Svoronos, 2019. "Testing the Validity of the Single Interrupted Time Series Design," NBER Working Papers 26080, National Bureau of Economic Research, Inc.
    3. Kenneth Fortson & Natalya Verbitsky-Savitz & Emma Kopa & Philip Gleason, 2012. "Using an Experimental Evaluation of Charter Schools to Test Whether Nonexperimental Comparison Group Methods Can Replicate Experimental Impact Estimates," Mathematica Policy Research Reports 27f871b5b7b94f3a80278a593, Mathematica Policy Research.
    4. Deborah Peikes & Ankur Sarin, "undated". "State Partnership Initiative: Synthesis of Impact Estimates Generated by the State Projects' Evaluations," Mathematica Policy Research Reports 4d233fa32c774c76bfd6d26ef, Mathematica Policy Research.
    5. Kenneth Fortson & Philip Gleason & Emma Kopa & Natalya Verbitsky-Savitz, "undated". "Horseshoes, Hand Grenades, and Treatment Effects? Reassessing Bias in Nonexperimental Estimators," Mathematica Policy Research Reports 1c24988cd5454dd3be51fbc2c, Mathematica Policy Research.
    6. Fortson, Kenneth & Gleason, Philip & Kopa, Emma & Verbitsky-Savitz, Natalya, 2015. "Horseshoes, hand grenades, and treatment effects? Reassessing whether nonexperimental estimators are biased," Economics of Education Review, Elsevier, vol. 44(C), pages 100-113.
    7. Katherine Baicker & Theodore Svoronos, 2019. "Testing the Validity of the Single Interrupted Time Series Design," CID Working Papers 364, Center for International Development at Harvard University.
    8. Andrew P. Jaciw, 2016. "Applications of a Within-Study Comparison Approach for Evaluating Bias in Generalized Causal Inferences From Comparison Groups Studies," Evaluation Review, vol. 40(3), pages 241-276, June.
    9. Vivian C. Wong & Peter M. Steiner & Kylie L. Anglin, 2018. "What Can Be Learned From Empirical Evaluations of Nonexperimental Methods?," Evaluation Review, vol. 42(2), pages 147-175, April.
    10. Andrew P. Jaciw, 2016. "Assessing the Accuracy of Generalized Inferences From Comparison Group Studies Using a Within-Study Comparison Approach," Evaluation Review, vol. 40(3), pages 199-240, June.
    11. Joshua D. Angrist & Jörn-Steffen Pischke, 2010. "The Credibility Revolution in Empirical Economics: How Better Research Design Is Taking the Con out of Econometrics," Journal of Economic Perspectives, American Economic Association, vol. 24(2), pages 3-30, Spring.
    12. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    13. Robin Jacob & Marie-Andree Somers & Pei Zhu & Howard Bloom, 2016. "The Validity of the Comparative Interrupted Time Series Design for Evaluating the Effect of School-Level Interventions," Evaluation Review, vol. 40(3), pages 167-198, June.
    14. Peter R. Mueser & Kenneth R. Troske & Alexey Gorislavsky, 2007. "Using State Administrative Data to Measure Program Performance," The Review of Economics and Statistics, MIT Press, vol. 89(4), pages 761-783, November.
    15. Ferraro, Paul J. & Miranda, Juan José, 2014. "The performance of non-experimental designs in the evaluation of environmental programs: A design-replication study using a large-scale randomized experiment as a benchmark," Journal of Economic Behavior & Organization, Elsevier, vol. 107(PA), pages 344-365.
    16. Anders Stenberg & Olle Westerlund, 2015. "The long-term earnings consequences of general vs. specific training of the unemployed," IZA Journal of European Labor Studies, Springer; Forschungsinstitut zur Zukunft der Arbeit GmbH (IZA), vol. 4(1), pages 1-26, December.
    17. David Card & Jochen Kluve & Andrea Weber, 2018. "What Works? A Meta Analysis of Recent Active Labor Market Program Evaluations," Journal of the European Economic Association, European Economic Association, vol. 16(3), pages 894-931.
    18. Guido W. Imbens, 2010. "Better LATE Than Nothing: Some Comments on Deaton (2009) and Heckman and Urzua (2009)," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 399-423, June.
    19. Fatih Unlu & Douglas Lee Lauen & Sarah Crittenden Fuller & Tiffany Berglund & Elc Estrera, 2021. "Can Quasi‐Experimental Evaluations That Rely On State Longitudinal Data Systems Replicate Experimental Results?," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 40(2), pages 572-613, March.
    20. Elizabeth Ty Wilde & Robinson Hollister, 2007. "How close is close enough? Evaluating propensity score matching using data from a class size reduction experiment," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 26(3), pages 455-477.
    21. Henrik Hansen & Ninja Ritter Klejnstrup & Ole Winckler Andersen, 2011. "A Comparison of Model-based and Design-based Impact Evaluations of Interventions in Developing Countries," IFRO Working Paper 2011/16, University of Copenhagen, Department of Food and Resource Economics.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:mpr:mprres:f8760335b9ab4a39bdf2c3533cac61e8. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Joanne Pfleiderer or Cindy George (email available below). General contact details of provider: https://edirc.repec.org/data/mathius.html.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.