
Using Administrative Data to Evaluate the Ohio JOBS Student Retention Program

Author

Listed:
  • Kevin Hollenbeck
  • Jean Kimmel
  • Randall W. Eberts

Abstract

This paper presents findings from a net impact evaluation of the Ohio JOBS Student Retention Program (JSRP). The JOBS program, a component of the federal Aid to Families with Dependent Children (AFDC) program, was required in all states for AFDC recipients who met certain criteria. The Ohio JSRP was an activity that some JOBS clients in Ohio pursued to fulfill their responsibilities for receiving aid. It was a three-fold support program designed to facilitate entry to, and success in, programs of study at two-year community or technical colleges. We evaluated this state welfare policy while simultaneously addressing the methodological issues that arise when different state administrative data sets are combined.

Community colleges are natural partners in states' attempts to help welfare recipients move from public assistance to work. Historically, two-year colleges have served older and disadvantaged students, and so they have a tradition of providing the individualized attention needed to support welfare recipients through to degree completion. Approximately 17,000 individuals participated in the Ohio JSRP between its inception in 1990 and summer 1995, the period covered by this study. Although the program's inception pre-dates the new welfare environment, lessons learned from it inform the ongoing policy debate.

The focus of this paper is an evaluation using state administrative data. The empirical evaluation relied on matching state administrative data from three sources: JSRP participation data collected by the individual community colleges and managed by the state, Ohio Department of Human Services (ODHS) CRIS-E (welfare) data, and many quarters of Ohio Bureau of Employment Services wage-record data. There were three major problems with using these data. First, information on program and degree completion was incomplete, although we could still analyze program participation and link it to employment outcomes. Second, matching records across data sources was difficult; for example, not all individuals in the JSRP files could be located in the CRIS-E files. Third, there was no random assignment and therefore no true control group. We handled this final problem by constructing a comparison group, drawn from the CRIS-E files, of individuals who were in higher education but did not participate in the JSRP. We were able to merge data across the three sources without any confidentiality problems.

Our net impact analysis relied on an unadjusted comparison of means and a regression-adjusted comparison of means for the JSRP group and the constructed comparison group. The comparison group consisted of JOBS clients in the ODHS CRIS-E file with twelve or more years of schooling who were assigned to higher education as their JOBS component. To keep the two groups as consistent as possible, individuals in the JSRP group reporting fewer than twelve years of education were excluded from this portion of the empirical analyses. Outcomes included employment, earnings, and welfare recipiency. Two definitions of JSRP participation were used: one indicating any participation and one indicating program completion. The JSRP appeared to boost earnings. Focusing on the most recent 11 of the 16 quarters of data available, the average boost to earnings across quarters from program participation was 8.45 percent; program completion yielded an estimated boost of 12.91 percent.
While the empirical work has some imperfections due to the approximate nature of the comparison group, the results indicate that encouraging postsecondary education for a subset of welfare recipients may help boost earnings capacity and therefore long-term self-sufficiency. The paper also demonstrates the value of readily available state administrative data for evaluating policy.
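
The paper itself contains no code, but the linkage and net impact steps described in the abstract follow a familiar pattern. The sketch below shows, under assumed file layouts and column names (case_id, years_school, jobs_component, earnings, and so on, none of which come from the paper), how one might merge the three administrative sources, construct the comparison group, and compute unadjusted and regression-adjusted differences in mean earnings.

```python
# Illustrative sketch only -- not the authors' code. File names, column names,
# and covariates below are hypothetical stand-ins for the three administrative
# sources described in the abstract.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical extracts keyed on a shared (de-identified) case identifier.
jsrp = pd.read_csv("jsrp_participation.csv")    # case_id, completed_program
crise = pd.read_csv("crise_welfare.csv")        # case_id, years_school, jobs_component, age
wages = pd.read_csv("wage_records.csv")         # case_id, quarter, earnings

# Flag JSRP participants in the welfare file; CRIS-E clients assigned to a
# higher-education JOBS component who do not match a JSRP record form the
# comparison group, restricted to twelve or more years of schooling.
merged = crise.merge(jsrp.assign(jsrp=1)[["case_id", "jsrp"]],
                     on="case_id", how="left")
merged["jsrp"] = merged["jsrp"].fillna(0)
sample = merged[(merged["years_school"] >= 12) &
                ((merged["jsrp"] == 1) | (merged["jobs_component"] == "higher_ed"))]

# Attach quarterly wage records to form a person-quarter panel.
panel = sample.merge(wages, on="case_id", how="inner")

# Unadjusted comparison of mean quarterly earnings.
print(panel.groupby("jsrp")["earnings"].mean())

# Regression-adjusted comparison: earnings on a JSRP indicator plus covariates
# and quarter effects; the coefficient on jsrp is the adjusted difference.
fit = smf.ols("earnings ~ jsrp + age + years_school + C(quarter)", data=panel).fit()
print(fit.params["jsrp"], fit.bse["jsrp"])
```

In the paper, the adjusted comparison presumably conditions on the richer demographic and welfare-history information available in CRIS-E, and percentage effects such as the 8.45 and 12.91 percent figures could come from a log-earnings or similar specification; the sketch only illustrates the general shape of a regression-adjusted comparison of means.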

Suggested Citation

  • Kevin Hollenbeck & Jean Kimmel & Randall W. Eberts, 1997. "Using Administrative Data to Evaluate the Ohio JOBS Student Retention Program," JCPR Working Papers 10, Northwestern University/University of Chicago Joint Center for Poverty Research.
  • Handle: RePEc:wop:jopovw:10

    Download full text from publisher

    To our knowledge, this item is not available for download.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Jeffrey Smith, 2000. "A Critical Survey of Empirical Methods for Evaluating Active Labor Market Policies," Swiss Journal of Economics and Statistics (SJES), Swiss Society of Economics and Statistics (SSES), vol. 136(III), pages 247-268, September.
    2. Hämäläinen, Kari & Ollikainen, Virve, 2004. "Differential Effects of Active Labour Market Programmes in the Early Stages of Young People's Unemployment," Research Reports 115, VATT Institute for Economic Research.
    3. A. Smith, Jeffrey & E. Todd, Petra, 2005. "Does matching overcome LaLonde's critique of nonexperimental estimators?," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 305-353.
    4. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    5. Eliasson, Kent, 2006. "The Role of Ability in Estimating the Returns to College Choice: New Swedish Evidence," Umeå Economic Studies 691, Umeå University, Department of Economics.
    6. Sergio Firpo, 2007. "Efficient Semiparametric Estimation of Quantile Treatment Effects," Econometrica, Econometric Society, vol. 75(1), pages 259-276, January.
    7. V. Joseph Hotz & Guido W. Imbens & Jacob A. Klerman, 2000. "The Long-Term Gains from GAIN: A Re-Analysis of the Impacts of the California GAIN Program," NBER Working Papers 8007, National Bureau of Economic Research, Inc.
    8. James Heckman & Salvador Navarro-Lozano, 2004. "Using Matching, Instrumental Variables, and Control Functions to Estimate Economic Choice Models," The Review of Economics and Statistics, MIT Press, vol. 86(1), pages 30-57, February.
    9. Daniele Bondonio, 2003. "Do Tax Incentives Affect Local Economic Growth? What Mean Impacts Miss in the Analysis of Enterprise Zone Policies," Working Papers 03-17, Center for Economic Studies, U.S. Census Bureau.
    10. Peter R. Mueser & Kenneth R. Troske & Alexey Gorislavsky, 2007. "Using State Administrative Data to Measure Program Performance," The Review of Economics and Statistics, MIT Press, vol. 89(4), pages 761-783, November.
    11. V. Joseph Hotz & Guido W. Imbens & Jacob A. Klerman, 2006. "Evaluating the Differential Effects of Alternative Welfare-to-Work Training Components: A Reanalysis of the California GAIN Program," Journal of Labor Economics, University of Chicago Press, vol. 24(3), pages 521-566, July.
    12. James J. Heckman & Petra E. Todd, 2009. "A note on adapting propensity score matching and selection models to choice based samples," Econometrics Journal, Royal Economic Society, vol. 12(s1), pages 230-234, January.
    13. Elías, Víctor & Ruiz Núñez, Fernanda & Cossa, Ricardo & Bravo, David, 2004. "An Econometric Cost-Benefit Analysis of Argentina’s Youth Training Program," IDB Publications (Working Papers) 13053, Inter-American Development Bank.
    14. Bart, COCKX & Jean, RIES, 2004. "The Exhaustion of Unemployment Benefits in Belgium. Does it Enhance the Probability of Employment ?," LIDAM Discussion Papers IRES 2004016, Université catholique de Louvain, Institut de Recherches Economiques et Sociales (IRES).
    15. Steven Lehrer & Gregory Kordas, 2013. "Matching using semiparametric propensity scores," Empirical Economics, Springer, vol. 44(1), pages 13-45, February.
    16. Fougère, Denis & Crépon, Bruno & Brodaty, Thomas, 2000. "Using Matching Estimators to Evaluate Alternative Youth Employment Programs: Evidence from France, 1986-1988," CEPR Discussion Papers 2604, C.E.P.R. Discussion Papers.
    17. Bondonio, Daniele & Engberg, John, 2000. "Enterprise zones and local employment: evidence from the states' programs," Regional Science and Urban Economics, Elsevier, vol. 30(5), pages 519-549, September.
    18. Zhao, Zhong, 2008. "Sensitivity of propensity score methods to the specifications," Economics Letters, Elsevier, vol. 98(3), pages 309-319, March.
    19. Hujer, Reinhard & Wellner, Marc, 2000. "The Effects of Public Sector Sponsored Training on Individual Employment Performance in East Germany," IZA Discussion Papers 141, Institute of Labor Economics (IZA).
    20. Michael Lechner, 2004. "Sequential Matching Estimation of Dynamic Causal Models," University of St. Gallen Department of Economics working paper series 2004 2004-06, Department of Economics, University of St. Gallen.

    More about this item

    JEL classification:

    • J0 - Labor and Demographic Economics - - General
    • J2 - Labor and Demographic Economics - - Demand and Supply of Labor
