Using Administrative Data to Evaluate the Ohio JOBS Student Retention Program
This paper presents findings from a net impact evaluation of the Ohio JOBS Student Retention Program (JSRP). The JOBS program, a component of the federal Aid to Families with Dependent Children (AFDC) program, was required in all states for AFDC recipients who met certain criteria. The Ohio JSRP was an activity pursued by some JOBS clients in Ohio to fulfill their responsibilities as a condition of receiving aid. The JSRP was a three-fold support program designed to facilitate entry to, and success in, programs of study at two-year community or technical colleges. We evaluated this state welfare policy while simultaneously addressing methodological issues that arise in using several different state administrative data sets.

Community colleges are natural partners in states' attempts to help welfare recipients make the transition from public assistance to work. Historically, two-year colleges have served older and disadvantaged students, so they have a tradition of providing the sort of individualized attention needed to successfully support welfare recipients through to degree completion. Approximately 17,000 individuals participated in the Ohio JSRP between its inception in 1990 and summer 1995, the period covered by this study. While the program pre-dates the current welfare environment, lessons learned from it inform the ongoing policy debate.

The focus of this paper is an evaluation using state administrative data. The empirical work relied on matching state administrative data from three sources: JSRP program participation data collected by the individual community colleges and managed by the state, Ohio Department of Human Services (ODHS) CRIS-E welfare data, and many quarters of Ohio Bureau of Employment Services wage-record data. There were three major problems with using these data. First, information on program and degree completion was incomplete.
We could, however, still analyze program participation and link it to employment outcomes. Second, there were difficulties matching records across the data sources; for example, not all individuals in the JSRP files could be located in the CRIS-E files. Third, there was no random assignment with a true control group. We handled this final problem by constructing a comparison group drawn from the CRIS-E files: individuals in higher education who were not participating in JSRP. We were able to merge data across the three sources without any confidentiality problems.

Our net impact analysis relied on an unadjusted comparison of means and a regression-adjusted comparison of means for the JSRP group and the constructed comparison group. The comparison group comprised JOBS clients in the ODHS CRIS-E file with twelve or more years of schooling who were assigned to higher education as their JOBS component. To keep the JSRP group as consistent as possible with this definition, all JSRP participants reporting fewer than twelve years of education were excluded from this portion of the empirical analyses. Outcomes included employment, earnings, and welfare recipiency. Two definitions of JSRP participation were used: one indicating any participation and one indicating program completion.

JSRP appeared to boost earnings. Focusing on the most recent 11 of the 16 quarters of data available, the average boost to earnings across quarters attributable to program participation was 8.45 percent; focusing on program completion yielded an estimated boost of 12.91 percent. While this empirical work has imperfections owing to the approximate nature of the comparison group, the results indicate that encouraging postsecondary education for a subset of welfare recipients may help boost earnings capacity and therefore long-term self-sufficiency. The paper also demonstrates the benefit of using readily available state administrative data to evaluate policy.
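The matching and net-impact steps described above can be sketched in a few lines of code. The following is a minimal illustration, not the authors' actual procedure: the client IDs, earnings figures, and field layouts are invented, the three dictionaries stand in for the JSRP, CRIS-E, and wage-record files, and the regression adjustment uses a single schooling covariate, with ordinary least squares solved by hand via the normal equations.

```python
# Hypothetical sketch of the net-impact approach: merge records from the
# three administrative sources on a common client ID, then compare mean
# log earnings for JSRP participants vs. the constructed comparison group,
# both unadjusted and regression-adjusted for years of schooling.
# All IDs and dollar amounts below are illustrative, not from the data.
import math

jsrp = {101, 103, 105}                     # client IDs with any JSRP participation
crise = {                                  # CRIS-E welfare file: id -> years of schooling
    101: 12, 102: 12, 103: 13, 104: 14, 105: 12, 106: 13,
}
wages = {                                  # wage-record file: id -> quarterly earnings
    101: 3200.0, 102: 2600.0, 103: 3900.0, 104: 3100.0, 105: 2900.0, 106: 2800.0,
}

# Step 1: keep only IDs matched in both remaining sources (not everyone matches).
ids = sorted(set(crise) & set(wages))

# Step 2: analysis sample rows: (intercept, JSRP dummy, schooling, log earnings).
rows = [(1.0, float(i in jsrp), float(crise[i]), math.log(wages[i])) for i in ids]

# Unadjusted comparison: difference in mean log earnings.
treat = [y for _, d, _, y in rows if d == 1.0]
comp = [y for _, d, _, y in rows if d == 0.0]
unadjusted = sum(treat) / len(treat) - sum(comp) / len(comp)

# Step 3: regression adjustment: OLS of log earnings on an intercept, the
# JSRP dummy, and schooling, solved via the normal equations X'X b = X'y.
k = 3
xtx = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
xty = [sum(r[i] * r[3] for r in rows) for i in range(k)]
for col in range(k):                       # Gaussian elimination, partial pivoting
    piv = max(range(col, k), key=lambda r: abs(xtx[r][col]))
    xtx[col], xtx[piv] = xtx[piv], xtx[col]
    xty[col], xty[piv] = xty[piv], xty[col]
    for r in range(col + 1, k):
        f = xtx[r][col] / xtx[col][col]
        for c in range(col, k):
            xtx[r][c] -= f * xtx[col][c]
        xty[r] -= f * xty[col]
beta = [0.0] * k
for r in range(k - 1, -1, -1):
    beta[r] = (xty[r] - sum(xtx[r][c] * beta[c] for c in range(r + 1, k))) / xtx[r][r]

adjusted = beta[1]                         # coefficient on the JSRP dummy
print(f"unadjusted diff in log earnings: {unadjusted:.3f}")
print(f"regression-adjusted diff:        {adjusted:.3f}")
```

Because the outcome is in logs, each difference is interpretable as an approximate percentage earnings effect, which is how the 8.45 and 12.91 percent figures above would be read in a full specification with more covariates and many quarters of outcomes.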
Date of creation: Jul 1997
Web page: http://www.upjohn.org