
Decomposing Differences in Impacts on Survey- and Administrative-Measured Earnings From a Job Training Voucher Experiment

Author

Listed:
  • Quinn Moore
  • Irma Perez-Johnson
  • Robert Santillano

Abstract

Background: Differences in earnings measured using survey versus administrative data raise the question of which source is preferable for program impact evaluations. This is especially true when the population of interest has varying propensities to be represented in either source.

Objectives: We study differences in impacts on earnings from a job training voucher experiment in order to demonstrate which source is most appropriate for interpreting findings.

Research design: Using study participants with survey-reported earnings, we decompose mean earnings differences across sources into those resulting from (1) differences in reported employment and (2) differences in reported earnings for those who are employed in both sources. We study factors related to these two sources of differences and demonstrate how impact estimates change when adjusting for them.

Results: We find that differences in mean earnings are driven by differences in reported employment, but that differences in impacts are driven by differences in reported earnings for those employed in both data sources. Employment and worker characteristics explain much of the research group differences in earnings among the employed. Out-of-state employment, self-employment, and employment in occupations with low unemployment insurance (UI) coverage are important contributors to research group differences in survey- and UI-based employment levels. Employment in more than one job contributes to treatment group differences in earnings among the employed. All of these factors contribute substantially to the difference between survey- and UI-based earnings impact estimates.

Conclusion: Findings underscore the relevance of UI coverage to estimated earnings impacts and suggest assessing employment impacts using both UI- and survey-based measures.
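
The decomposition described in the abstract can be sketched explicitly. In the notation below (illustrative only, not necessarily the authors' own), Y_i^S and Y_i^A are individual i's survey- and UI-reported earnings, set to zero when the source records no employment, and N is the size of the analysis sample. The cross-source difference in mean earnings then splits into an employment term, driven by people with reported earnings in only one source, and an earnings term, driven by people with reported earnings in both sources:

\[
\bar{Y}^{S} - \bar{Y}^{A}
= \underbrace{\frac{1}{N}\left( \sum_{i \in S\ \mathrm{only}} Y_i^{S} \;-\; \sum_{i \in A\ \mathrm{only}} Y_i^{A} \right)}_{\text{differences in reported employment}}
\;+\;
\underbrace{\frac{1}{N} \sum_{i \in S \cap A} \left( Y_i^{S} - Y_i^{A} \right)}_{\text{earnings differences among those employed in both sources}}
\]

Applying the same split to treatment-control contrasts computed in each source yields the corresponding decomposition of impact estimates, which is the sense in which the abstract attributes the gap in mean earnings to the employment margin but the gap in estimated impacts to the earnings margin among those employed in both sources.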

Suggested Citation

  • Quinn Moore & Irma Perez-Johnson & Robert Santillano, 2018. "Decomposing Differences in Impacts on Survey- and Administrative-Measured Earnings From a Job Training Voucher Experiment," Evaluation Review, vol. 42(5-6), pages 515-549, October.
  • Handle: RePEc:sae:evarev:v:42:y:2018:i:5-6:p:515-549
    DOI: 10.1177/0193841X18799434

    Download full text from publisher

    File URL: https://journals.sagepub.com/doi/10.1177/0193841X18799434
    Download Restriction: no

    File URL: https://libkey.io/10.1177/0193841X18799434?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a location where your library subscription provides access to this item

    References listed on IDEAS

    1. John M. Abowd & Martha H. Stinson, 2013. "Estimating Measurement Error in Annual Job Earnings: A Comparison of Survey and Administrative Data," The Review of Economics and Statistics, MIT Press, vol. 95(5), pages 1451-1467, December.
    2. David H. Autor & David Dorn, 2013. "The Growth of Low-Skill Service Jobs and the Polarization of the US Labor Market," American Economic Review, American Economic Association, vol. 103(5), pages 1553-1597, August.
    3. Bound, John & Brown, Charles & Mathiowetz, Nancy, 2001. "Measurement error in survey data," Handbook of Econometrics, in: J.J. Heckman & E.E. Leamer (ed.), Handbook of Econometrics, edition 1, volume 5, chapter 59, pages 3705-3843, Elsevier.
    4. Bound, John & Krueger, Alan B, 1991. "The Extent of Measurement Error in Longitudinal Earnings Data: Do Two Wrongs Make a Right?," Journal of Labor Economics, University of Chicago Press, vol. 9(1), pages 1-24, January.
    5. Kornfeld, Robert & Bloom, Howard S, 1999. "Measuring Program Impacts on Earnings and Employment: Do Unemployment Insurance Wage Reports from Employers Agree with Surveys of Individuals?," Journal of Labor Economics, University of Chicago Press, vol. 17(1), pages 168-197, January.
    6. Burt S. Barnow & David Greenberg, 2015. "Do Estimated Impacts on Earnings Depend on the Source of the Data Used to Measure Them? Evidence From Previous Social Experiments," Evaluation Review, vol. 39(2), pages 179-228, April.
    7. Richard Dorsett & Richard Hendra & Philip K. Robins, 2018. "Using Administrative Data to Explore the Effect of Survey Nonresponse in the UK Employment Retention and Advancement Demonstration," Evaluation Review, vol. 42(5-6), pages 491-514, October.
    8. V. J. Hotz & J. K. Scholz, "undated". "Measuring Employment and Income for Low-Income Populations with Administrative and Survey Data," Institute for Research on Poverty Discussion Papers 1224-01, University of Wisconsin Institute for Research on Poverty.
    9. Blakemore, Arthur E & Burgess, Paul L & Low, Stuart A & St Louis, Robert D, 1996. "Employer Tax Evasion in the Unemployment Insurance Program," Journal of Labor Economics, University of Chicago Press, vol. 14(2), pages 210-230, April.
    10. Katharine G. Abraham & John Haltiwanger & Kristin Sandusky & James R. Spletzer, 2013. "Exploring Differences in Employment between Household and Establishment Data," Journal of Labor Economics, University of Chicago Press, vol. 31(S1), pages 129-172.
    11. Arie Kapteyn & Jelmer Y. Ypma, 2007. "Measurement Error and Misclassification: A Comparison of Survey and Administrative Data," Journal of Labor Economics, University of Chicago Press, vol. 25(3), pages 513-551.
    12. Peter Z. Schochet & John Burghardt & Sheena McConnell, 2008. "Does Job Corps Work? Impact Findings from the National Job Corps Study," American Economic Review, American Economic Association, vol. 98(5), pages 1864-1886, December.
    13. Geoffrey L. Wallace & Robert Haveman, 2007. "The implications of differences between employer and worker employment/earnings reports for policy evaluation," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 26(4), pages 737-754.
    14. Irma Perez-Johnson & Quinn Moore & Robert Santillano, "undated". "Improving the Effectiveness of Individual Training Accounts: Long-Term Findings from an Experimental Evaluation of Three Service Delivery Models," Mathematica Policy Research Reports ddb772ea2aa242a1a74579eb0, Mathematica Policy Research.
    15. Pischke, Jorn-Steffen, 1995. "Measurement Error and Earnings Dynamics: Some Estimates from the PSID Validation Study," Journal of Business & Economic Statistics, American Statistical Association, vol. 13(3), pages 305-314, July.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Burt S. Barnow & David Greenberg, 2015. "Do Estimated Impacts on Earnings Depend on the Source of the Data Used to Measure Them? Evidence From Previous Social Experiments," Evaluation Review, vol. 39(2), pages 179-228, April.
    2. Andersson, Fredrik W. & Holzer, Harry J. & Lane, Julia & Rosenblum, David & Smith, Jeffrey A., 2013. "Does Federally-Funded Job Training Work? Nonexperimental Estimates of WIA Training Impacts Using Longitudinal Data on Workers and Firms," IZA Discussion Papers 7621, Institute of Labor Economics (IZA).
    3. Dean R. Hyslop & Wilbur Townsend, 2020. "Earnings Dynamics and Measurement Error in Matched Survey and Administrative Data," Journal of Business & Economic Statistics, Taylor & Francis Journals, vol. 38(2), pages 457-469, April.
    4. Stefan Angel & Richard Heuberger & Nadja Lamei, 2018. "Differences Between Household Income from Surveys and Registers and How These Affect the Poverty Headcount: Evidence from the Austrian SILC," Social Indicators Research: An International and Interdisciplinary Journal for Quality-of-Life Measurement, Springer, vol. 138(2), pages 575-603, July.
    5. Lachowska, Marta & Mas, Alexandre & Woodbury, Stephen A., 2022. "How reliable are administrative reports of paid work hours?," Labour Economics, Elsevier, vol. 75(C).
    6. Hyslop, Dean R. & Townsend, Wilbur, 2017. "Employment misclassification in survey and administrative reports," Economics Letters, Elsevier, vol. 155(C), pages 19-23.
    7. Katharine G. Abraham & John Haltiwanger & Kristin Sandusky & James R. Spletzer, 2013. "Exploring Differences in Employment between Household and Establishment Data," Journal of Labor Economics, University of Chicago Press, vol. 31(S1), pages 129-172.
    8. Burt S. Barnow & David H. Greenberg, 2019. "Special Issue Editors’ Essay," Evaluation Review, vol. 43(5), pages 231-265, October.
    9. Christian Imboden & John Voorheis & Caroline Weber, 2023. "Self-Employment Income Reporting on Surveys," Working Papers 23-19, Center for Economic Studies, U.S. Census Bureau.
    10. Burt S. Barnow & Jeffrey Smith, 2015. "Employment and Training Programs," NBER Chapters, in: Economics of Means-Tested Transfer Programs in the United States, Volume 2, pages 127-234, National Bureau of Economic Research, Inc.
    11. Michele Lalla & Patrizio Frederic & Daniela Mantovani, 2022. "The inextricable association of measurement errors and tax evasion as examined through a microanalysis of survey data matched with fiscal data: a case study," Statistical Methods & Applications, Springer; Società Italiana di Statistica, vol. 31(5), pages 1375-1401, December.
    12. Paulus, Alari, 2015. "Tax evasion and measurement error: An econometric analysis of survey data linked with tax records," ISER Working Paper Series 2015-10, Institute for Social and Economic Research.
    13. Robert Moffitt & Sisi Zhang, 2022. "Estimating Trends in Male Earnings Volatility with the Panel Study of Income Dynamics," Journal of Business & Economic Statistics, Taylor & Francis Journals, vol. 41(1), pages 20-25, December.
    14. Michele Lalla & Maddalena Cavicchioli, 2020. "Nonresponse and measurement errors in income: matching individual survey data with administrative tax data," Department of Economics 0170, University of Modena and Reggio E., Faculty of Economics "Marco Biagi".
    15. Bruce D. Meyer & Nikolas Mittag, 2015. "Using Linked Survey and Administrative Data to Better Measure Income: Implications for Poverty, Program Effectiveness and Holes in the Safety Net," NBER Working Papers 21676, National Bureau of Economic Research, Inc.
    16. Meyer, Bruce D. & Mittag, Nikolas, 2017. "Using Linked Survey and Administrative Data to Better Measure Income: Implications for Poverty, Program Effectiveness and Holes in the Safety Net," IZA Discussion Papers 10943, Institute of Labor Economics (IZA).
    17. Adam Bee & Joshua Mitchell & Nikolas Mittag & Jonathan Rothbaum & Carl Sanders & Lawrence Schmidt & Matthew Unrath, 2023. "National Experimental Wellbeing Statistics - Version 1," Working Papers 23-04, Center for Economic Studies, U.S. Census Bureau.
    18. Andrew S. Green, 2017. "Hours Off the Clock," Working Papers 17-44, Center for Economic Studies, U.S. Census Bureau.
    19. ChangHwan Kim & Christopher R. Tamborini, 2014. "Response Error in Earnings," Sociological Methods & Research, vol. 43(1), pages 39-72, February.
    20. Crossley, Thomas F. & Fisher, Paul & Hussein, Omar, 2023. "Assessing data from summary questions about earnings and income," Labour Economics, Elsevier, vol. 81(C).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:sae:evarev:v:42:y:2018:i:5-6:p:515-549. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact SAGE Publications.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.