
Do Estimated Impacts on Earnings Depend on the Source of the Data Used to Measure Them? Evidence From Previous Social Experiments

Authors

Listed:
  • Burt S. Barnow
  • David Greenberg

Abstract

Background: Impact evaluations draw their data from two sources: surveys conducted for the evaluation or administrative data collected for other purposes. Both types of data have been used in impact evaluations of social programs.

Objective: This study analyzes the causes of differences in impact estimates when survey data and administrative data are used to evaluate earnings impacts in social experiments, and it discusses the differences observed in eight evaluations of social experiments that used both survey and administrative data.

Results: There are important trade-offs between the two data sources. Administrative data are less expensive but may not cover all income and may not cover the desired time period, whereas surveys can be designed to avoid these problems. Errors can arise from nonresponse or from inaccurate reporting, and they can be balanced between the treatment and control groups or unbalanced. We find that earnings are usually higher in survey data than in administrative data because of differences in coverage and likely overreporting of overtime hours and pay in surveys. Evaluations using survey data usually find greater impacts, sometimes much greater.

Conclusions: The much lower cost of administrative data makes their use attractive, but such data are still subject to underreporting and other problems. We recommend further evaluations that use both types of data, together with investigative audits, to better understand the sources and magnitudes of errors in both survey and administrative data so that appropriate corrections can be made.
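The distinction the abstract draws between balanced and unbalanced errors can be illustrated with a minimal simulation (not taken from the article; all figures and variable names are hypothetical). When reporting error affects the treatment and control groups equally, it shifts both group means and largely cancels out of the experimental impact estimate; when it is concentrated in one group, for example overreported overtime pay among treated survey respondents, the impact estimate is biased.

```python
# Illustrative sketch only (not from the article): how balanced vs. unbalanced
# reporting error affects an experimental earnings-impact estimate.
# All numbers are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n = 50_000
true_impact = 1_000  # hypothetical true treatment effect on annual earnings

treated = rng.integers(0, 2, n).astype(bool)                    # random assignment
true_earnings = rng.normal(20_000, 5_000, n) + true_impact * treated

# Balanced error: both groups overreport earnings by the same average amount.
balanced = true_earnings + rng.normal(1_500, 500, n)

# Unbalanced error: only the treatment group overreports (e.g., overtime pay).
unbalanced = true_earnings + treated * rng.normal(1_500, 500, n)

def impact_estimate(y):
    """Experimental impact estimate: treatment-group mean minus control-group mean."""
    return y[treated].mean() - y[~treated].mean()

print(f"true impact:                {true_impact}")
print(f"estimate, balanced error:   {impact_estimate(balanced):.0f}")   # close to 1,000
print(f"estimate, unbalanced error: {impact_estimate(unbalanced):.0f}") # inflated by ~1,500
```

Under these hypothetical settings, the balanced-error estimate stays near the true impact of 1,000, while the unbalanced-error estimate is inflated by roughly the 1,500 average overreport, which is the mechanism behind the larger survey-based impact estimates discussed in the abstract.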

Suggested Citation

  • Burt S. Barnow & David Greenberg, 2015. "Do Estimated Impacts on Earnings Depend on the Source of the Data Used to Measure Them? Evidence From Previous Social Experiments," Evaluation Review, vol. 39(2), pages 179-228, April.
  • Handle: RePEc:sae:evarev:v:39:y:2015:i:2:p:179-228
    DOI: 10.1177/0193841X14564154

    Download full text from publisher

    File URL: https://journals.sagepub.com/doi/10.1177/0193841X14564154
    Download Restriction: no

    File URL: https://libkey.io/10.1177/0193841X14564154?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    References listed on IDEAS

    1. Claus Thustrup Kreiner & David Dreyer Lassen & Søren Leth-Petersen, 2014. "Measuring the Accuracy of Survey Responses Using Administrative Register Data: Evidence from Denmark," NBER Chapters, in: Improving the Measurement of Consumer Expenditures, pages 289-307, National Bureau of Economic Research, Inc.
    2. James J. Heckman & Hidehiko Ichimura & Petra E. Todd, 1997. "Matching As An Econometric Evaluation Estimator: Evidence from Evaluating a Job Training Programme," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 64(4), pages 605-654.
    3. Bound, John & Krueger, Alan B, 1991. "The Extent of Measurement Error in Longitudinal Earnings Data: Do Two Wrongs Make a Right?," Journal of Labor Economics, University of Chicago Press, vol. 9(1), pages 1-24, January.
    4. Kornfeld, Robert & Bloom, Howard S, 1999. "Measuring Program Impacts on Earnings and Employment: Do Unemployment Insurance Wage Reports from Employers Agree with Surveys of Individuals?," Journal of Labor Economics, University of Chicago Press, vol. 17(1), pages 168-197, January.
    5. Blakemore, Arthur E & Burgess, Paul L & Low, Stuart A & St Louis, Robert D, 1996. "Employer Tax Evasion in the Unemployment Insurance Program," Journal of Labor Economics, University of Chicago Press, vol. 14(2), pages 210-230, April.
    6. Katharine G. Abraham & John Haltiwanger & Kristin Sandusky & James R. Spletzer, 2013. "Exploring Differences in Employment between Household and Establishment Data," Journal of Labor Economics, University of Chicago Press, vol. 31(S1), pages 129-172.
    7. Sheena McConnell & Steven Glazerman, 2001. "National Job Corps Study: The Benefits and Costs of Job Corps," Mathematica Policy Research Reports 19ff8678a108410587c5dfad0, Mathematica Policy Research.
    8. Walter Corson & Paul Decker & Shari Miller Dunstan & Stuart Kerachsky, 1991. "Pennsylvania Reemployment Bonus Demonstration," Mathematica Policy Research Reports e8697d4f57c54149bda369dad, Mathematica Policy Research.
    9. Peter Z. Schochet & John Burghardt & Sheena McConnell, 2006. "National Job Corps Study and Longer-Term Follow-Up Study: Impact and Benefit-Cost Findings Using Survey and Summary Earnings Records Data," Mathematica Policy Research Reports 8074f4e4499d4e2ab1de13747, Mathematica Policy Research.
    10. repec:mpr:mprres:2955 is not listed on IDEAS
    11. David Greenberg & Mark Shroder & Matthew Onstott, 1999. "The Social Experiment Market," Journal of Economic Perspectives, American Economic Association, vol. 13(3), pages 157-172, Summer.
    12. Mellow, Wesley & Sider, Hal, 1983. "Accuracy of Response in Labor Market Surveys: Evidence and Implications," Journal of Labor Economics, University of Chicago Press, vol. 1(4), pages 331-344, October.
    13. repec:mpr:mprres:5840 is not listed on IDEAS
    14. Duncan, Greg J & Hill, Daniel H, 1985. "An Investigation of the Extent and Consequences of Measurement Error in Labor-Economic Survey Data," Journal of Labor Economics, University of Chicago Press, vol. 3(4), pages 508-532, October.
    15. repec:mpr:mprres:856 is not listed on IDEAS
    16. V. J. Hotz & J. K. Scholz, "undated". "Measuring Employment and Income for Low-Income Populations with Administrative and Survey Data," Institute for Research on Poverty Discussion Papers 1224-01, University of Wisconsin Institute for Research on Poverty.
    17. Patricia Anderson & Walter Corson & Paul Decker, "undated". "The New Jersey Unemployment Insurance Reemployment Demonstration Project: Follow-Up Report," Mathematica Policy Research Reports eba060d41b8145b5a230fa76b, Mathematica Policy Research.
    18. Arie Kapteyn & Jelmer Y. Ypma, 2007. "Measurement Error and Misclassification: A Comparison of Survey and Administrative Data," Journal of Labor Economics, University of Chicago Press, vol. 25(3), pages 513-551.
    19. repec:mpr:mprres:4176 is not listed on IDEAS
    20. Greenberg, David & Moffitt, Robert & Friedmann, John, 1981. "Underreporting and Experimental Effects on Work Effort: Evidence from the Gary Income Maintenance Experiment," The Review of Economics and Statistics, MIT Press, vol. 63(4), pages 581-589, November.
    21. Geoffrey L. Wallace & Robert Haveman, 2007. "The implications of differences between employer and worker employment/earnings reports for policy evaluation," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 26(4), pages 737-754.
    22. Irma Perez-Johnson & Quinn Moore & Robert Santillano, "undated". "Improving the Effectiveness of Individual Training Accounts: Long-Term Findings from an Experimental Evaluation of Three Service Delivery Models," Mathematica Policy Research Reports ddb772ea2aa242a1a74579eb0, Mathematica Policy Research.
    23. Bound, John & Brown, Charles & Duncan, Greg J & Rodgers, Willard L, 1994. "Evidence on the Validity of Cross-Sectional and Longitudinal Labor Market Data," Journal of Labor Economics, University of Chicago Press, vol. 12(3), pages 345-368, July.
    24. Greenberg, David & Halsey, Harlan, 1983. "Systematic Misreporting and Effects of Income Maintenance Experiments on Work Effort: Evidence from the Seattle-Denver Experiment," Journal of Labor Economics, University of Chicago Press, vol. 1(4), pages 380-407, October.
    25. repec:mpr:mprres:3604 is not listed on IDEAS
    26. repec:mpr:mprres:1128 is not listed on IDEAS
    27. Walter Corson & Paul Decker & Shari Miller Dunstan & Stuart Kerachsky, 1991. "Pennsylvania Reemployment Bonus Demonstration: Data Set," Mathematica Policy Research Reports 9d5cc87225244f62a34d9ee2e, Mathematica Policy Research.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Quinn Moore & Irma Perez-Johnson & Robert Santillano, 2018. "Decomposing Differences in Impacts on Survey- and Administrative-Measured Earnings From a Job Training Voucher Experiment," Evaluation Review, vol. 42(5-6), pages 515-549, October.
    2. Kaitlin Anderson & Gema Zamarro & Jennifer Steele & Trey Miller, 2021. "Comparing Performance of Methods to Deal With Differential Attrition in Randomized Experimental Evaluations," Evaluation Review, vol. 45(1-2), pages 70-104, February.
    3. Burt S. Barnow & Jeffrey Smith, 2015. "Employment and Training Programs," NBER Chapters, in: Economics of Means-Tested Transfer Programs in the United States, Volume 2, pages 127-234, National Bureau of Economic Research, Inc.
    4. Annalisa Mastri & Dana Rotz & Elias S. Hanno, "undated". "Comparing Job Training Impact Estimates Using Survey and Administrative Data," Mathematica Policy Research Reports 157778d936f848ddb0b4e8e32, Mathematica Policy Research.
    5. Sheena McConnell & Peter Z. Schochet & Dana Rotz & Ken Fortson & Paul Burkander & Annalisa Mastri, 2021. "The Effects of Employment Counseling on Labor Market Outcomes for Adults and Dislocated Workers: Evidence from a Nationally Representative Experiment," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 40(4), pages 1249-1287, September.
    6. Fredrik Andersson & Harry J. Holzer & Julia I. Lane & David Rosenblum & Jeffrey Smith, 2024. "Does Federally Funded Job Training Work? Nonexperimental Estimates of WIA Training Impacts Using Longitudinal Data on Workers and Firms," Journal of Human Resources, University of Wisconsin Press, vol. 59(4), pages 1244-1283.
    7. Burt S. Barnow & David H. Greenberg, 2019. "Special Issue Editors’ Essay," Evaluation Review, vol. 43(5), pages 231-265, October.
    8. Judith Scott-Clayton & Qiao Wen, 2019. "Estimating Returns to College Attainment: Comparing Survey and State Administrative Data–Based Estimates," Evaluation Review, vol. 43(5), pages 266-306, October.
    9. Reuben Ford & Douwêrê Grékou & Isaac Kwakye & Taylor Shek-wai Hui, 2018. "The Sensitivity of Impact Estimates to Data Sources Used: Analysis From an Access to Postsecondary Education Experiment," Evaluation Review, vol. 42(5-6), pages 575-615, October.
    10. Richard Dorsett & Richard Hendra & Philip K. Robins, 2018. "Using Administrative Data to Explore the Effect of Survey Nonresponse in the UK Employment Retention and Advancement Demonstration," Evaluation Review, vol. 42(5-6), pages 491-514, October.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Quinn Moore & Irma Perez-Johnson & Robert Santillano, 2018. "Decomposing Differences in Impacts on Survey- and Administrative-Measured Earnings From a Job Training Voucher Experiment," Evaluation Review, vol. 42(5-6), pages 515-549, October.
    2. Katharine G. Abraham & John Haltiwanger & Kristin Sandusky & James R. Spletzer, 2013. "Exploring Differences in Employment between Household and Establishment Data," Journal of Labor Economics, University of Chicago Press, vol. 31(S1), pages 129-172.
    3. Brownstone, David & Valletta, Robert G, 1996. "Modeling Earnings Measurement Error: A Multiple Imputation Approach," The Review of Economics and Statistics, MIT Press, vol. 78(4), pages 705-717, November.
    4. Fredrik Andersson & Harry J. Holzer & Julia I. Lane & David Rosenblum & Jeffrey Smith, 2024. "Does Federally Funded Job Training Work? Nonexperimental Estimates of WIA Training Impacts Using Longitudinal Data on Workers and Firms," Journal of Human Resources, University of Wisconsin Press, vol. 59(4), pages 1244-1283.
    5. Lynn, Peter & Jäckle, Annette & Sala, Emanuela & Jenkins, Stephen P., 2004. "Validation of survey data on income and employment: the ISMIE experience," ISER Working Paper Series 2004-14, Institute for Social and Economic Research.
    6. Burt S. Barnow & Jeffrey Smith, 2015. "Employment and Training Programs," NBER Chapters, in: Economics of Means-Tested Transfer Programs in the United States, Volume 2, pages 127-234, National Bureau of Economic Research, Inc.
    7. Jesse Rothstein & Till von Wachter, 2016. "Social Experiments in the Labor Market," NBER Working Papers 22585, National Bureau of Economic Research, Inc.
    8. Andrew S. Green, 2017. "Hours Off the Clock," Working Papers 17-44, Center for Economic Studies, U.S. Census Bureau.
    9. Kornfeld, Robert & Bloom, Howard S, 1999. "Measuring Program Impacts on Earnings and Employment: Do Unemployment Insurance Wage Reports from Employers Agree with Surveys of Individuals?," Journal of Labor Economics, University of Chicago Press, vol. 17(1), pages 168-197, January.
    10. Bollinger, Christopher R. & Hirsch, Barry & Hokayem, Charles M. & Ziliak, James P., 2018. "Trouble in the Tails? What We Know about Earnings Nonresponse Thirty Years after Lillard, Smith, and Welch," IZA Discussion Papers 11710, Institute of Labor Economics (IZA).
    11. John Abowd & Martha Stinson, 2011. "Estimating Measurement Error in SIPP Annual Job Earnings: A Comparison of Census Bureau Survey and SSA Administrative Data," Working Papers 11-20, Center for Economic Studies, U.S. Census Bureau.
    12. Cyrille Hagneré & Arnaud Lefranc, 2006. "Étendue et conséquences des erreurs de mesure dans les données individuelles d'enquête : une évaluation à partir des données appariées des enquêtes emploi et revenus fiscaux" [Extent and consequences of measurement errors in individual survey data: an evaluation based on matched labor force and tax income survey data], Economie & Prévision, La Documentation Française, vol. 0(3), pages 131-154.
    13. Lachowska, Marta & Mas, Alexandre & Woodbury, Stephen A., 2022. "How reliable are administrative reports of paid work hours?," Labour Economics, Elsevier, vol. 75(C).
    14. Bollinger, Christopher R, 1998. "Measurement Error in the Current Population Survey: A Nonparametric Look," Journal of Labor Economics, University of Chicago Press, vol. 16(3), pages 576-594, July.
    15. Jungmin Lee & Sokbae Lee, 2012. "Does it Matter WHO Responded to the Survey? Trends in the U.S. Gender Earnings Gap Revisited," ILR Review, Cornell University, ILR School, vol. 65(1), pages 148-160, January.
    16. Jesse Bricker & Gary V. Engelhardt, 2007. "Measurement Error in Earnings Data in the Health and Retirement Study," Working Papers, Center for Retirement Research at Boston College wp2007-16, Center for Retirement Research, revised Oct 2007.
    17. Kristensen, Nicolai & Westergård-Nielsen, Niels C., 2006. "A Large-Scale Validation Study of Measurement Errors in Longitudinal Survey Data," IZA Discussion Papers 2329, Institute of Labor Economics (IZA).
    18. Paulus, Alari, 2015. "Tax evasion and measurement error: An econometric analysis of survey data linked with tax records," ISER Working Paper Series 2015-10, Institute for Social and Economic Research.
    19. Adam Bee & Joshua Mitchell & Nikolas Mittag & Jonathan Rothbaum & Carl Sanders & Lawrence Schmidt & Matthew Unrath, 2023. "National Experimental Wellbeing Statistics - Version 1," Working Papers 23-04, Center for Economic Studies, U.S. Census Bureau.
    20. ChangHwan Kim & Christopher R. Tamborini, 2014. "Response Error in Earnings," Sociological Methods & Research, vol. 43(1), pages 39-72, February.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:sae:evarev:v:39:y:2015:i:2:p:179-228. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact SAGE Publications.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.