Printed from https://ideas.repec.org/p/iza/izadps/dp13584.html

Are Program Participants Good Evaluators?

Author

Listed:
  • Smith, Jeffrey A.

    (University of Wisconsin-Madison)

  • Whalley, Alexander

    (University of Calgary)

  • Wilcox, Nathaniel T.

    (Appalachian State University)

Abstract

How well do program participants assess program performance ex post? In this paper we compare participant evaluations based on survey responses to econometric impact estimates obtained using data from the experimental evaluation of the U.S. Job Training Partnership Act. We have two main findings. First, the participant evaluations are unrelated to the econometric impact estimates. Second, the participant evaluations do covary with impact proxies such as service intensity, outcome levels, and before-after outcome differences. Our results suggest that program participants behave as 'lay scientists' who seek to estimate the impact of the program but face cognitive challenges in doing so.
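The comparison described in the abstract can be illustrated with a toy simulation. Everything below is hypothetical synthetic data, not the JTPA data, and the variable names are illustrative assumptions: site-level experimental impact estimates are computed as treated-minus-control mean differences, while participant evaluations are generated so that they track outcome *levels* rather than impacts, mimicking the pattern the paper reports.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: participants across sites, randomized to treatment/control.
n_sites, n_per_site = 20, 200
site = np.repeat(np.arange(n_sites), n_per_site)
treated = rng.integers(0, 2, size=site.size).astype(bool)

# Baseline earnings and true impacts both vary across sites.
baseline = rng.normal(10_000.0, 3_000.0, size=n_sites)
true_impact = rng.normal(500.0, 300.0, size=n_sites)
earnings = baseline[site] + true_impact[site] * treated \
    + rng.normal(0.0, 2_000.0, size=site.size)

# Treated participants report a subjective evaluation (1 = "program helped").
# Here it tracks the outcome level, not the impact.
evaluation = (earnings > np.median(earnings)).astype(float)

# Experimental impact estimate per site: treated mean minus control mean.
impact_hat = np.array([
    earnings[(site == s) & treated].mean()
    - earnings[(site == s) & ~treated].mean()
    for s in range(n_sites)
])

# Mean participant evaluation per site (treated only; only they experienced it).
eval_mean = np.array([
    evaluation[(site == s) & treated].mean() for s in range(n_sites)
])

corr = np.corrcoef(impact_hat, eval_mean)[0, 1]
print(f"correlation between impact estimates and evaluations: {corr:.2f}")
```

Because the simulated evaluations respond to earnings levels (dominated by site baselines) rather than to the experimentally identified impacts, the site-level correlation between the two measures tends to be weak, which is the kind of divergence the paper documents.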

Suggested Citation

  • Smith, Jeffrey A. & Whalley, Alexander & Wilcox, Nathaniel T., 2020. "Are Program Participants Good Evaluators?," IZA Discussion Papers 13584, Institute of Labor Economics (IZA).
  • Handle: RePEc:iza:izadps:dp13584
    Download full text from publisher

    File URL: https://docs.iza.org/dp13584.pdf
    Download Restriction: no

    References listed on IDEAS

    1. James J. Heckman & Carolyn J. Heinrich & Pascal Courty & Gerald Marschke & Jeffrey Smith (ed.), 2011. "The Performance of Performance Standards," Books from Upjohn Press, W.E. Upjohn Institute for Employment Research, number tpps, August.
    2. Kornfeld, Robert & Bloom, Howard S, 1999. "Measuring Program Impacts on Earnings and Employment: Do Unemployment Insurance Wage Reports from Employers Agree with Surveys of Individuals?," Journal of Labor Economics, University of Chicago Press, vol. 17(1), pages 168-197, January.
    3. Rafael Di Tella & Sebastian Galiani & Ernesto Schargrodsky, 2007. "The Formation of Beliefs: Evidence from the Allocation of Land Titles to Squatters," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 122(1), pages 209-241.
    4. James J. Heckman & Jeffrey A. Smith, 2004. "The Determinants of Participation in a Social Program: Evidence from a Prototypical Job Training Program," Journal of Labor Economics, University of Chicago Press, vol. 22(2), pages 243-298, April.
    5. David McKenzie, 2018. "Can Business Owners Form Accurate Counterfactuals? Eliciting Treatment and Control Beliefs About Their Outcomes in the Alternative Treatment Status," Journal of Business & Economic Statistics, Taylor & Francis Journals, vol. 36(4), pages 714-722, October.
    6. James J. Heckman & Jeffrey A. Smith, 1998. "Evaluating the Welfare State," NBER Working Papers 6542, National Bureau of Economic Research, Inc.
    7. F. Thomas Juster, 1966. "Consumer Buying Intentions and Purchase Probability: An Experiment in Survey Design," NBER Books, National Bureau of Economic Research, Inc, number just66-2, May.
    8. Di Tella, Rafael & Galiani, Sebastian & Schargrodsky, Ernesto, 2012. "Reality versus propaganda in the formation of beliefs about privatization," Journal of Public Economics, Elsevier, vol. 96(5), pages 553-567.
    9. Peter Z. Schochet & John Burghardt & Sheena McConnell, 2008. "Does Job Corps Work? Impact Findings from the National Job Corps Study," American Economic Review, American Economic Association, vol. 98(5), pages 1864-1886, December.
    10. James J. Heckman & Jeffrey A. Smith, 1999. "The Pre-Program Earnings Dip and the Determinants of Participation in a Social Program: Implications for Simple Program Evaluation Strategies," NBER Working Papers 6983, National Bureau of Economic Research, Inc.
    11. James Heckman, 1997. "Instrumental Variables: A Study of Implicit Behavioral Assumptions Used in Making Program Evaluations," Journal of Human Resources, University of Wisconsin Press, vol. 32(3), pages 441-462.
    12. Scott E. Carrell & James E. West, 2010. "Does Professor Quality Matter? Evidence from Random Assignment of Students to Professors," Journal of Political Economy, University of Chicago Press, vol. 118(3), pages 409-432, June.
    13. Heckman, James J & Smith, Jeffrey A, 1999. "The Pre-programme Earnings Dip and the Determinants of Participation in a Social Programme. Implications for Simple Programme Evaluation Strategies," Economic Journal, Royal Economic Society, vol. 109(457), pages 313-348, July.
    14. Bell, Stephen H. & Orr, Larry L., 2002. "Screening (and creaming?) applicants to job training programs: the AFDC homemaker-home health aide demonstrations," Labour Economics, Elsevier, vol. 9(2), pages 279-301, April.
    15. James J. Heckman & Jeffrey Smith, 2000. "The Sensitivity of Experimental Impact Estimates (Evidence from the National JTPA Study)," NBER Chapters, in: Youth Employment and Joblessness in Advanced Countries, pages 331-356, National Bureau of Economic Research, Inc.
    16. Smith, Vernon L, 1982. "Microeconomic Systems as an Experimental Science," American Economic Review, American Economic Association, vol. 72(5), pages 923-955, December.
    17. Djebbari, Habiba & Smith, Jeffrey, 2008. "Heterogeneous impacts in PROGRESA," Journal of Econometrics, Elsevier, vol. 145(1-2), pages 64-80, July.
    18. repec:mpr:mprres:6097 is not listed on IDEAS
    19. F. Thomas Juster, 1964. "Anticipations and Purchases: An Analysis of Consumer Behavior," NBER Books, National Bureau of Economic Research, Inc, number just64-1, May.
    20. Wilcox, Nathaniel T, 1993. "Lottery Choice: Incentives, Complexity and Decision Time," Economic Journal, Royal Economic Society, vol. 103(421), pages 1397-1417, November.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Jeffrey Smith, 2022. "Treatment Effect Heterogeneity," Evaluation Review, vol. 46(5), pages 652-677, October.
    2. Maibom, Jonas, 2021. "The Danish Labor Market Experiments: Methods and Findings," Nationaløkonomisk tidsskrift, Nationaløkonomisk Forening, vol. 2021(1), pages 1-21.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Burt S. Barnow & Jeffrey Smith, 2015. "Employment and Training Programs," NBER Chapters, in: Economics of Means-Tested Transfer Programs in the United States, Volume 2, pages 127-234, National Bureau of Economic Research, Inc.
    2. Andersson, Fredrik W. & Holzer, Harry J. & Lane, Julia & Rosenblum, David & Smith, Jeffrey A., 2013. "Does Federally-Funded Job Training Work? Nonexperimental Estimates of WIA Training Impacts Using Longitudinal Data on Workers and Firms," IZA Discussion Papers 7621, Institute of Labor Economics (IZA).
    3. Barbara Sianesi, 2014. "Dealing with randomisation bias in a social experiment: the case of ERA," IFS Working Papers W14/10, Institute for Fiscal Studies.
    4. Sianesi, Barbara, 2017. "Evidence of randomisation bias in a large-scale social experiment: The case of ERA," Journal of Econometrics, Elsevier, vol. 198(1), pages 41-64.
    5. Lechner Michael & Miquel Ruth & Wunsch Conny, 2007. "The Curse and Blessing of Training the Unemployed in a Changing Economy: The Case of East Germany After Unification," German Economic Review, De Gruyter, vol. 8(4), pages 468-509, December.
    6. James J. Heckman & Jeffrey A. Smith, 1999. "The Pre-Program Earnings Dip and the Determinants of Participation in a Social Program: Implications for Simple Program Evaluation Strategies," NBER Working Papers 6983, National Bureau of Economic Research, Inc.
    7. Zafar E. Nazarov, 2016. "Can Benefits and Work Incentives Counseling be a Path to Future Economic Self-Sufficiency for Individuals with Disabilities?," Journal of Labor Research, Springer, vol. 37(2), pages 211-234, June.
    8. Hartmut Lehmann & Jochen Kluve, 2010. "Assessing Active Labour Market Policies in Transition Economies," AIEL Series in Labour Economics, in: Floro Ernesto Caroleo & Francesco Pastore (ed.), The Labour Market Impact of the EU Enlargement, pages 275-307, Springer.
    9. Marco Caliendo & Sabine Kopeinig, 2008. "Some Practical Guidance For The Implementation Of Propensity Score Matching," Journal of Economic Surveys, Wiley Blackwell, vol. 22(1), pages 31-72, February.
    10. Torres, Miguel Matos & Clegg, L. Jeremy & Varum, Celeste Amorim, 2016. "The missing link between awareness and use in the uptake of pro-internationalization incentives," International Business Review, Elsevier, vol. 25(2), pages 495-510.
    11. Steven Raphael & Michael A. Stoll, 2006. "Evaluating the Effectiveness of the Massachusetts Workforce Development System Using No-Shows as a Nonexperimental Comparison Group," Evaluation Review, vol. 30(4), pages 379-429, August.
    12. Yonatan Eyal, 2020. "Self-Assessment Variables as a Source of Information in the Evaluation of Intervention Programs: A Theoretical and Methodological Framework," SAGE Open, vol. 10(1), pages 21582440198, January.
    13. Gockel, Ryan P. & Cullen, Alison C., 2013. "Willing, but Unable: Determinants of Participation Rates for Training Workshops in Central Vietnam," Asian Journal of Agriculture and Rural Development, Asian Economic and Social Society (AESS), vol. 3(10), pages 1-15, October.
    14. Jeffrey Smith, 2000. "A Critical Survey of Empirical Methods for Evaluating Active Labor Market Policies," Swiss Journal of Economics and Statistics (SJES), Swiss Society of Economics and Statistics (SSES), vol. 136(III), pages 247-268, September.
    15. Jeffrey A. Smith, 2018. "The usefulness of experiments," IZA World of Labor, Institute of Labor Economics (IZA), pages 436-436, May.
    16. Charles Blessings Laurence Jumbe & Arild Angelsen, 2006. "Do the Poor Benefit from Devolution Policies? Evidence from Malawi’s Forest Co-Management Program," Land Economics, University of Wisconsin Press, vol. 82(4), pages 562-581.
    17. Andreas Ortmann, 2013. "Episodes from the Early History of Experimentation in Economics," Discussion Papers 2013-34, School of Economics, The University of New South Wales.
    18. Dolton, Peter & Smith, Jeffrey A., 2011. "The Impact of the UK New Deal for Lone Parents on Benefit Receipt," IZA Discussion Papers 5491, Institute of Labor Economics (IZA).
    19. Robert J. LaLonde, 2003. "Employment and Training Programs," NBER Chapters, in: Means-Tested Transfer Programs in the United States, pages 517-586, National Bureau of Economic Research, Inc.
    20. Gabrielle Wills, 2016. "Principal leadership changes in South Africa: Investigating their consequences for school performance," Working Papers 01/2016, Stellenbosch University, Department of Economics.

    More about this item

    Keywords

    program evaluation; participant evaluation; surveys

    JEL classification:

    • I28 - Health, Education, and Welfare - - Education - - - Government Policy
    • J24 - Labor and Demographic Economics - - Demand and Supply of Labor - - - Human Capital; Skills; Occupational Choice; Labor Productivity
    • C83 - Mathematical and Quantitative Methods - - Data Collection and Data Estimation Methodology; Computer Programs - - - Survey Methods; Sampling Methods


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:iza:izadps:dp13584. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Holger Hinte (email available below). General contact details of provider: https://edirc.repec.org/data/izaaade.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.