
Flaws in Evaluations of Social Programs

Authors

  • David Greenberg
  • Burt S. Barnow

Abstract

Background: This article describes eight flaws that occur in impact evaluations.

Method: The eight flaws are grouped into four categories according to how they affect impact estimates: statistical imprecision; biases; failure of impact estimates to measure the effects of the planned treatment; and flaws that result from weakening an evaluation design. Each flaw is illustrated with examples from social experiments. Although the illustrations are drawn from randomized controlled trials (RCTs), the flaws can occur in any type of evaluation; we use RCTs to illustrate because people sometimes assume that RCTs are immune to such problems. A summary table lists the flaws, indicates the circumstances under which they occur, notes their potential seriousness, and suggests approaches for minimizing them.

Results: Some of the flaws are minor hurdles, while others cause evaluations to fail, that is, to be unable to provide a valid test of the hypothesis of interest. The flaws that appear to occur most frequently are response bias resulting from attrition, failure to adequately implement the treatment as designed, and samples too small to detect impacts. The last of these can result from insufficient marketing, too small an initial target group, disinterest in participating on the part of the target group (if the treatment is voluntary), or attrition.

Conclusion: To a considerable degree, the flaws we discuss can be minimized. For instance, implementation failures and undersized samples can usually be avoided with sufficient planning, and response bias can often be mitigated, for example through increased follow-up efforts when conducting surveys.
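
The "too small a sample" flaw the abstract highlights is, at bottom, a power calculation. The sketch below is a minimal illustration of that arithmetic, not material from the article: the effect size, error rates, and attrition figure are assumed purely for the example. It shows how a required sample per arm is derived for a two-arm RCT and how attrition erodes the power of a design that was adequate on paper.

# Minimal power-analysis sketch for a two-arm RCT (illustrative values only;
# none of these numbers come from Greenberg & Barnow's article).
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# Sample needed per arm to detect a small standardized effect (Cohen's d = 0.2)
# with a 5% two-sided test, 80% power, and equally sized arms.
n_per_arm = analysis.solve_power(effect_size=0.2, alpha=0.05, power=0.8, ratio=1.0)
print(f"Required sample per arm: {n_per_arm:.0f}")  # about 393 per arm

# The same machinery shows how attrition undermines an initially adequate
# design: losing 30% of each arm to follow-up drops power well below the
# conventional 0.8 threshold.
power_after_attrition = analysis.power(
    effect_size=0.2, nobs1=n_per_arm * 0.7, alpha=0.05, ratio=1.0
)
print(f"Power after 30% attrition: {power_after_attrition:.2f}")  # about 0.65

The mitigation the authors suggest, increased follow-up effort in surveys, works directly on the sample-retention term in this calculation.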

Suggested Citation

  • David Greenberg & Burt S. Barnow, 2014. "Flaws in Evaluations of Social Programs," Evaluation Review, vol. 38(5), pages 359-387, October.
  • Handle: RePEc:sae:evarev:v:38:y:2014:i:5:p:359-387
    DOI: 10.1177/0193841X14545782

    Download full text from publisher

    File URL: https://journals.sagepub.com/doi/10.1177/0193841X14545782
    Download Restriction: no

    File URL: https://libkey.io/10.1177/0193841X14545782?utm_source=ideas
    LibKey link: if access is restricted and your library subscribes to this service, LibKey will redirect you to a version you can access through your library subscription.

    References listed on IDEAS

    1. Miriam Bruhn & David McKenzie, 2009. "In Pursuit of Balance: Randomization in Practice in Development Field Experiments," American Economic Journal: Applied Economics, American Economic Association, vol. 1(4), pages 200-232, October.
    2. Manski, Charles F, 1990. "Nonparametric Bounds on Treatment Effects," American Economic Review, American Economic Association, vol. 80(2), pages 319-323, May.
    3. David Greenberg & Mark Shroder & Matthew Onstott, 1999. "The Social Experiment Market," Journal of Economic Perspectives, American Economic Association, vol. 13(3), pages 157-172, Summer.
    4. Robert B. Olsen & Larry L. Orr & Stephen H. Bell & Elizabeth A. Stuart, 2013. "External Validity in Policy Evaluations That Choose Sites Purposively," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 32(1), pages 107-121, January.
    5. repec:mpr:mprres:1795 is not listed on IDEAS
    6. repec:mpr:mprres:3656 is not listed on IDEAS
    7. Glenn W. Harrison & John A. List, 2004. "Field Experiments," Journal of Economic Literature, American Economic Association, vol. 42(4), pages 1009-1055, December.
    8. Burt S. Barnow & David Greenberg, 2013. "Replication issues in social experiments: lessons from US labor market programs," Journal for Labour Market Research, Springer / Institute for Employment Research (IAB), vol. 46(3), pages 239-252, September.
    9. Rachel Glennerster & Kudzai Takavarasha, 2013. "Running Randomized Evaluations: A Practical Guide," Economics Books, Princeton University Press, edition 1, number 10085.
    10. Joshua D. Angrist & Victor Lavy, 2002. "The Effect of High School Matriculation Awards: Evidence from Randomized Trials," NBER Working Papers 9389, National Bureau of Economic Research, Inc.
    11. John A. List, 2011. "Why Economists Should Conduct Field Experiments and 14 Tips for Pulling One Off," Journal of Economic Perspectives, American Economic Association, vol. 25(3), pages 3-16, Summer.
    12. repec:mpr:mprres:6088 is not listed on IDEAS
    13. Devine, Joel A. & Brody, Charles J. & Wright, James D., 1997. "Evaluating an alcohol and drug treatment program for the homeless: An econometric approach," Evaluation and Program Planning, Elsevier, vol. 20(2), pages 205-215, May.
    14. David H. Greenberg & Charles Michalopoulos & Philip K. Robins, 2003. "A Meta-Analysis of Government-Sponsored Training Programs," ILR Review, Cornell University, ILR School, vol. 57(1), pages 31-53, October.
    15. Michael Rosholm & Lars Skipper, 2009. "Is labour market training a curse for the unemployed? Evidence from a social experiment," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 24(2), pages 338-365, March.
    16. Myles Maxfield & Laura Castner & Vida Maralani & Mary Vencill, 2003. "The Quantum Opportunity Program Demonstration: Implementation Findings," Mathematica Policy Research Reports 454e8a1ad16943249a9f9577d, Mathematica Policy Research.
    17. repec:feb:artefa:0110 is not listed on IDEAS
    18. Howard S. Bloom & Carolyn J. Hill & James A. Riccio, 2003. "Linking program implementation and effectiveness: Lessons from a pooled sample of welfare-to-work experiments," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 22(4), pages 551-575.
    19. Neil S. Seftor & Arif Mamun & Allen Schirm, "undated". "The Impacts of Regular Upward Bound on Postsecondary Outcomes 7-9 Years After Scheduled High School Graduation," Mathematica Policy Research Reports 2ee3fcef720a4257a79beeea2, Mathematica Policy Research.
    20. Abhijit Banerjee & Esther Duflo & Rachel Glennerster, 2011. "Is Decentralized Iron Fortification a Feasible Option to Fight Anemia Among the Poorest?," NBER Chapters, in: Explorations in the Economics of Aging, pages 317-344, National Bureau of Economic Research, Inc.
    21. John Newman & Menno Pradhan & Laura B. Rawlings & Geert Ridder & Ramiro Coa & Jose Luis Evia, 2002. "An Impact Evaluation of Education, Health, and Water Supply Investments by the Bolivian Social Investment Fund," The World Bank Economic Review, World Bank, vol. 16(2), pages 241-274, August.
    22. Mark Dynarski & Robert Wood, 1997. "Helping High-Risk Youths: Results from the Alternative Schools Demonstration Program," Mathematica Policy Research Reports b10f23da06064e5ca34f56a27, Mathematica Policy Research.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Ben Weidmann & Luke Miratrix, 2021. "Missing, presumed different: Quantifying the risk of attrition bias in education evaluations," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 184(2), pages 732-760, April.
    2. John Deke & Hanley Chiang, 2017. "The WWC Attrition Standard," Evaluation Review, vol. 41(2), pages 130-154, April.
    3. Kaitlin Anderson & Gema Zamarro & Jennifer Steele & Trey Miller, 2021. "Comparing Performance of Methods to Deal With Differential Attrition in Randomized Experimental Evaluations," Evaluation Review, vol. 45(1-2), pages 70-104, February.
    4. Arnaud Vaganay, 2016. "Cluster Sampling Bias in Government-Sponsored Evaluations: A Correlational Study of Employment and Welfare Pilots in England," PLOS ONE, Public Library of Science, vol. 11(8), pages 1-21, August.
    5. John Deke & Thomas Wei & Tim Kautz, "undated". "Asymdystopia: The Threat of Small Biases in Evaluations of Education Interventions that Need to be Powered to Detect Small Impacts," Mathematica Policy Research Reports f0ff8f86e3c34dc8baaf22b56, Mathematica Policy Research.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    2. Grosch, Kerstin & Haeckl, Simone & Rau, Holger & Preuss, Paul, 2023. "A Guide to Conducting School Experiments: Expert Insights and Best Practices for Effective Implementation," UiS Working Papers in Economics and Finance 2023/2, University of Stavanger.
    3. Guido Friebel & Matthias Heinz & Miriam Krueger & Nikolay Zubanov, 2017. "Team Incentives and Performance: Evidence from a Retail Chain," American Economic Review, American Economic Association, vol. 107(8), pages 2168-2203, August.
    4. Kjetil Bjorvatn & Alexander W. Cappelen & Linda Helgesson Sekei & Erik Ø. Sørensen & Bertil Tungodden, 2020. "Teaching Through Television: Experimental Evidence on Entrepreneurship Education in Tanzania," Management Science, INFORMS, vol. 66(6), pages 2308-2325, June.
    5. Pedro Carneiro & Sokbae Lee & Daniel Wilhelm, 2020. "Optimal data collection for randomized control trials [Microcredit impacts: Evidence from a randomized microcredit program placement experiment by Compartamos Banco]," The Econometrics Journal, Royal Economic Society, vol. 23(1), pages 1-31.
    6. Susan Athey & Raj Chetty & Guido Imbens, 2020. "Combining Experimental and Observational Data to Estimate Treatment Effects on Long Term Outcomes," Papers 2006.09676, arXiv.org.
    7. Omar Al-Ubaydli & John List, 2016. "Field Experiments in Markets," Artefactual Field Experiments j0002, The Field Experiments Website.
    8. Lambarraa, Fatima & Riener, Gerhard, 2015. "On the norms of charitable giving in Islam: Two field experiments in Morocco," Journal of Economic Behavior & Organization, Elsevier, vol. 118(C), pages 69-84.
    9. Baldwin, Kate & Bhavnani, Rikhil R., 2013. "Ancillary Experiments: Opportunities and Challenges," WIDER Working Paper Series 024, World Institute for Development Economic Research (UNU-WIDER).
    10. Spermann, Alexander & Strotmann, Harald, 2005. "The Targeted Negative Income Tax (TNIT) in Germany: Evidence from a Quasi Experiment," ZEW Discussion Papers 05-68, ZEW - Leibniz Centre for European Economic Research.
    11. Omar Al-Ubaydli & John List, 2013. "On the Generalizability of Experimental Results in Economics: With A Response To Camerer," Artefactual Field Experiments j0001, The Field Experiments Website.
    12. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    13. Daniel Gubits & David Stapleton & Stephen Bell & Michelle Wood & Denise Hoffman & Sarah Croake & David R. Mann & Judy Geyer & David Greenberg & Austin Nichols & Andrew McGuirk & Meg Carroll & Utsav Ka, "undated". "BOND Implementation and Evaluation: Final Evaluation Report, Volume 1," Mathematica Policy Research Reports fac39cd85b944c528e7acbb5d, Mathematica Policy Research.
    14. Heather Koball & Robin Dion & Andrew Gothro & Maura Bardos & Amy Dworsky & Jiffy Lansing & Matthew Stagner & Danijela Korom-Djakovic & Carla Herrera & Alice Elizabeth Manning, "undated". "Synthesis of Research and Resources to Support At-Risk Youth," Mathematica Policy Research Reports 8353b63284d94941bcb778e1c, Mathematica Policy Research.
    15. Gosnell, Greer K., 2018. "Communicating Resourcefully: A Natural Field Experiment on Environmental Framing and Cognitive Dissonance in Going Paperless," Ecological Economics, Elsevier, vol. 154(C), pages 128-144.
    16. Jason T. Kerwin & Rebecca L. Thornton, 2021. "Making the Grade: The Sensitivity of Education Program Effectiveness to Input Choices and Outcome Measures," The Review of Economics and Statistics, MIT Press, vol. 103(2), pages 251-264, May.
    17. Su, Duan & Wang, Yacan & Yang, Nan & Wang, Xianghong, 2020. "Promoting considerate parking behavior in dockless bike-sharing: An experimental study," Transportation Research Part A: Policy and Practice, Elsevier, vol. 140(C), pages 153-165.
    18. Carpena, Fenella & Zia, Bilal, 2020. "The causal mechanism of financial education: Evidence from mediation analysis," Journal of Economic Behavior & Organization, Elsevier, vol. 177(C), pages 143-184.
    19. Bordunos, A. & Kokoulina, L. & Ermolaeva, L., 2015. "Role of enterprise gamified system in fostering innovation capacity: A field experiment," Working Papers 6420, Graduate School of Management, St. Petersburg State University.
    20. Pedro Carneiro & Sokbae (Simon) Lee & Daniel Wilhelm, 2016. "Optimal data collection for randomized control trials," CeMMAP working papers 15/16, Institute for Fiscal Studies.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:sae:evarev:v:38:y:2014:i:5:p:359-387. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: SAGE Publications (email available below).

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.