
Nonexperimental Versus Experimental Estimates of Earnings Impacts

Authors

  • Steven Glazerman
  • Dan M. Levy
  • David Myers

Abstract

Randomized experiments are desirable but not always possible for social policy research, so "quasi-experiments" have become popular alternatives.

Suggested Citation

  • Steven Glazerman & Dan M. Levy & David Myers, "undated". "Nonexperimental Versus Experimental Estimates of Earnings Impacts," Mathematica Policy Research Reports 7c8bd68ac8db47caa57c70ee11bf988e, Mathematica Policy Research.
  • Handle: RePEc:mpr:mprres:7c8bd68ac8db47caa57c70ee11bf988e

    Download full text from publisher

    File URL: http://journals.sagepub.com/doi/abs/10.1177/0002716203254879
    Download Restriction: no

    References listed on IDEAS

    1. LaLonde, Robert J., 1986. "Evaluating the Econometric Evaluations of Training Programs with Experimental Data," American Economic Review, American Economic Association, vol. 76(4), pages 604-620, September.
    2. V. Joseph Hotz & Guido W. Imbens & Jacob A. Klerman, 2000. "The Long-Term Gains from GAIN: A Re-Analysis of the Impacts of the California GAIN Program," NBER Working Papers 8007, National Bureau of Economic Research, Inc.
    3. Arthur J. Reynolds & Judy A. Temple, 1995. "Quasi-Experimental Estimates of the Effects of a Preschool Intervention," Evaluation Review, vol. 19(4), pages 347-373, August.
    4. Espen Bratberg & Astrid Grasdal & Alf Erling Risa, 2002. "Evaluating Social Policy by Experimental and Nonexperimental Methods," Scandinavian Journal of Economics, Wiley Blackwell, vol. 104(1), pages 147-171, March.
    5. Sheena McConnell & Steven Glazerman, 2001. "National Job Corps Study: The Benefits and Costs of Job Corps," Mathematica Policy Research Reports 19ff8678a108410587c5dfad0, Mathematica Policy Research.
    6. Stephen H. Bell & Larry L. Orr & John D. Blomquist & Glen G. Cain, 1995. "Program Applicants as a Comparison Group in Evaluating Training Programs: Theory and a Test," Books from Upjohn Press, W.E. Upjohn Institute for Employment Research, number pacg, August.
    7. Heckman, J. J. & Hotz, V. J., 1988. "Choosing Among Alternative Nonexperimental Methods for Estimating the Impact of Social Programs: The Case of Manpower Training," University of Chicago - Economics Research Center 88-12.
    8. V. Joseph Hotz & Guido W. Imbens & Julie H. Mortimer, 1999. "Predicting the Efficacy of Future Training Programs Using Past Experiences," NBER Technical Working Papers 0238, National Bureau of Economic Research, Inc.
    9. James Heckman & Hidehiko Ichimura & Jeffrey Smith & Petra Todd, 1998. "Characterizing Selection Bias Using Experimental Data," Econometrica, Econometric Society, vol. 66(5), pages 1017-1098, September.
    10. Thomas Fraker & Rebecca Maynard, 1987. "The Adequacy of Comparison Group Designs for Evaluations of Employment-Related Programs," Journal of Human Resources, University of Wisconsin Press, vol. 22(2), pages 194-227.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Vivian C. Wong & Peter M. Steiner & Kylie L. Anglin, 2018. "What Can Be Learned From Empirical Evaluations of Nonexperimental Methods?," Evaluation Review, vol. 42(2), pages 147-175, April.
    2. Katherine Baicker & Theodore Svoronos, 2019. "Testing the Validity of the Single Interrupted Time Series Design," NBER Working Papers 26080, National Bureau of Economic Research, Inc.
    3. Flores, Carlos A. & Mitnik, Oscar A., 2009. "Evaluating Nonexperimental Estimators for Multiple Treatments: Evidence from Experimental Data," IZA Discussion Papers 4451, Institute of Labor Economics (IZA).
    4. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    5. V. Joseph Hotz & Guido W. Imbens & Jacob A. Klerman, 2006. "Evaluating the Differential Effects of Alternative Welfare-to-Work Training Components: A Reanalysis of the California GAIN Program," Journal of Labor Economics, University of Chicago Press, vol. 24(3), pages 521-566, July.
    6. Astrid Grasdal, 2001. "The performance of sample selection estimators to control for attrition bias," Health Economics, John Wiley & Sons, Ltd., vol. 10(5), pages 385-398, July.
    7. Dehejia, Rajeev H., 2005. "Program evaluation as a decision problem," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 141-173.
    8. Katherine Baicker & Theodore Svoronos, 2019. "Testing the Validity of the Single Interrupted Time Series Design," CID Working Papers 364, Center for International Development at Harvard University.
    9. Raaum, Oddbjorn & Torp, Hege, 2002. "Labour market training in Norway--effect on earnings," Labour Economics, Elsevier, vol. 9(2), pages 207-247, April.
    10. Robert J. LaLonde, 2003. "Employment and Training Programs," NBER Chapters, in: Means-Tested Transfer Programs in the United States, pages 517-586, National Bureau of Economic Research, Inc.
    11. James J. Heckman, 1991. "Randomization and Social Policy Evaluation Revisited," NBER Technical Working Papers 0107, National Bureau of Economic Research, Inc.
    12. Burt S. Barnow & Jeffrey Smith, 2015. "Employment and Training Programs," NBER Chapters, in: Economics of Means-Tested Transfer Programs in the United States, Volume 2, pages 127-234, National Bureau of Economic Research, Inc.
    13. Smith, Jeffrey A. & Todd, Petra E., 2005. "Does matching overcome LaLonde's critique of nonexperimental estimators?," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 305-353.
    14. V. Joseph Hotz & Guido W. Imbens & Jacob A. Klerman, 2000. "The Long-Term Gains from GAIN: A Re-Analysis of the Impacts of the California GAIN Program," NBER Working Papers 8007, National Bureau of Economic Research, Inc.
    15. Dehejia, Rajeev, 2015. "Experimental and Non-Experimental Methods in Development Economics: A Porous Dialectic," Journal of Globalization and Development, De Gruyter, vol. 6(1), pages 47-69, June.
    16. Peter R. Mueser & Kenneth R. Troske & Alexey Gorislavsky, 2007. "Using State Administrative Data to Measure Program Performance," The Review of Economics and Statistics, MIT Press, vol. 89(4), pages 761-783, November.
    17. David H. Dean & Robert C. Dolan & Robert M. Schmidt, 1999. "Evaluating the Vocational Rehabilitation Program Using Longitudinal Data," Evaluation Review, vol. 23(2), pages 162-189, April.
    18. Ham, John C. & LaLonde, Robert J., 2005. "Special issue on Experimental and non-experimental evaluation of economic policy and models," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 1-13.
    19. Lechner, Michael & Wunsch, Conny, 2013. "Sensitivity of matching-based program evaluations to the availability of control variables," Labour Economics, Elsevier, vol. 21(C), pages 111-121.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:mpr:mprres:7c8bd68ac8db47caa57c70ee11bf988e. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. Registering allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Joanne Pfleiderer or Cindy George (email available below). General contact details of provider: https://edirc.repec.org/data/mathius.html.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.