
Causal Effects in Non-Experimental Studies: Re-Evaluating the Evaluation of Training Programs

Author

Listed:
  • Rajeev H. Dehejia
  • Sadek Wahba

Abstract

This paper uses propensity score methods to address the question: how well can an observational study estimate the treatment impact of a program? Using data from Lalonde's (1986) influential evaluation of non-experimental methods, we demonstrate that propensity score methods succeed in estimating the treatment impact of the National Supported Work Demonstration. Propensity score methods reduce the task of controlling for differences in pre-intervention variables between the treatment and the non-experimental comparison groups to controlling for differences in the estimated propensity score (the probability of assignment to treatment, conditional on covariates). It is difficult to control for differences in pre-intervention variables when they are numerous and when the treatment and comparison groups are dissimilar, whereas controlling for the estimated propensity score, a single variable on the unit interval, is a straightforward task. We apply several methods, such as stratification on the propensity score and matching on the propensity score, and show that they result in accurate estimates of the treatment impact.
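The workflow the abstract describes, estimating the propensity score and then stratifying or matching on it, can be sketched in a few lines. The snippet below is a minimal illustration only, not the paper's implementation: the file name, column names, number of strata, and the plain logistic specification are hypothetical placeholders rather than the NSW/PSID data and models used by Dehejia and Wahba.

```python
# Minimal sketch of propensity score stratification and matching.
# All data and variable names are hypothetical placeholders.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

df = pd.read_csv("observational_sample.csv")                      # hypothetical file
covariates = ["age", "education", "earnings_74", "earnings_75"]   # placeholder columns
X = df[covariates].values
t = df["treated"].values          # 1 = treated, 0 = comparison group
y = df["earnings_78"].values      # outcome

# 1. Estimated propensity score: a single variable on the unit interval,
#    the probability of assignment to treatment conditional on covariates.
pscore = LogisticRegression(max_iter=1000).fit(X, t).predict_proba(X)[:, 1]

# 2. Stratification: within-stratum treated/comparison differences in the
#    outcome, weighted by the number of treated units in each stratum.
cuts = np.quantile(pscore, np.linspace(0, 1, 6)[1:-1])   # 5 quantile strata
strata = np.digitize(pscore, cuts)
effects, weights = [], []
for s in np.unique(strata):
    in_s = strata == s
    if t[in_s].sum() and (1 - t[in_s]).sum():             # need both groups present
        effects.append(y[in_s][t[in_s] == 1].mean() - y[in_s][t[in_s] == 0].mean())
        weights.append(t[in_s].sum())
att_strat = np.average(effects, weights=weights)

# 3. Matching: pair each treated unit with the comparison unit whose
#    estimated propensity score is closest.
treated = np.where(t == 1)[0]
controls = np.where(t == 0)[0]
nearest = np.abs(pscore[controls][None, :] - pscore[treated][:, None]).argmin(axis=1)
att_match = (y[treated] - y[controls[nearest]]).mean()

print(f"ATT, stratification: {att_strat:.1f}")
print(f"ATT, matching:       {att_match:.1f}")
```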

Suggested Citation

  • Rajeev H. Dehejia & Sadek Wahba, 1998. "Causal Effects in Non-Experimental Studies: Re-Evaluating the Evaluation of Training Programs," NBER Working Papers 6586, National Bureau of Economic Research, Inc.
  • Handle: RePEc:nbr:nberwo:6586

    Download full text from publisher

    File URL: http://www.nber.org/papers/w6586.pdf
    Download Restriction: no

    References listed on IDEAS

    1. LaLonde, Robert J, 1986. "Evaluating the Econometric Evaluations of Training Programs with Experimental Data," American Economic Review, American Economic Association, vol. 76(4), pages 604-620, September.
    2. Angrist, Joshua D, 1990. "Lifetime Earnings and the Vietnam Era Draft Lottery: Evidence from Social Security Administrative Records," American Economic Review, American Economic Association, vol. 80(3), pages 313-336, June.
    3. Orley Ashenfelter, 1974. "The Effect of Manpower Training on Earnings: Preliminary Results," Working Papers 440, Princeton University, Department of Economics, Industrial Relations Section.
    4. Card, David & Sullivan, Daniel G, 1988. "Measuring the Effect of Subsidized Training Programs on Movements in and out of Employment," Econometrica, Econometric Society, vol. 56(3), pages 497-530, May.
    5. Angrist, Joshua D, 1990. "Lifetime Earnings and the Vietnam Era Draft Lottery: Evidence from Social Security Administrative Records: Errata," American Economic Review, American Economic Association, vol. 80(5), pages 1284-1286, December.
    6. Heckman, James J. & Robb, Richard Jr., 1985. "Alternative methods for evaluating the impact of interventions: An overview," Journal of Econometrics, Elsevier, vol. 30(1-2), pages 239-267.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Smith, Jeffrey A. & Todd, Petra E., 2005. "Does matching overcome LaLonde's critique of nonexperimental estimators?," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 305-353.
    2. van der Klaauw, Bas, 2014. "From micro data to causality: Forty years of empirical labor economics," Labour Economics, Elsevier, vol. 30(C), pages 88-97.
    3. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    4. Peter Hull & Michal Kolesár & Christopher Walters, 2022. "Labor by design: contributions of David Card, Joshua Angrist, and Guido Imbens," Scandinavian Journal of Economics, Wiley Blackwell, vol. 124(3), pages 603-645, July.
    5. Joshua D. Angrist, 2022. "Empirical Strategies in Economics: Illuminating the Path From Cause to Effect," Econometrica, Econometric Society, vol. 90(6), pages 2509-2539, November.
    6. Robert J. LaLonde, 2003. "Employment and Training Programs," NBER Chapters, in: Means-Tested Transfer Programs in the United States, pages 517-586, National Bureau of Economic Research, Inc.
    7. Imbens, Guido W & Angrist, Joshua D, 1994. "Identification and Estimation of Local Average Treatment Effects," Econometrica, Econometric Society, vol. 62(2), pages 467-475, March.
    8. Alberto Abadie & Guido W. Imbens, 2002. "Simple and Bias-Corrected Matching Estimators for Average Treatment Effects," NBER Technical Working Papers 0283, National Bureau of Economic Research, Inc.
    9. Anna Piil Damm, 2009. "Ethnic Enclaves and Immigrant Labor Market Outcomes: Quasi-Experimental Evidence," Journal of Labor Economics, University of Chicago Press, vol. 27(2), pages 281-314, April.
    10. Markus Frölich, 2004. "Programme Evaluation with Multiple Treatments," Journal of Economic Surveys, Wiley Blackwell, vol. 18(2), pages 181-224, April.
    11. Joshua D. Angrist & Jörn-Steffen Pischke, 2010. "The Credibility Revolution in Empirical Economics: How Better Research Design Is Taking the Con out of Econometrics," Journal of Economic Perspectives, American Economic Association, vol. 24(2), pages 3-30, Spring.
    12. Guido W. Imbens, 2022. "Causality in Econometrics: Choice vs Chance," Econometrica, Econometric Society, vol. 90(6), pages 2541-2566, November.
    13. Angrist, J.D. & Imbens, G.W., 1991. "Sources of Identifying Information in Evaluation Models," Harvard Institute of Economic Research Working Papers 1568, Harvard - Institute of Economic Research.
    14. Hu, Yingyao, 2017. "The Econometrics of Unobservables -- Latent Variable and Measurement Error Models and Their Applications in Empirical Industrial Organization and Labor Economics," Economics Working Paper Archive 64578, The Johns Hopkins University, Department of Economics, revised 2021.
    15. Rösner, Anja & Haucap, Justus & Heimeshoff, Ulrich, 2020. "The impact of consumer protection in the digital age: Evidence from the European Union," International Journal of Industrial Organization, Elsevier, vol. 73(C).
    16. Rafael Di Tella & Ernesto Schargrodsky, 2004. "Do Police Reduce Crime? Estimates Using the Allocation of Police Forces After a Terrorist Attack," American Economic Review, American Economic Association, vol. 94(1), pages 115-133, March.
    17. Duflo, Esther & Glennerster, Rachel & Kremer, Michael, 2008. "Using Randomization in Development Economics Research: A Toolkit," Handbook of Development Economics, in: T. Paul Schultz & John A. Strauss (ed.), Handbook of Development Economics, edition 1, volume 4, chapter 61, pages 3895-3962, Elsevier.
    18. Cornelissen, Thomas & Dustmann, Christian & Raute, Anna & Schönberg, Uta, 2016. "From LATE to MTE: Alternative methods for the evaluation of policy interventions," Labour Economics, Elsevier, vol. 41(C), pages 47-60.
    19. Per-Anders Edin & Peter Fredriksson & Olof Åslund, 2003. "Ethnic Enclaves and the Economic Success of Immigrants—Evidence from a Natural Experiment," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 118(1), pages 329-357.
    20. Idrisov, Georgiy & Taganov, B.V., 2016. "Research of the Effect of Growth of Openness of the Russian Economy on Income Inequality in Russia," Working Papers 3136, Russian Presidential Academy of National Economy and Public Administration.

    More about this item

    JEL classification:

    • C81 - Mathematical and Quantitative Methods - - Data Collection and Data Estimation Methodology; Computer Programs - - - Methodology for Collecting, Estimating, and Organizing Microeconomic Data; Data Access
    • C14 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Semiparametric and Nonparametric Methods: General


    Lists

    This item is featured on the following reading lists, Wikipedia, or ReplicationWiki pages:
    1. Causal Effects in Nonexperimental Studies: Reevaluating the Evaluation of Training Programs (Journal of the American Statistical Association (JASA) 1999) in ReplicationWiki


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:nbr:nberwo:6586. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: the person in charge (email available below). General contact details of provider: https://edirc.repec.org/data/nberrus.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.