Program Evaluation and Research Designs
Abstract: This chapter provides a selective review of some contemporary approaches to program evaluation. One motivation for our review is the recent emergence and increasing use of a particular kind of "program" in applied microeconomic research, the so-called Regression Discontinuity (RD) Design of Thistlethwaite and Campbell (1960). We organize our discussion of these various research designs by how they secure internal validity: in this view, the RD design can be seen as a close "cousin" of the randomized experiment. An important distinction that emerges from our discussion of "heterogeneous treatment effects" is between ex post (descriptive) and ex ante (predictive) evaluations; these two types of evaluations have distinct but complementary goals. A second important distinction we make is between statistical statements that are descriptions of our knowledge of the program assignment process and statistical statements that are structural assumptions about individual behavior. Using these distinctions, we examine some commonly employed evaluation strategies and assess them with a common set of criteria for "internal validity", the foremost goal of an ex post evaluation. In some cases, we also provide concrete illustrations of how internally valid causal estimates can be supplemented with specific structural assumptions to address "external validity": an internally valid "experimental" estimate can be viewed as a "leading term" in an extrapolation for a parameter of interest in an ex ante evaluation.
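To make the RD design mentioned in the abstract concrete, the sketch below simulates a sharp RD (treatment assigned whenever a running variable crosses a cutoff) and recovers the treatment effect as the jump in the outcome at the cutoff, estimated by local linear regression on each side. All data and parameter values are hypothetical, chosen only for illustration; this is not code from the chapter itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sharp RD: units with running variable x >= 0 are treated.
n = 2000
x = rng.uniform(-1, 1, n)
treat = (x >= 0).astype(float)
true_effect = 2.0  # assumed treatment effect for the simulation
y = 1.0 + 0.5 * x + true_effect * treat + rng.normal(0, 0.5, n)

def rd_estimate(x, y, cutoff=0.0, bandwidth=0.25):
    """Sharp RD estimate: difference in local-linear intercepts
    at the cutoff, fit separately on each side within the bandwidth."""
    def fit_intercept(mask):
        # Regress y on (1, x - cutoff); the intercept is the
        # predicted outcome exactly at the cutoff.
        X = np.column_stack([np.ones(mask.sum()), x[mask] - cutoff])
        beta, *_ = np.linalg.lstsq(X, y[mask], rcond=None)
        return beta[0]
    left = (x >= cutoff - bandwidth) & (x < cutoff)
    right = (x >= cutoff) & (x <= cutoff + bandwidth)
    return fit_intercept(right) - fit_intercept(left)

print(f"RD estimate: {rd_estimate(x, y):.2f}")  # close to true_effect = 2.0
```

The bandwidth here is fixed by hand; in practice it is chosen data-dependently, and the internal validity of the estimate rests on the continuity of potential outcomes at the cutoff, not on the linear functional form.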
Bibliographic Info: Paper provided by National Bureau of Economic Research, Inc in its series NBER Working Papers with number 16016.
Date of creation: May 2010
Publication status: published as "Program Evaluation and Research Designs" with John DiNardo, in Handbook of Labor Economics, Volume 4A, Orley Ashenfelter and David Card, ed., Elsevier B.V., 2011.
Note: LS PE
Contact details of provider:
Postal: National Bureau of Economic Research, 1050 Massachusetts Avenue Cambridge, MA 02138, U.S.A.
Web page: http://www.nber.org
JEL classification:
- C10 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - General
- C50 - Mathematical and Quantitative Methods - - Econometric Modeling - - - General
- C52 - Mathematical and Quantitative Methods - - Econometric Modeling - - - Model Evaluation, Validation, and Selection
- H00 - Public Economics - - General - - - General
- I00 - Health, Education, and Welfare - - General - - - General
- J00 - Labor and Demographic Economics - - General - - - General
- J24 - Labor and Demographic Economics - - Demand and Supply of Labor - - - Human Capital; Skills; Occupational Choice; Labor Productivity
Citations (tracked by the CitEc Project):
- Matthew D. Webb, 2013. "Reworking Wild Bootstrap Based Inference for Clustered Errors," Working Papers 1315, Queen's University, Department of Economics.
- David E. Card & Pablo Ibarraran & Juan Miguel Villa, 2011. "Building in an Evaluation Component for Active Labor Market Programs: A Practitioner's Guide," SPD Working Papers 1101, Inter-American Development Bank, Office of Strategic Planning and Development Effectiveness (SPD).
- Card, David & Ibarrarán, Pablo & Villa, Juan Miguel, 2011. "Building in an Evaluation Component for Active Labor Market Programs: A Practitioner's Guide," IZA Discussion Papers 6085, Institute for the Study of Labor (IZA).
- Doyle, Joseph J., 2013. "Causal effects of foster care: An instrumental-variables approach," Children and Youth Services Review, Elsevier, vol. 35(7), pages 1143-1151.
- Jens Ludwig & Jeffrey R. Kling & Sendhil Mullainathan, 2011. "Mechanism Experiments and Policy Evaluations," Journal of Economic Perspectives, American Economic Association, vol. 25(3), pages 17-38, Summer.
- Arni, Patrick, 2012. "Causal Evaluation of Pilot Projects: The Use of Randomization in Practice" ["Kausale Evaluation von Pilotprojekten: Die Nutzung von Randomisierung in der Praxis"], IZA Standpunkte 52, Institute for the Study of Labor (IZA).
- Nicholas Bloom & John Van Reenen, 2010. "Human Resource Management and Productivity," CEP Discussion Papers dp0982, Centre for Economic Performance, LSE.
- Nick Bloom & John Van Reenen, 2010. "Human resource management and productivity," LSE Research Online Documents on Economics 28730, London School of Economics and Political Science, LSE Library.
- Bloom, Nicholas & Van Reenen, John, 2010. "Human Resource Management and Productivity," CEPR Discussion Papers 7849, C.E.P.R. Discussion Papers.
- Nicholas Bloom & John Van Reenen, 2010. "Human Resource Management and Productivity," NBER Working Papers 16019, National Bureau of Economic Research, Inc.
- Harding, Matthew & Lamarche, Carlos, 2014. "Estimating and testing a quantile regression model with interactive effects," Journal of Econometrics, Elsevier, vol. 178(P1), pages 101-113.
- de Chaisemartin, Clement, 2013. "Defying the LATE? Identification of local treatment effects when the instrument violates monotonicity," The Warwick Economics Research Paper Series (TWERPS) 1020, University of Warwick, Department of Economics.