Printed from https://ideas.repec.org/p/iza/izadps/dp10108.html

Viewpoint: Estimating the Causal Effects of Policies and Programs

Authors

  • Jeffrey A. Smith (University of Wisconsin-Madison)
  • Arthur Sweetman (McMaster University)

Abstract

Estimation, inference and interpretation of the causal effects of programs and policies have all advanced dramatically over the past 25 years. We highlight three particularly important intellectual trends: an improved appreciation of the substantive importance of heterogeneous responses and of their methodological implications, a stronger focus on internal validity brought about by the "credibility revolution," and the scientific value that follows from grounding estimation and interpretation in economic theory. We discuss a menu of commonly employed partial equilibrium approaches to the identification of causal effects, emphasizing that the researcher's central intellectual contribution always consists of making an explicit case for a specific causal interpretation given the relevant economic theory, the data, the institutional context and the economic question of interest. We also touch on the importance of general equilibrium effects and full cost-benefit analyses.
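The abstract's "menu of partial equilibrium approaches" includes instrumental variables estimation of local average treatment effects (the LATE of Imbens and Angrist, cited below). As a purely illustrative sketch, not taken from the paper, simulated data with a randomized offer as instrument shows why a naive treated-versus-untreated comparison is biased under self-selection while the Wald/LATE estimator recovers the effect for compliers; all variable names and parameter values here are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Latent "ability" drives both program take-up and outcomes, so a naive
# treated-vs-untreated comparison confounds selection with the causal effect.
ability = rng.normal(size=n)

# z: randomized program offer (the instrument); d: actual participation,
# more likely when offered and for high-ability individuals (self-selection).
z = rng.integers(0, 2, size=n)
d = ((0.8 * z + 0.5 * ability + rng.normal(size=n)) > 0.5).astype(float)

true_effect = 2.0  # homogeneous effect, so LATE equals the ATE here
y = true_effect * d + 1.5 * ability + rng.normal(size=n)

# Naive comparison: biased upward, since participants have higher ability.
naive = y[d == 1].mean() - y[d == 0].mean()

# Wald/LATE estimator: reduced form divided by the first stage.
late = (y[z == 1].mean() - y[z == 0].mean()) / (
    d[z == 1].mean() - d[z == 0].mean()
)

print(f"naive: {naive:.2f}, LATE: {late:.2f}")
```

With heterogeneous effects, as the abstract emphasizes, the Wald ratio would instead recover the average effect only for those whose participation is moved by the offer, which is exactly the interpretation issue the paper discusses.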

Suggested Citation

  • Smith, Jeffrey A. & Sweetman, Arthur, 2016. "Viewpoint: Estimating the Causal Effects of Policies and Programs," IZA Discussion Papers 10108, Institute of Labor Economics (IZA).
  • Handle: RePEc:iza:izadps:dp10108
    Download full text from publisher

    File URL: https://docs.iza.org/dp10108.pdf
    Download Restriction: no

    References listed on IDEAS
    1. Richard K. Crump & V. Joseph Hotz & Guido W. Imbens & Oscar A. Mitnik, 2009. "Dealing with limited overlap in estimation of average treatment effects," Biometrika, Biometrika Trust, vol. 96(1), pages 187-199.
    2. Bruno Crépon & Esther Duflo & Marc Gurgand & Roland Rathelot & Philippe Zamora, 2013. "Do Labor Market Policies have Displacement Effects? Evidence from a Clustered Randomized Experiment," The Quarterly Journal of Economics, Oxford University Press, vol. 128(2), pages 531-580.
    3. MacKinnon, James G. & Webb, Matthew D., 2020. "Randomization inference for difference-in-differences with few treated clusters," Journal of Econometrics, Elsevier, vol. 218(2), pages 435-450.
    4. LaLonde, Robert J, 1986. "Evaluating the Econometric Evaluations of Training Programs with Experimental Data," American Economic Review, American Economic Association, vol. 76(4), pages 604-620, September.
    5. Angus Deaton, 2010. "Instruments, Randomization, and Learning about Development," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 424-455, June.
    6. Arthur Lewbel & Yingying Dong & Thomas Tao Yang, 2012. "Comparing features of convenient estimators for binary choice models with endogenous regressors," Canadian Journal of Economics/Revue canadienne d'économique, John Wiley & Sons, vol. 45(3), pages 809-829, August.
    8. Smith, Jeffrey A. & Todd, Petra E., 2005. "Does matching overcome LaLonde's critique of nonexperimental estimators?," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 305-353.
    9. Joseph G. Altonji & Todd E. Elder & Christopher R. Taber, 2005. "Selection on Observed and Unobserved Variables: Assessing the Effectiveness of Catholic Schools," Journal of Political Economy, University of Chicago Press, vol. 113(1), pages 151-184, February.
    10. James J. Heckman, 2010. "Building Bridges between Structural and Program Evaluation Approaches to Evaluating Policy," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 356-398, June.
    11. David E. Card & Pablo Ibarraran & Juan Miguel Villa, 2011. "Building in an Evaluation Component for Active Labor Market Programs: A Practitioner's Guide," SPD Working Papers 1101, Inter-American Development Bank, Office of Strategic Planning and Development Effectiveness (SPD).
    12. Caliendo, Marco & Mahlstedt, Robert & Mitnik, Oscar A., 2017. "Unobservable, but unimportant? The relevance of usually unobserved variables for the evaluation of labor market policies," Labour Economics, Elsevier, vol. 46(C), pages 14-25.
    13. Andrew Leigh, 2009. "What evidence should social policymakers use?," Economic Roundup, The Treasury, Australian Government, issue 1, pages 27-43, March.
    14. Andrea Ichino & Fabrizia Mealli & Tommaso Nannicini, 2008. "From temporary help jobs to permanent employment: what can we learn from matching estimators and their sensitivity?," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 23(3), pages 305-327.
    15. Heckman, James J & Lochner, Lance & Taber, Christopher, 1998. "General-Equilibrium Treatment Effects: A Study of Tuition Policy," American Economic Review, American Economic Association, vol. 88(2), pages 381-386, May.
    16. David S. Lee & Thomas Lemieux, 2010. "Regression Discontinuity Designs in Economics," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 281-355, June.
    17. James J. Heckman & Edward Vytlacil, 2005. "Structural Equations, Treatment Effects, and Econometric Policy Evaluation," Econometrica, Econometric Society, vol. 73(3), pages 669-738, May.
    18. Greenberg, David H. & Robins, Philip K., 2008. "Incorporating nonmarket time into benefit-cost analyses of social programs: An application to the self-sufficiency project," Journal of Public Economics, Elsevier, vol. 92(3-4), pages 766-794, April.
    19. Davidson, Carl & Woodbury, Stephen A, 1993. "The Displacement Effect of Reemployment Bonus Programs," Journal of Labor Economics, University of Chicago Press, vol. 11(4), pages 575-605, October.
    20. William P. Warburton & Rebecca N. Warburton & Arthur Sweetman & Clyde Hertzman, 2014. "The Impact of Placing Adolescent Males into Foster Care on Education, Income Assistance, and Convictions," Canadian Journal of Economics, Canadian Economics Association, vol. 47(1), pages 35-69, February.
    21. Black, Dan A. & Smith, Jeffrey A., 2004. "How robust is the evidence on the effects of college quality? Evidence from matching," Journal of Econometrics, Elsevier, vol. 121(1-2), pages 99-124.
    22. Matias Busso & John DiNardo & Justin McCrary, 2014. "New Evidence on the Finite Sample Properties of Propensity Score Reweighting and Matching Estimators," The Review of Economics and Statistics, MIT Press, vol. 96(5), pages 885-897, December.
    23. Patrick Kline & Christopher R. Walters, 2016. "Evaluating Public Programs with Close Substitutes: The Case of Head Start," The Quarterly Journal of Economics, Oxford University Press, vol. 131(4), pages 1795-1848.
    24. Wilbert Van Der Klaauw, 2008. "Regression–Discontinuity Analysis: A Survey of Recent Developments in Economics," LABOUR, CEIS, vol. 22(2), pages 219-245, June.
    25. Riddell, Chris & Riddell, W. Craig, 2014. "The pitfalls of work requirements in welfare-to-work policies: Experimental evidence on human capital accumulation in the Self-Sufficiency Project," Journal of Public Economics, Elsevier, vol. 117(C), pages 39-49.
    26. Pedro Carneiro & James J. Heckman & Edward J. Vytlacil, 2011. "Estimating Marginal Returns to Education," American Economic Review, American Economic Association, vol. 101(6), pages 2754-2781, October.
    27. Milligan, Kevin & Stabile, Mark, 2007. "The integration of child tax credits and welfare: Evidence from the Canadian National Child Benefit program," Journal of Public Economics, Elsevier, vol. 91(1-2), pages 305-326, February.
    28. Lechner, Michael & Smith, Jeffrey, 2007. "What is the value added by caseworkers?," Labour Economics, Elsevier, vol. 14(2), pages 135-151, April.
    29. James Heckman & Justin L. Tobias & Edward Vytlacil, 2001. "Four Parameters of Interest in the Evaluation of Social Programs," Southern Economic Journal, John Wiley & Sons, vol. 68(2), pages 210-223, October.
    30. Hum, Derek & Simpson, Wayne, 1993. "Economic Response to a Guaranteed Annual Income: Experience from Canada and the United States," Journal of Labor Economics, University of Chicago Press, vol. 11(1), pages 263-296, January.
    31. Peter Z. Schochet, "undated". "Technical Methods Report: Statistical Power for Regression Discontinuity Designs in Education Evaluations," Mathematica Policy Research Reports 61fb6c057561451a8a6074508, Mathematica Policy Research.
    32. Heckman, James J. & Urzúa, Sergio, 2010. "Comparing IV with structural models: What simple IV can and cannot identify," Journal of Econometrics, Elsevier, vol. 156(1), pages 27-37, May.
    33. James Heckman & Neil Hohmann & Jeffrey Smith & Michael Khoo, 2000. "Substitution and Dropout Bias in Social Experiments: A Study of an Influential Social Experiment," The Quarterly Journal of Economics, Oxford University Press, vol. 115(2), pages 651-694.
    34. Rajeev H. Dehejia & Sadek Wahba, 2002. "Propensity Score-Matching Methods For Nonexperimental Causal Studies," The Review of Economics and Statistics, MIT Press, vol. 84(1), pages 151-161, February.
    35. Cook, Thomas D., 2008. ""Waiting for Life to Arrive": A history of the regression-discontinuity design in Psychology, Statistics and Economics," Journal of Econometrics, Elsevier, vol. 142(2), pages 636-654, February.
    36. Boris Kralj & Jasmin Kantarevic, 2013. "Quality and quantity in primary care mixed-payment models: evidence from family health organizations in Ontario," Canadian Journal of Economics, Canadian Economics Association, vol. 46(1), pages 208-238, February.
    37. Joshua D. Angrist & Jörn-Steffen Pischke, 2009. "Mostly Harmless Econometrics: An Empiricist's Companion," Economics Books, Princeton University Press, edition 1, number 8769.
    38. Lemieux, Thomas & Milligan, Kevin, 2008. "Incentive effects of social assistance: A regression discontinuity approach," Journal of Econometrics, Elsevier, vol. 142(2), pages 807-828, February.
    39. Kenneth I. Wolpin & Petra E. Todd, 2006. "Assessing the Impact of a School Subsidy Program in Mexico: Using a Social Experiment to Validate a Dynamic Behavioral Model of Child Schooling and Fertility," American Economic Review, American Economic Association, vol. 96(5), pages 1384-1417, December.
    40. Djebbari, Habiba & Smith, Jeffrey, 2008. "Heterogeneous impacts in PROGRESA," Journal of Econometrics, Elsevier, vol. 145(1-2), pages 64-80, July.
    41. Manuela Angelucci & Giacomo De Giorgi, 2009. "Indirect Effects of an Aid Program: How Do Cash Transfers Affect Ineligibles' Consumption?," American Economic Review, American Economic Association, vol. 99(1), pages 486-508, March.
    42. Imbens, Guido W & Angrist, Joshua D, 1994. "Identification and Estimation of Local Average Treatment Effects," Econometrica, Econometric Society, vol. 62(2), pages 467-475, March.
    43. Stock, James H & Wright, Jonathan H & Yogo, Motohiro, 2002. "A Survey of Weak Instruments and Weak Identification in Generalized Method of Moments," Journal of Business & Economic Statistics, American Statistical Association, vol. 20(4), pages 518-529, October.
    44. Philip Oreopoulos, 2006. "The compelling effects of compulsory schooling: evidence from Canada," Canadian Journal of Economics, Canadian Economics Association, vol. 39(1), pages 22-52, February.
    45. William Wascher & David Neumark, 2000. "Minimum Wages and Employment: A Case Study of the Fast-Food Industry in New Jersey and Pennsylvania: Comment," American Economic Review, American Economic Association, vol. 90(5), pages 1362-1396, December.
    46. Andersson, Fredrik W. & Holzer, Harry J. & Lane, Julia & Rosenblum, David & Smith, Jeffrey A., 2013. "Does Federally-Funded Job Training Work? Nonexperimental Estimates of WIA Training Impacts Using Longitudinal Data on Workers and Firms," IZA Discussion Papers 7621, Institute of Labor Economics (IZA).
    47. Guido W. Imbens, 2015. "Matching Methods in Practice: Three Examples," Journal of Human Resources, University of Wisconsin Press, vol. 50(2), pages 373-419.
    48. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    49. A. Colin Cameron & Douglas L. Miller, 2015. "A Practitioner’s Guide to Cluster-Robust Inference," Journal of Human Resources, University of Wisconsin Press, vol. 50(2), pages 317-372.
    50. Bev Dahlby, 2008. "The Marginal Cost of Public Funds: Theory and Applications," MIT Press Books, The MIT Press, edition 1, volume 1, number 0262042509, December.
    51. Marianne Bertrand & Esther Duflo & Sendhil Mullainathan, 2004. "How Much Should We Trust Differences-In-Differences Estimates?," The Quarterly Journal of Economics, Oxford University Press, vol. 119(1), pages 249-275.
    52. Abhijit V. Banerjee & Esther Duflo, 2009. "The Experimental Approach to Development Economics," Annual Review of Economics, Annual Reviews, vol. 1(1), pages 151-178, May.
    54. Guido W. Imbens, 2010. "Better LATE Than Nothing: Some Comments on Deaton (2009) and Heckman and Urzua (2009)," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 399-423, June.
    55. Imbens, Guido W. & Lemieux, Thomas, 2008. "Regression discontinuity designs: A guide to practice," Journal of Econometrics, Elsevier, vol. 142(2), pages 615-635, February.
    56. Joseph Hotz, V. & Imbens, Guido W. & Mortimer, Julie H., 2005. "Predicting the efficacy of future training programs using past experiences at other locations," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 241-270.
    57. Joseph J. Doyle Jr., 2008. "Child Protection and Adult Crime: Using Investigator Assignment to Estimate Causal Effects of Foster Care," Journal of Political Economy, University of Chicago Press, vol. 116(4), pages 746-770, August.
    58. David Greenberg & Mark Shroder & Matthew Onstott, 1999. "The Social Experiment Market," Journal of Economic Perspectives, American Economic Association, vol. 13(3), pages 157-172, Summer.
    59. James J. Heckman, 2001. "Micro Data, Heterogeneity, and the Evaluation of Public Policy: Nobel Lecture," Journal of Political Economy, University of Chicago Press, vol. 109(4), pages 673-748, August.
    60. Evelyn L. Forget, 2011. "The Town with No Poverty: The Health Effects of a Canadian Guaranteed Annual Income Field Experiment," Canadian Public Policy, University of Toronto Press, vol. 37(3), pages 283-305, September.
    62. Zhang, Xuelin & Morissette, Rene & Frenette, Marc, 2007. "Earnings Losses of Displaced Workers: Canadian Evidence from a Large Administrative Database on Firm Closures and Mass Layoffs," Analytical Studies Branch Research Paper Series 2007291e, Statistics Canada, Analytical Studies Branch.
    64. Robert Moffitt, 1991. "Program Evaluation With Nonexperimental Data," Evaluation Review, , vol. 15(3), pages 291-314, June.
    66. McCrary, Justin, 2008. "Manipulation of the running variable in the regression discontinuity design: A density test," Journal of Econometrics, Elsevier, vol. 142(2), pages 698-714, February.
    67. Richard Blundell & Lorraine Dearden & Barbara Sianesi, 2005. "Evaluating the effect of education on earnings: models, methods and results from the National Child Development Survey," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 168(3), pages 473-512, July.
    68. Heckman, James, 2013. "Sample selection bias as a specification error," Applied Econometrics, Russian Presidential Academy of National Economy and Public Administration (RANEPA), vol. 31(3), pages 129-137.
    69. Seán M. Muller, 2015. "Causal Interaction and External Validity: Obstacles to the Policy Relevance of Randomized Evaluations," World Bank Economic Review, World Bank Group, vol. 29(suppl_1), pages 217-225.
    70. Levitt, Steven D. & List, John A., 2009. "Field experiments in economics: The past, the present, and the future," European Economic Review, Elsevier, vol. 53(1), pages 1-18, January.
    71. Charles F. Manski, 2004. "Statistical Treatment Rules for Heterogeneous Populations," Econometrica, Econometric Society, vol. 72(4), pages 1221-1246, July.
    72. James Heckman & Hidehiko Ichimura & Jeffrey Smith & Petra Todd, 1998. "Characterizing Selection Bias Using Experimental Data," Econometrica, Econometric Society, vol. 66(5), pages 1017-1098, September.
    73. Joshua D. Angrist, 1998. "Estimating the Labor Market Impact of Voluntary Military Service Using Social Security Data on Military Applicants," Econometrica, Econometric Society, vol. 66(2), pages 249-288, March.
    74. Heckman, James J, 1996. "Randomization as an Instrumental Variable: Notes," The Review of Economics and Statistics, MIT Press, vol. 78(2), pages 336-341, May.
    75. Rebecca A. Maynard & Kenneth A. Couch & Coady Wing & Thomas D. Cook, 2013. "Strengthening The Regression Discontinuity Design Using Additional Design Elements: A Within‐Study Comparison," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 32(4), pages 853-877, September.
    76. Huber, Martin & Lechner, Michael & Wunsch, Conny, 2013. "The performance of estimators based on the propensity score," Journal of Econometrics, Elsevier, vol. 175(1), pages 1-21.
    77. A. D. Roy, 1951. "Some Thoughts on the Distribution of Earnings," Oxford Economic Papers, Oxford University Press, vol. 3(2), pages 135-146.
    79. Michael P. Murray, 2006. "Avoiding Invalid Instruments and Coping with Weak Instruments," Journal of Economic Perspectives, American Economic Association, vol. 20(4), pages 111-132, Fall.
    80. Imbens,Guido W. & Rubin,Donald B., 2015. "Causal Inference for Statistics, Social, and Biomedical Sciences," Cambridge Books, Cambridge University Press, number 9780521885881, November.
    81. Ho, Daniel E. & Imai, Kosuke & King, Gary & Stuart, Elizabeth A., 2007. "Matching as Nonparametric Preprocessing for Reducing Model Dependence in Parametric Causal Inference," Political Analysis, Cambridge University Press, vol. 15(3), pages 199-236, July.
    82. Bjorklund, Anders & Moffitt, Robert, 1987. "The Estimation of Wage Gains and Welfare Gains in Self-selection," The Review of Economics and Statistics, MIT Press, vol. 69(1), pages 42-49, February.
    83. Dan A. Black & Jeffrey A. Smith & Mark C. Berger & Brett J. Noel, 2003. "Is the Threat of Reemployment Services More Effective Than the Services Themselves? Evidence from Random Assignment in the UI System," American Economic Review, American Economic Association, vol. 93(4), pages 1313-1327, September.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Marc F. Bellemare, 2018. "Contract farming: opportunity cost and trade-offs," Agricultural Economics, International Association of Agricultural Economists, vol. 49(3), pages 279-288, May.
    2. Zhang, Xue & Sweetman, Arthur, 2018. "Blended capitation and incentives: Fee codes inside and outside the capitated basket," Journal of Health Economics, Elsevier, vol. 60(C), pages 16-29.
    3. Wang, Chao & Sweetman, Arthur, 2020. "Delisting eye examinations from public health insurance: Empirical evidence from Canada regarding impacts on patients and providers," Health Policy, Elsevier, vol. 124(5), pages 540-548.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    2. Huber, Martin, 2019. "An introduction to flexible methods for policy evaluation," FSES Working Papers 504, Faculty of Economics and Social Sciences, University of Freiburg/Fribourg Switzerland.
    3. Burt S. Barnow & Jeffrey Smith, 2015. "Employment and Training Programs," NBER Chapters, in: Economics of Means-Tested Transfer Programs in the United States, Volume 2, pages 127-234, National Bureau of Economic Research, Inc.
    4. Słoczyński, Tymon, 2012. "New Evidence on Linear Regression and Treatment Effect Heterogeneity," MPRA Paper 39524, University Library of Munich, Germany.
    5. Chad D. Meyerhoefer & Muzhe Yang, 2011. "The Relationship between Food Assistance and Health: A Review of the Literature and Empirical Strategies for Identifying Program Effects," Applied Economic Perspectives and Policy, Agricultural and Applied Economics Association, vol. 33(3), pages 304-344.
    6. Baum-Snow, Nathaniel & Ferreira, Fernando, 2015. "Causal Inference in Urban and Regional Economics," Handbook of Regional and Urban Economics, in: Gilles Duranton & J. V. Henderson & William C. Strange (ed.), Handbook of Regional and Urban Economics, edition 1, volume 5, chapter 0, pages 3-68, Elsevier.
    7. Caliendo, Marco & Mahlstedt, Robert & Mitnik, Oscar A., 2017. "Unobservable, but unimportant? The relevance of usually unobserved variables for the evaluation of labor market policies," Labour Economics, Elsevier, vol. 46(C), pages 14-25.
    8. Susan Athey & Guido W. Imbens, 2017. "The State of Applied Econometrics: Causality and Policy Evaluation," Journal of Economic Perspectives, American Economic Association, vol. 31(2), pages 3-32, Spring.
    9. Rothstein, Jesse & von Wachter, Till, 2016. "Social Experiments in the Labor Market," Institute for Research on Labor and Employment, Working Paper Series qt6605k20b, Institute of Industrial Relations, UC Berkeley.
    10. Ferman, Bruno, 2021. "Matching estimators with few treated and many control observations," Journal of Econometrics, Elsevier, vol. 225(2), pages 295-307.
    11. van der Klaauw, Bas, 2014. "From micro data to causality: Forty years of empirical labor economics," Labour Economics, Elsevier, vol. 30(C), pages 88-97.
    12. Dan A. Black & Joonhwi Joo & Robert LaLonde & Jeffrey Andrew Smith & Evan J. Taylor, 2017. "Simple Tests for Selection: Learning More from Instrumental Variables," CESifo Working Paper Series 6392, CESifo.
    13. Guido W. Imbens, 2010. "Better LATE Than Nothing: Some Comments on Deaton (2009) and Heckman and Urzua (2009)," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 399-423, June.
    14. Hunt Allcott, 2012. "Site Selection Bias in Program Evaluation," NBER Working Papers 18373, National Bureau of Economic Research, Inc.
    15. Deaton, Angus & Cartwright, Nancy, 2018. "Understanding and misunderstanding randomized controlled trials," Social Science & Medicine, Elsevier, vol. 210(C), pages 2-21.
    16. Rothstein, Jesse & von Wachter, Till, 2016. "Social Experiments in the Labor Market," Department of Economics, Working Paper Series qt6605k20b, Department of Economics, Institute for Business and Economic Research, UC Berkeley.
    17. Rothstein, J & von Wachter, T, 2016. "Social Experiments in the Labor Market," Department of Economics, Working Paper Series qt7957p9g6, Department of Economics, Institute for Business and Economic Research, UC Berkeley.
    18. James J. Heckman, 2005. "Micro Data, Heterogeneity and the Evaluation of Public Policy Part 2," The American Economist, Sage Publications, vol. 49(1), pages 16-44, March.
    19. Smith, Jeffrey A. & Todd, Petra E., 2005. "Does matching overcome LaLonde's critique of nonexperimental estimators?," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 305-353.
    20. Flores, Carlos A. & Mitnik, Oscar A., 2009. "Evaluating Nonexperimental Estimators for Multiple Treatments: Evidence from Experimental Data," IZA Discussion Papers 4451, Institute of Labor Economics (IZA).

    More about this item

    Keywords

    causal effects; heterogeneous treatment effects; partial equilibrium identification

    JEL classification:

    • C18 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Methodological Issues: General
    • C21 - Mathematical and Quantitative Methods - - Single Equation Models; Single Variables - - - Cross-Sectional Models; Spatial Models; Treatment Effect Models
    • C26 - Mathematical and Quantitative Methods - - Single Equation Models; Single Variables - - - Instrumental Variables (IV) Estimation
    • C50 - Mathematical and Quantitative Methods - - Econometric Modeling - - - General
    • C90 - Mathematical and Quantitative Methods - - Design of Experiments - - - General


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:iza:izadps:dp10108. See general information about how to correct material in RePEc.


    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. Registration links your profile to this item and lets you accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link it to an item in RePEc, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations awaiting confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Holger Hinte (email available below). General contact details of provider: https://edirc.repec.org/data/izaaade.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis . RePEc uses bibliographic data supplied by the respective publishers.