
Viewpoint: Estimating the causal effects of policies and programs

Author

Listed:
  • Jeffrey Smith
  • Arthur Sweetman

Abstract

Estimation, inference and interpretation of the causal effects of programs and policies have all advanced dramatically over the past 25 years. We highlight three particularly important intellectual trends: an improved appreciation of the substantive importance of heterogeneous responses and of their methodological implications, a stronger focus on internal validity brought about by the “credibility revolution,” and the scientific value that follows from grounding estimation and interpretation in economic theory. We discuss a menu of commonly employed partial equilibrium approaches to the identification of causal effects, emphasizing that the researcher's central intellectual contribution always consists of making an explicit case for a specific causal interpretation given the relevant economic theory, the data, the institutional context and the economic question of interest. We also touch on the importance of general equilibrium effects and full cost–benefit analyses.
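
As a minimal sketch of the heterogeneity point raised in the abstract: under a randomized offer with imperfect compliance, the Wald/instrumental variables estimator recovers the local average treatment effect (LATE) for compliers (Imbens and Angrist 1994, listed in the references below) rather than the population average treatment effect. The simulation below is purely illustrative and is not taken from the article; the data-generating process, variable names and parameter values are assumptions chosen for exposition.

    # Illustrative sketch (not from the article): heterogeneous treatment effects
    # and the LATE recovered by a Wald/IV estimator under imperfect compliance.
    # All names, parameters and the data-generating process are hypothetical.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000

    # Randomized offer (the instrument) and compliance type.
    z = rng.integers(0, 2, size=n)          # random assignment to the offer
    complier = rng.random(n) < 0.6          # 60% compliers; the rest never take up

    # Heterogeneous effects with "selection on gains": compliers benefit more.
    tau = rng.normal(2.0 + 1.0 * complier, 1.0)

    # Realized treatment and outcome.
    d = np.where(complier, z, 0)            # never-takers ignore the offer
    y = 1.0 + tau * d + rng.normal(size=n)

    ate = tau.mean()                        # population average treatment effect
    late = tau[complier].mean()             # average effect among compliers

    # Wald / IV estimator: reduced form divided by first stage.
    wald = (y[z == 1].mean() - y[z == 0].mean()) / (d[z == 1].mean() - d[z == 0].mean())

    print(f"ATE  (population): {ate:.3f}")  # about 2.6 under these assumptions
    print(f"LATE (compliers):  {late:.3f}") # about 3.0
    print(f"Wald/IV estimate:  {wald:.3f}") # close to the LATE, not the ATE

The estimates diverge because the individual gains are correlated with take-up, which is exactly the sense in which heterogeneous responses shape what a given identification strategy can deliver.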

Suggested Citation

  • Jeffrey Smith & Arthur Sweetman, 2016. "Viewpoint: Estimating the causal effects of policies and programs," Canadian Journal of Economics/Revue canadienne d'économique, John Wiley & Sons, vol. 49(3), pages 871-905, August.
  • Handle: RePEc:wly:canjec:v:49:y:2016:i:3:p:871-905
    DOI: 10.1111/caje.12217

    Download full text from publisher

    File URL: https://doi.org/10.1111/caje.12217
    Download Restriction: no

    File URL: https://libkey.io/10.1111/caje.12217?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a version of this item that you can access through your library subscription.


    References listed on IDEAS

    1. Richard K. Crump & V. Joseph Hotz & Guido W. Imbens & Oscar A. Mitnik, 2009. "Dealing with limited overlap in estimation of average treatment effects," Biometrika, Biometrika Trust, vol. 96(1), pages 187-199.
    2. Bruno Crépon & Esther Duflo & Marc Gurgand & Roland Rathelot & Philippe Zamora, 2013. "Do Labor Market Policies have Displacement Effects? Evidence from a Clustered Randomized Experiment," The Quarterly Journal of Economics, Oxford University Press, vol. 128(2), pages 531-580.
    3. Cook, Thomas D., 2008. ""Waiting for Life to Arrive": A history of the regression-discontinuity design in Psychology, Statistics and Economics," Journal of Econometrics, Elsevier, vol. 142(2), pages 636-654, February.
    4. MacKinnon, James G. & Webb, Matthew D., 2020. "Randomization inference for difference-in-differences with few treated clusters," Journal of Econometrics, Elsevier, vol. 218(2), pages 435-450.
    5. James Heckman & Justin L. Tobias & Edward Vytlacil, 2001. "Four Parameters of Interest in the Evaluation of Social Programs," Southern Economic Journal, John Wiley & Sons, vol. 68(2), pages 210-223, October.
    6. LaLonde, Robert J, 1986. "Evaluating the Econometric Evaluations of Training Programs with Experimental Data," American Economic Review, American Economic Association, vol. 76(4), pages 604-620, September.
    7. Boris Kralj & Jasmin Kantarevic, 2013. "Quality and quantity in primary care mixed-payment models: evidence from family health organizations in Ontario," Canadian Journal of Economics, Canadian Economics Association, vol. 46(1), pages 208-238, February.
    8. Angus Deaton, 2010. "Instruments, Randomization, and Learning about Development," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 424-455, June.
    9. Philip Oreopoulos, 2006. "The compelling effects of compulsory schooling: evidence from Canada," Canadian Journal of Economics/Revue canadienne d'économique, John Wiley & Sons, vol. 39(1), pages 22-52, February.
    10. Levitt, Steven D. & List, John A., 2009. "Field experiments in economics: The past, the present, and the future," European Economic Review, Elsevier, vol. 53(1), pages 1-18, January.
    11. David Greenberg & Mark Shroder & Matthew Onstott, 1999. "The Social Experiment Market," Journal of Economic Perspectives, American Economic Association, vol. 13(3), pages 157-172, Summer.
    12. James J. Heckman, 2001. "Micro Data, Heterogeneity, and the Evaluation of Public Policy: Nobel Lecture," Journal of Political Economy, University of Chicago Press, vol. 109(4), pages 673-748, August.
    13. Evelyn L. Forget, 2011. "The Town with No Poverty: The Health Effects of a Canadian Guaranteed Annual Income Field Experiment," Canadian Public Policy, University of Toronto Press, vol. 37(3), pages 283-305, September.
    14. Arthur Lewbel & Yingying Dong & Thomas Tao Yang, 2012. "Comparing features of convenient estimators for binary choice models with endogenous regressors," Canadian Journal of Economics/Revue canadienne d'économique, John Wiley & Sons, vol. 45(3), pages 809-829, August.
    15. repec:mpr:mprres:6372 is not listed on IDEAS
    16. Rebecca A. Maynard & Kenneth A. Couch & Coady Wing & Thomas D. Cook, 2013. "Strengthening The Regression Discontinuity Design Using Additional Design Elements: A Within‐Study Comparison," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 32(4), pages 853-877, September.
    17. Joshua D. Angrist & Jörn-Steffen Pischke, 2009. "Mostly Harmless Econometrics: An Empiricist's Companion," Economics Books, Princeton University Press, edition 1, number 8769.
    18. repec:mpr:mprres:7217 is not listed on IDEAS
    19. Lemieux, Thomas & Milligan, Kevin, 2008. "Incentive effects of social assistance: A regression discontinuity approach," Journal of Econometrics, Elsevier, vol. 142(2), pages 807-828, February.
    20. Card, David & Ibarrarán, Pablo & Villa, Juan Miguel, 2011. "Building in an Evaluation Component for Active Labor Market Programs: A Practitioner's Guide," IZA Discussion Papers 6085, Institute of Labor Economics (IZA).
    21. James J. Heckman, 2010. "Building Bridges between Structural and Program Evaluation Approaches to Evaluating Policy," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 356-398, June.
    22. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    23. Kenneth I. Wolpin & Petra E. Todd, 2006. "Assessing the Impact of a School Subsidy Program in Mexico: Using a Social Experiment to Validate a Dynamic Behavioral Model of Child Schooling and Fertility," American Economic Review, American Economic Association, vol. 96(5), pages 1384-1417, December.
    24. Rajeev H. Dehejia & Sadek Wahba, 2002. "Propensity Score-Matching Methods For Nonexperimental Causal Studies," The Review of Economics and Statistics, MIT Press, vol. 84(1), pages 151-161, February.
    25. Smith, Jeffrey A. & Todd, Petra E., 2005. "Does matching overcome LaLonde's critique of nonexperimental estimators?," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 305-353.
    26. Heckman, James J. & Urzúa, Sergio, 2010. "Comparing IV with structural models: What simple IV can and cannot identify," Journal of Econometrics, Elsevier, vol. 156(1), pages 27-37, May.
    27. Joseph G. Altonji & Todd E. Elder & Christopher R. Taber, 2005. "Selection on Observed and Unobserved Variables: Assessing the Effectiveness of Catholic Schools," Journal of Political Economy, University of Chicago Press, vol. 113(1), pages 151-184, February.
    28. Djebbari, Habiba & Smith, Jeffrey, 2008. "Heterogeneous impacts in PROGRESA," Journal of Econometrics, Elsevier, vol. 145(1-2), pages 64-80, July.
    29. Caliendo, Marco & Mahlstedt, Robert & Mitnik, Oscar A., 2017. "Unobservable, but unimportant? The relevance of usually unobserved variables for the evaluation of labor market policies," Labour Economics, Elsevier, vol. 46(C), pages 14-25.
    30. William P. Warburton & Rebecca N. Warburton & Arthur Sweetman & Clyde Hertzman, 2014. "The Impact of Placing Adolescent Males into Foster Care on Education, Income Assistance, and Convictions," Canadian Journal of Economics/Revue canadienne d'économique, John Wiley & Sons, vol. 47(1), pages 35-69, February.
    31. Andrew Leigh, 2009. "What evidence should social policymakers use?," Economic Roundup, The Treasury, Australian Government, issue 1, pages 27-43, March.
    32. Andrea Ichino & Fabrizia Mealli & Tommaso Nannicini, 2008. "From temporary help jobs to permanent employment: what can we learn from matching estimators and their sensitivity?," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 23(3), pages 305-327.
    33. Manuela Angelucci & Giacomo De Giorgi, 2009. "Indirect Effects of an Aid Program: How Do Cash Transfers Affect Ineligibles' Consumption?," American Economic Review, American Economic Association, vol. 99(1), pages 486-508, March.
    34. Heckman, James J & Lochner, Lance & Taber, Christopher, 1998. "General-Equilibrium Treatment Effects: A Study of Tuition Policy," American Economic Review, American Economic Association, vol. 88(2), pages 381-386, May.
    35. Pedro Carneiro & James J. Heckman & Edward J. Vytlacil, 2011. "Estimating Marginal Returns to Education," American Economic Review, American Economic Association, vol. 101(6), pages 2754-2781, October.
    36. Patrick Kline & Christopher R. Walters, 2016. "Evaluating Public Programs with Close Substitutes: The Case of Head Start," The Quarterly Journal of Economics, Oxford University Press, vol. 131(4), pages 1795-1848.
    37. Guido W. Imbens, 2015. "Matching Methods in Practice: Three Examples," Journal of Human Resources, University of Wisconsin Press, vol. 50(2), pages 373-419.
    38. David S. Lee & Thomas Lemieux, 2010. "Regression Discontinuity Designs in Economics," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 281-355, June.
    39. Arthur Lewbel & Yingying Dong & Thomas Tao Yang, 2012. "Viewpoint: Comparing features of convenient estimators for binary choice models with endogenous regressors," Canadian Journal of Economics, Canadian Economics Association, vol. 45(3), pages 809-829, August.
    40. Marianne Bertrand & Esther Duflo & Sendhil Mullainathan, 2004. "How Much Should We Trust Differences-In-Differences Estimates?," The Quarterly Journal of Economics, Oxford University Press, vol. 119(1), pages 249-275.
    41. Imbens, Guido W & Angrist, Joshua D, 1994. "Identification and Estimation of Local Average Treatment Effects," Econometrica, Econometric Society, vol. 62(2), pages 467-475, March.
    42. Stock, James H & Wright, Jonathan H & Yogo, Motohiro, 2002. "A Survey of Weak Instruments and Weak Identification in Generalized Method of Moments," Journal of Business & Economic Statistics, American Statistical Association, vol. 20(4), pages 518-529, October.
    43. Lechner, Michael & Smith, Jeffrey, 2007. "What is the value added by caseworkers?," Labour Economics, Elsevier, vol. 14(2), pages 135-151, April.
    44. Imbens, Guido W. & Lemieux, Thomas, 2008. "Regression discontinuity designs: A guide to practice," Journal of Econometrics, Elsevier, vol. 142(2), pages 615-635, February.
    45. Greenberg, David H. & Robins, Philip K., 2008. "Incorporating nonmarket time into benefit-cost analyses of social programs: An application to the self-sufficiency project," Journal of Public Economics, Elsevier, vol. 92(3-4), pages 766-794, April.
    46. Huber, Martin & Lechner, Michael & Wunsch, Conny, 2013. "The performance of estimators based on the propensity score," Journal of Econometrics, Elsevier, vol. 175(1), pages 1-21.
    47. James J. Heckman & Edward Vytlacil, 2005. "Structural Equations, Treatment Effects, and Econometric Policy Evaluation," Econometrica, Econometric Society, vol. 73(3), pages 669-738, May.
    48. Andersson, Fredrik W. & Holzer, Harry J. & Lane, Julia & Rosenblum, David & Smith, Jeffrey A., 2013. "Does Federally-Funded Job Training Work? Nonexperimental Estimates of WIA Training Impacts Using Longitudinal Data on Workers and Firms," IZA Discussion Papers 7621, Institute of Labor Economics (IZA).
    49. Black, Dan A. & Smith, Jeffrey A., 2004. "How robust is the evidence on the effects of college quality? Evidence from matching," Journal of Econometrics, Elsevier, vol. 121(1-2), pages 99-124.
    50. A. D. Roy, 1951. "Some Thoughts On The Distribution Of Earnings," Oxford Economic Papers, Oxford University Press, vol. 3(2), pages 135-146.
    51. Davidson, Carl & Woodbury, Stephen A, 1993. "The Displacement Effect of Reemployment Bonus Programs," Journal of Labor Economics, University of Chicago Press, vol. 11(4), pages 575-605, October.
    52. William Wascher & David Neumark, 2000. "Minimum Wages and Employment: A Case Study of the Fast-Food Industry in New Jersey and Pennsylvania: Comment," American Economic Review, American Economic Association, vol. 90(5), pages 1362-1396, December.
    53. Guido W. Imbens, 2010. "Better LATE Than Nothing: Some Comments on Deaton (2009) and Heckman and Urzua (2009)," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 399-423, June.
    54. Boris Kralj & Jasmin Kantarevic, 2013. "Quality and quantity in primary care mixed‐payment models: evidence from family health organizations in Ontario," Canadian Journal of Economics/Revue canadienne d'économique, John Wiley & Sons, vol. 46(1), pages 208-238, February.
    55. Riddell, Chris & Riddell, W. Craig, 2014. "The pitfalls of work requirements in welfare-to-work policies: Experimental evidence on human capital accumulation in the Self-Sufficiency Project," Journal of Public Economics, Elsevier, vol. 117(C), pages 39-49.
    56. Robert Moffitt, 1991. "Program Evaluation With Nonexperimental Data," Evaluation Review, , vol. 15(3), pages 291-314, June.
    57. repec:feb:artefa:0087 is not listed on IDEAS
    58. James Heckman & Neil Hohmann & Jeffrey Smith & Michael Khoo, 2000. "Substitution and Dropout Bias in Social Experiments: A Study of an Influential Social Experiment," The Quarterly Journal of Economics, Oxford University Press, vol. 115(2), pages 651-694.
    59. Michael P. Murray, 2006. "Avoiding Invalid Instruments and Coping with Weak Instruments," Journal of Economic Perspectives, American Economic Association, vol. 20(4), pages 111-132, Fall.
    60. McCrary, Justin, 2008. "Manipulation of the running variable in the regression discontinuity design: A density test," Journal of Econometrics, Elsevier, vol. 142(2), pages 698-714, February.
    61. Matias Busso & John DiNardo & Justin McCrary, 2014. "New Evidence on the Finite Sample Properties of Propensity Score Reweighting and Matching Estimators," The Review of Economics and Statistics, MIT Press, vol. 96(5), pages 885-897, December.
    62. Richard Blundell & Lorraine Dearden & Barbara Sianesi, 2005. "Evaluating the effect of education on earnings: models, methods and results from the National Child Development Survey," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 168(3), pages 473-512, July.
    63. Abhijit V. Banerjee & Esther Duflo, 2009. "The Experimental Approach to Development Economics," Annual Review of Economics, Annual Reviews, vol. 1(1), pages 151-178, May.
    64. Heckman, James, 2013. "Sample selection bias as a specification error," Applied Econometrics, Russian Presidential Academy of National Economy and Public Administration (RANEPA), vol. 31(3), pages 129-137.
    65. A. Colin Cameron & Douglas L. Miller, 2015. "A Practitioner’s Guide to Cluster-Robust Inference," Journal of Human Resources, University of Wisconsin Press, vol. 50(2), pages 317-372.
    66. Imbens,Guido W. & Rubin,Donald B., 2015. "Causal Inference for Statistics, Social, and Biomedical Sciences," Cambridge Books, Cambridge University Press, number 9780521885881, November.
    67. Wilbert Van Der Klaauw, 2008. "Regression-Discontinuity Analysis: A Survey of Recent Developments in Economics," LABOUR, CEIS, vol. 22(2), pages 219-245, June.
    68. Ho, Daniel E. & Imai, Kosuke & King, Gary & Stuart, Elizabeth A., 2007. "Matching as Nonparametric Preprocessing for Reducing Model Dependence in Parametric Causal Inference," Political Analysis, Cambridge University Press, vol. 15(3), pages 199-236, July.
    69. James Heckman & Hidehiko Ichimura & Jeffrey Smith & Petra Todd, 1998. "Characterizing Selection Bias Using Experimental Data," Econometrica, Econometric Society, vol. 66(5), pages 1017-1098, September.
    70. Bev Dahlby, 2008. "The Marginal Cost of Public Funds: Theory and Applications," MIT Press Books, The MIT Press, edition 1, volume 1, number 0262042509, December.
    71. Milligan, Kevin & Stabile, Mark, 2007. "The integration of child tax credits and welfare: Evidence from the Canadian National Child Benefit program," Journal of Public Economics, Elsevier, vol. 91(1-2), pages 305-326, February.
    72. Seán M. Muller, 2015. "Causal Interaction and External Validity: Obstacles to the Policy Relevance of Randomized Evaluations," World Bank Economic Review, World Bank Group, vol. 29(suppl_1), pages 217-225.
    73. James Heckman & Justin L. Tobias & Edward Vytlacil, 2001. "Four Parameters of Interest in the Evaluation of Social Programs," Southern Economic Journal, John Wiley & Sons, vol. 68(2), pages 210-223, October.
    74. Charles F. Manski, 2004. "Statistical Treatment Rules for Heterogeneous Populations," Econometrica, Econometric Society, vol. 72(4), pages 1221-1246, July.
    75. Bjorklund, Anders & Moffitt, Robert, 1987. "The Estimation of Wage Gains and Welfare Gains in Self-selection," The Review of Economics and Statistics, MIT Press, vol. 69(1), pages 42-49, February.
    76. Hum, Derek & Simpson, Wayne, 1993. "Economic Response to a Guaranteed Annual Income: Experience from Canada and the United States," Journal of Labor Economics, University of Chicago Press, vol. 11(1), pages 263-296, January.
    77. Peter Z. Schochet, "undated". "Technical Methods Report: Statistical Power for Regression Discontinuity Designs in Education Evaluations," Mathematica Policy Research Reports 61fb6c057561451a8a6074508, Mathematica Policy Research.
    78. Dan A. Black & Jeffrey A. Smith & Mark C. Berger & Brett J. Noel, 2003. "Is the Threat of Reemployment Services More Effective Than the Services Themselves? Evidence from Random Assignment in the UI System," American Economic Review, American Economic Association, vol. 93(4), pages 1313-1327, September.
    79. Joshua D. Angrist, 1998. "Estimating the Labor Market Impact of Voluntary Military Service Using Social Security Data on Military Applicants," Econometrica, Econometric Society, vol. 66(2), pages 249-288, March.
    80. Joseph Hotz, V. & Imbens, Guido W. & Mortimer, Julie H., 2005. "Predicting the efficacy of future training programs using past experiences at other locations," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 241-270.
    81. Heckman, James J, 1996. "Randomization as an Instrumental Variable: Notes," The Review of Economics and Statistics, MIT Press, vol. 78(2), pages 336-341, May.
    82. Joseph J. Doyle Jr., 2008. "Child Protection and Adult Crime: Using Investigator Assignment to Estimate Causal Effects of Foster Care," Journal of Political Economy, University of Chicago Press, vol. 116(4), pages 746-770, August.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Wang, Chao & Sweetman, Arthur, 2020. "Delisting eye examinations from public health insurance: Empirical evidence from Canada regarding impacts on patients and providers," Health Policy, Elsevier, vol. 124(5), pages 540-548.
    2. Marc F. Bellemare, 2018. "Contract farming: opportunity cost and trade-offs," Agricultural Economics, International Association of Agricultural Economists, vol. 49(3), pages 279-288, May.
    3. Zhang, Xue & Sweetman, Arthur, 2018. "Blended capitation and incentives: Fee codes inside and outside the capitated basket," Journal of Health Economics, Elsevier, vol. 60(C), pages 16-29.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    2. Martin Huber, 2019. "An introduction to flexible methods for policy evaluation," Papers 1910.00641, arXiv.org.
    3. Burt S. Barnow & Jeffrey Smith, 2015. "Employment and Training Programs," NBER Chapters, in: Economics of Means-Tested Transfer Programs in the United States, Volume 2, pages 127-234, National Bureau of Economic Research, Inc.
    4. Słoczyński, Tymon, 2012. "New Evidence on Linear Regression and Treatment Effect Heterogeneity," MPRA Paper 39524, University Library of Munich, Germany.
    5. Baum-Snow, Nathaniel & Ferreira, Fernando, 2015. "Causal Inference in Urban and Regional Economics," Handbook of Regional and Urban Economics, in: Gilles Duranton & J. V. Henderson & William C. Strange (ed.), Handbook of Regional and Urban Economics, edition 1, volume 5, chapter 0, pages 3-68, Elsevier.
    6. Chad D. Meyerhoefer & Muzhe Yang, 2011. "The Relationship between Food Assistance and Health: A Review of the Literature and Empirical Strategies for Identifying Program Effects," Applied Economic Perspectives and Policy, Agricultural and Applied Economics Association, vol. 33(3), pages 304-344.
    7. Dan A. Black & Joonhwi Joo & Robert LaLonde & Jeffrey Andrew Smith & Evan J. Taylor, 2017. "Simple Tests for Selection: Learning More from Instrumental Variables," CESifo Working Paper Series 6392, CESifo.
    8. Guido W. Imbens, 2010. "Better LATE Than Nothing: Some Comments on Deaton (2009) and Heckman and Urzua (2009)," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 399-423, June.
    9. Hunt Allcott, 2012. "Site Selection Bias in Program Evaluation," NBER Working Papers 18373, National Bureau of Economic Research, Inc.
    10. Caliendo, Marco & Mahlstedt, Robert & Mitnik, Oscar A., 2017. "Unobservable, but unimportant? The relevance of usually unobserved variables for the evaluation of labor market policies," Labour Economics, Elsevier, vol. 46(C), pages 14-25.
    11. Deaton, Angus & Cartwright, Nancy, 2018. "Understanding and misunderstanding randomized controlled trials," Social Science & Medicine, Elsevier, vol. 210(C), pages 2-21.
    12. Susan Athey & Guido W. Imbens, 2017. "The State of Applied Econometrics: Causality and Policy Evaluation," Journal of Economic Perspectives, American Economic Association, vol. 31(2), pages 3-32, Spring.
    13. Ferman, Bruno, 2021. "Matching estimators with few treated and many control observations," Journal of Econometrics, Elsevier, vol. 225(2), pages 295-307.
    14. van der Klaauw, Bas, 2014. "From micro data to causality: Forty years of empirical labor economics," Labour Economics, Elsevier, vol. 30(C), pages 88-97.
    15. James J. Heckman, 2005. "Micro Data, Heterogeneity and the Evaluation of Public Policy Part 2," The American Economist, Sage Publications, vol. 49(1), pages 16-44, March.
    16. Carlos A. Flores & Oscar A. Mitnik, 2009. "Evaluating Nonexperimental Estimators for Multiple Treatments: Evidence from Experimental Data," Working Papers 2010-10, University of Miami, Department of Economics.
    17. Smith, Jeffrey A. & Todd, Petra E., 2005. "Does matching overcome LaLonde's critique of nonexperimental estimators?," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 305-353.
    18. Huber, Martin & Lechner, Michael & Wunsch, Conny, 2013. "The performance of estimators based on the propensity score," Journal of Econometrics, Elsevier, vol. 175(1), pages 1-21.
    19. James J. Heckman, 2010. "Building Bridges between Structural and Program Evaluation Approaches to Evaluating Policy," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 356-398, June.
    20. Marco Caliendo & Steffen Künn, 2015. "Getting back into the labor market: the effects of start-up subsidies for unemployed females," Journal of Population Economics, Springer;European Society for Population Economics, vol. 28(4), pages 1005-1043, October.

    More about this item

    JEL classification:

    • C18 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Methodological Issues: General
    • C21 - Mathematical and Quantitative Methods - - Single Equation Models; Single Variables - - - Cross-Sectional Models; Spatial Models; Treatment Effect Models
    • C26 - Mathematical and Quantitative Methods - - Single Equation Models; Single Variables - - - Instrumental Variables (IV) Estimation
    • C50 - Mathematical and Quantitative Methods - - Econometric Modeling - - - General
    • C90 - Mathematical and Quantitative Methods - - Design of Experiments - - - General

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:wly:canjec:v:49:y:2016:i:3:p:871-905. See general information about how to correct material in RePEc.


    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form .

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Wiley Content Delivery (email available below). General contact details of provider: https://doi.org/10.1111/(ISSN)1540-5982 .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis . RePEc uses bibliographic data supplied by the respective publishers.