Printed from https://ideas.repec.org/p/zbw/glodps/336.html

What Is the Value Added by Using Causal Machine Learning Methods in a Welfare Experiment Evaluation?

Author

  • Strittmatter, Anthony

Abstract

Recent studies have proposed causal machine learning (CML) methods to estimate conditional average treatment effects (CATEs). In this study, I investigate whether CML methods add value compared to conventional CATE estimators by re-evaluating Connecticut's Jobs First welfare experiment. This experiment entails a mix of positive and negative work incentives. Previous studies show that it is hard to tackle the effect heterogeneity of Jobs First by means of CATEs. I report evidence that CML methods can provide support for the theoretical labor supply predictions. Furthermore, I document reasons why some conventional CATE estimators fail and discuss the limitations of CML methods.
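To make the abstract's central object concrete: a conditional average treatment effect (CATE) estimator predicts how a treatment effect varies with covariates. The sketch below is not the paper's method; it is a minimal T-learner using scikit-learn random forests on synthetic data (the data-generating process, variable names, and effect sizes are all invented for illustration).

```python
# Minimal T-learner sketch for CATEs: fit separate outcome models on treated
# and control units, then take the difference of their predictions at each x.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 2000
X = rng.normal(size=(n, 3))                   # covariates
T = rng.integers(0, 2, size=n)                # randomized treatment indicator
tau = 1.0 + 0.5 * X[:, 0]                     # true heterogeneous effect
Y = X[:, 1] + T * tau + rng.normal(scale=0.1, size=n)  # observed outcome

m1 = RandomForestRegressor(random_state=0).fit(X[T == 1], Y[T == 1])
m0 = RandomForestRegressor(random_state=0).fit(X[T == 0], Y[T == 0])
cate = m1.predict(X) - m0.predict(X)          # estimated CATE at each x

print(round(float(cate.mean()), 2))           # close to the true average effect of 1.0
```

Averaging the estimated CATEs recovers roughly the average treatment effect, while their variation across covariates is what CML methods exploit to probe effect heterogeneity, which is the question the paper investigates for Jobs First.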

Suggested Citation

  • Strittmatter, Anthony, 2019. "What Is the Value Added by Using Causal Machine Learning Methods in a Welfare Experiment Evaluation?," GLO Discussion Paper Series 336, Global Labor Organization (GLO).
  • Handle: RePEc:zbw:glodps:336

    Download full text from publisher

    File URL: https://www.econstor.eu/bitstream/10419/194352/1/GLO-DP-0336.pdf
    Download Restriction: no
    ---><---


    References listed on IDEAS

    1. Sokbae Lee & Ryo Okui & Yoon-Jae Whang, 2017. "Doubly robust uniform confidence band for the conditional average treatment effect function," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 32(7), pages 1207-1225, November.
    2. Marianne P. Bitler & Jonah B. Gelbach & Hilary W. Hoynes, 2006. "What Mean Impacts Miss: Distributional Effects of Welfare Reform Experiments," American Economic Review, American Economic Association, vol. 96(4), pages 988-1012, September.
    3. Patrick Kline & Melissa Tartari, 2016. "Bounding the Labor Supply Responses to a Randomized Welfare Experiment: A Revealed Preference Approach," American Economic Review, American Economic Association, vol. 106(4), pages 972-1014, April.
    4. Sergio Firpo, 2007. "Efficient Semiparametric Estimation of Quantile Treatment Effects," Econometrica, Econometric Society, vol. 75(1), pages 259-276, January.
    5. Victor Chernozhukov & Denis Chetverikov & Mert Demirer & Esther Duflo & Christian Hansen & Whitney Newey & James Robins, 2018. "Double/debiased machine learning for treatment and structural parameters," Econometrics Journal, Royal Economic Society, vol. 21(1), pages 1-68, February.
    6. Alexandre Belloni & Victor Chernozhukov & Christian Hansen, 2014. "High-Dimensional Methods and Inference on Structural and Treatment Effects," Journal of Economic Perspectives, American Economic Association, vol. 28(2), pages 29-50, Spring.
    7. Garry F. Barrett & Stephen G. Donald, 2003. "Consistent Tests for Stochastic Dominance," Econometrica, Econometric Society, vol. 71(1), pages 71-104, January.
    8. Victor Chernozhukov & Mert Demirer & Esther Duflo & Iván Fernández-Val, 2018. "Generic Machine Learning Inference on Heterogeneous Treatment Effects in Randomized Experiments, with an Application to Immunization in India," NBER Working Papers 24678, National Bureau of Economic Research, Inc.
    9. James J. Heckman & Jeffrey Smith & Nancy Clements, 1997. "Making The Most Out Of Programme Evaluations and Social Experiments: Accounting For Heterogeneity in Programme Impacts," Review of Economic Studies, Oxford University Press, vol. 64(4), pages 487-535.
    10. Shuai Chen & Lu Tian & Tianxi Cai & Menggang Yu, 2017. "A general statistical framework for subgroup identification and comparative treatment scoring," Biometrics, The International Biometric Society, vol. 73(4), pages 1199-1209, December.
    11. Victor Chernozhukov & Iván Fernández‐Val & Ye Luo, 2018. "The Sorted Effects Method: Discovering Heterogeneous Effects Beyond Their Averages," Econometrica, Econometric Society, vol. 86(6), pages 1911-1938, November.
    12. Stefan Wager & Susan Athey, 2018. "Estimation and Inference of Heterogeneous Treatment Effects using Random Forests," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 113(523), pages 1228-1242, July.
    13. Michael Lechner & Anthony Strittmatter, 2019. "Practical procedures to deal with common support problems in matching estimation," Econometric Reviews, Taylor & Francis Journals, vol. 38(2), pages 193-207, February.
    14. Xinkun Nie & Stefan Wager, 2017. "Quasi-Oracle Estimation of Heterogeneous Treatment Effects," Papers 1712.04912, arXiv.org, revised Aug 2020.
    15. Alberto Abadie & Matthew M. Chingos & Martin R. West, 2018. "Endogenous Stratification in Randomized Experiments," The Review of Economics and Statistics, MIT Press, vol. 100(4), pages 567-580, October.
    16. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    17. Michael C. Knaus & Michael Lechner & Anthony Strittmatter, 2022. "Heterogeneous Employment Effects of Job Search Programs: A Machine Learning Approach," Journal of Human Resources, University of Wisconsin Press, vol. 57(2), pages 597-636.
    18. Susan Athey, 2018. "The Impact of Machine Learning on Economics," NBER Chapters, in: The Economics of Artificial Intelligence: An Agenda, pages 507-547, National Bureau of Economic Research, Inc.
    19. Jonathan M.V. Davis & Sara B. Heller, 2017. "Using Causal Forests to Predict Treatment Heterogeneity: An Application to Summer Jobs," American Economic Review, American Economic Association, vol. 107(5), pages 546-550, May.
    20. Victor Chernozhukov & Mert Demirer & Esther Duflo & Iván Fernández-Val, 2017. "Generic machine learning inference on heterogenous treatment effects in randomized experiments," CeMMAP working papers CWP61/17, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
    21. Anderson, Gordon, 1996. "Nonparametric Tests of Stochastic Dominance in Income Distributions," Econometrica, Econometric Society, vol. 64(5), pages 1183-1193, September.
    22. Matt Taddy & Matt Gardner & Liyun Chen & David Draper, 2016. "A Nonparametric Bayesian Analysis of Heterogenous Treatment Effects in Digital Experimentation," Journal of Business & Economic Statistics, Taylor & Francis Journals, vol. 34(4), pages 661-672, October.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Lechner, Michael, 2018. "Modified Causal Forests for Estimating Heterogeneous Causal Effects," IZA Discussion Papers 12040, Institute of Labor Economics (IZA).
    2. Anna Baiardi & Andrea A. Naghi, 2021. "The Value Added of Machine Learning to Causal Inference: Evidence from Revisited Studies," Tinbergen Institute Discussion Papers 21-001/V, Tinbergen Institute.
    3. Knaus, Michael C. & Lechner, Michael & Strittmatter, Anthony, 2018. "Machine Learning Estimation of Heterogeneous Causal Effects: Empirical Monte Carlo Evidence," IZA Discussion Papers 12039, Institute of Labor Economics (IZA).
    4. Anna Baiardi & Andrea A. Naghi, 2021. "The Value Added of Machine Learning to Causal Inference: Evidence from Revisited Studies," Papers 2101.00878, arXiv.org.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Michael C. Knaus & Michael Lechner & Anthony Strittmatter, 2018. "Machine Learning Estimation of Heterogeneous Causal Effects: Empirical Monte Carlo Evidence," Papers 1810.13237, arXiv.org, revised Dec 2018.
    2. Lechner, Michael, 2018. "Modified Causal Forests for Estimating Heterogeneous Causal Effects," IZA Discussion Papers 12040, Institute of Labor Economics (IZA).
    3. Michael C. Knaus, 2021. "A double machine learning approach to estimate the effects of musical practice on student’s skills," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 184(1), pages 282-300, January.
    4. Michael C. Knaus, 2020. "Double Machine Learning based Program Evaluation under Unconfoundedness," Papers 2003.03191, arXiv.org, revised Jun 2022.
    5. Daniel Goller & Tamara Harrer & Michael Lechner & Joachim Wolff, 2021. "Active labour market policies for the long-term unemployed: New evidence from causal machine learning," Papers 2106.10141, arXiv.org.
    6. Smith, Jeffrey A., 2022. "Treatment Effect Heterogeneity," IZA Discussion Papers 15151, Institute of Labor Economics (IZA).
    7. Nathan Kallus, 2022. "Treatment Effect Risk: Bounds and Inference," Papers 2201.05893, arXiv.org, revised Jul 2022.
    8. Davide Viviano & Jelena Bradic, 2019. "Synthetic learner: model-free inference on treatments over time," Papers 1904.01490, arXiv.org, revised Aug 2022.
    9. Michael C. Knaus & Michael Lechner & Anthony Strittmatter, 2022. "Heterogeneous Employment Effects of Job Search Programs: A Machine Learning Approach," Journal of Human Resources, University of Wisconsin Press, vol. 57(2), pages 597-636.
    10. Sokbae Lee & Yoon-Jae Whang, 2009. "Nonparametric Tests of Conditional Treatment Effects," Cowles Foundation Discussion Papers 1740, Cowles Foundation for Research in Economics, Yale University.
    11. Callaway, Brantly, 2021. "Bounds on distributional treatment effect parameters using panel data with an application on job displacement," Journal of Econometrics, Elsevier, vol. 222(2), pages 861-881.
    12. Gabriel Okasa, 2022. "Meta-Learners for Estimation of Causal Effects: Finite Sample Cross-Fit Performance," Papers 2201.12692, arXiv.org.
    13. Sant’Anna, Pedro H.C. & Zhao, Jun, 2020. "Doubly robust difference-in-differences estimators," Journal of Econometrics, Elsevier, vol. 219(1), pages 101-122.
    14. Cockx, Bart & Lechner, Michael & Bollens, Joost, 2019. "Priority to Unemployed Immigrants? A Causal Machine Learning Evaluation of Training in Belgium," IZA Discussion Papers 12875, Institute of Labor Economics (IZA).
    15. Buhl-Wiggers, Julie & Kerwin, Jason & Muñoz, Juan Sebastián & Smith, Jeffrey A. & Thornton, Rebecca L., 2020. "Some Children Left Behind: Variation in the Effects of an Educational Intervention," IZA Discussion Papers 13598, Institute of Labor Economics (IZA).
    16. Burgess, Simon & Metcalfe, Robert & Sadoff, Sally, 2021. "Understanding the response to financial and non-financial incentives in education: Field experimental evidence using high-stakes assessments," Economics of Education Review, Elsevier, vol. 85(C).
    17. Daniel Boller & Michael Lechner & Gabriel Okasa, 2021. "The Effect of Sport in Online Dating: Evidence from Causal Machine Learning," Papers 2104.04601, arXiv.org.
    18. Rahul Singh & Liyuan Xu & Arthur Gretton, 2020. "Kernel Methods for Causal Functions: Dose, Heterogeneous, and Incremental Response Curves," Papers 2010.04855, arXiv.org, revised Aug 2022.
    19. Donald, Stephen G. & Hsu, Yu-Chin, 2014. "Estimation and inference for distribution functions and quantile functions in treatment effect models," Journal of Econometrics, Elsevier, vol. 178(P3), pages 383-397.
    20. Hartley, Robert Paul & Lamarche, Carlos, 2018. "Behavioral responses and welfare reform: Evidence from a randomized experiment," Labour Economics, Elsevier, vol. 54(C), pages 135-151.

    More about this item

    Keywords

    Labor supply; individualized treatment effects; conditional average treatment effects; random forest.

    JEL classification:

    • H75 - Public Economics - - State and Local Government; Intergovernmental Relations - - - State and Local Government: Health, Education, and Welfare
    • I38 - Health, Education, and Welfare - - Welfare, Well-Being, and Poverty - - - Government Programs; Provision and Effects of Welfare Programs
    • J22 - Labor and Demographic Economics - - Demand and Supply of Labor - - - Time Allocation and Labor Supply
    • J31 - Labor and Demographic Economics - - Wages, Compensation, and Labor Costs - - - Wage Level and Structure; Wage Differentials
    • C21 - Mathematical and Quantitative Methods - - Single Equation Models; Single Variables - - - Cross-Sectional Models; Spatial Models; Treatment Effect Models


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:zbw:glodps:336. See general information about how to correct material in RePEc.


    If you have authored this item and are not yet registered with RePEc, we encourage you to register. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: ZBW - Leibniz Information Centre for Economics (email available below). General contact details of provider: https://edirc.repec.org/data/glabode.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis. RePEc uses bibliographic data supplied by the respective publishers.