
What Is the Value Added by Using Causal Machine Learning Methods in a Welfare Experiment Evaluation?

Author

Listed:
  • Anthony Strittmatter

Abstract

Recent studies have proposed causal machine learning (CML) methods to estimate conditional average treatment effects (CATEs). In this study, I investigate whether CML methods add value compared to conventional CATE estimators by re-evaluating Connecticut's Jobs First welfare experiment. This experiment entails a mix of positive and negative work incentives. Previous studies show that it is difficult to capture the effect heterogeneity of Jobs First using CATEs. I report evidence that CML methods can provide support for the theoretical labor supply predictions. Furthermore, I document reasons why some conventional CATE estimators fail and discuss the limitations of CML methods.
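
As a rough, non-authoritative illustration of the kind of CATE estimation the abstract refers to, the sketch below implements a simple doubly robust (DR-learner-style) CATE estimator on simulated randomized-experiment data with a known randomization probability, using only scikit-learn forests. The data-generating process, variable names, and tuning choices are assumptions for illustration; this is not the estimator used in the paper, which compares causal machine learning methods such as causal forests against conventional CATE estimators.

    # Illustrative sketch only (not the paper's implementation): DR-learner-style
    # CATE estimation on simulated randomized-experiment data with scikit-learn.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)
    n, p = 4000, 5
    X = rng.normal(size=(n, p))                   # pre-treatment covariates
    e = 0.5                                       # known randomization probability
    W = rng.binomial(1, e, size=n)                # random treatment assignment
    tau = 1.0 * (X[:, 0] > 0)                     # true heterogeneous effect (assumed DGP)
    Y = X[:, 1] + tau * W + rng.normal(size=n)    # observed outcome

    # Step 1: outcome regressions fitted separately for treated and control units.
    mu1 = RandomForestRegressor(n_estimators=200, min_samples_leaf=20, random_state=0)
    mu0 = RandomForestRegressor(n_estimators=200, min_samples_leaf=20, random_state=0)
    mu1.fit(X[W == 1], Y[W == 1])
    mu0.fit(X[W == 0], Y[W == 0])

    # Step 2: doubly robust (AIPW) pseudo-outcomes using the known propensity e.
    m1, m0 = mu1.predict(X), mu0.predict(X)
    psi = m1 - m0 + W * (Y - m1) / e - (1 - W) * (Y - m0) / (1 - e)

    # Step 3: regress the pseudo-outcomes on covariates to estimate the CATE function.
    # (A production version would cross-fit Steps 1-3 to avoid overfitting bias.)
    cate_model = RandomForestRegressor(n_estimators=200, min_samples_leaf=20, random_state=0)
    cate_model.fit(X, psi)
    cate_hat = cate_model.predict(X)

    print("mean estimated CATE (true ATE is 0.5):", round(cate_hat.mean(), 3))

In this simulated design, plotting cate_hat against X[:, 0] should recover the step in the treatment effect at zero, which is the kind of covariate-driven heterogeneity the paper probes in the Jobs First data.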

Suggested Citation

  • Anthony Strittmatter, 2018. "What Is the Value Added by Using Causal Machine Learning Methods in a Welfare Experiment Evaluation?," Papers 1812.06533, arXiv.org, revised Dec 2021.
  • Handle: RePEc:arx:papers:1812.06533

    Download full text from publisher

    File URL: http://arxiv.org/pdf/1812.06533
    File Function: Latest version
    Download Restriction: no


    References listed on IDEAS

    1. Alexandre Belloni & Victor Chernozhukov & Christian Hansen, 2014. "High-Dimensional Methods and Inference on Structural and Treatment Effects," Journal of Economic Perspectives, American Economic Association, vol. 28(2), pages 29-50, Spring.
    2. Victor Chernozhukov & Iván Fernández‐Val & Ye Luo, 2018. "The Sorted Effects Method: Discovering Heterogeneous Effects Beyond Their Averages," Econometrica, Econometric Society, vol. 86(6), pages 1911-1938, November.
    3. Sokbae Lee & Ryo Okui & Yoon-Jae Whang, 2017. "Doubly robust uniform confidence band for the conditional average treatment effect function," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 32(7), pages 1207-1225, November.
    4. Marianne P. Bitler & Jonah B. Gelbach & Hilary W. Hoynes, 2006. "What Mean Impacts Miss: Distributional Effects of Welfare Reform Experiments," American Economic Review, American Economic Association, vol. 96(4), pages 988-1012, September.
    5. Patrick Kline & Melissa Tartari, 2016. "Bounding the Labor Supply Responses to a Randomized Welfare Experiment: A Revealed Preference Approach," American Economic Review, American Economic Association, vol. 106(4), pages 972-1014, April.
    6. Stefan Wager & Susan Athey, 2018. "Estimation and Inference of Heterogeneous Treatment Effects using Random Forests," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 113(523), pages 1228-1242, July.
    7. Sergio Firpo, 2007. "Efficient Semiparametric Estimation of Quantile Treatment Effects," Econometrica, Econometric Society, vol. 75(1), pages 259-276, January.
    8. Victor Chernozhukov & Denis Chetverikov & Mert Demirer & Esther Duflo & Christian Hansen & Whitney Newey & James Robins, 2018. "Double/debiased machine learning for treatment and structural parameters," Econometrics Journal, Royal Economic Society, vol. 21(1), pages 1-68, February.
    9. Alberto Abadie & Matthew M. Chingos & Martin R. West, 2018. "Endogenous Stratification in Randomized Experiments," The Review of Economics and Statistics, MIT Press, vol. 100(4), pages 567-580, October.
    10. Michael Lechner & Anthony Strittmatter, 2019. "Practical procedures to deal with common support problems in matching estimation," Econometric Reviews, Taylor & Francis Journals, vol. 38(2), pages 193-207, February.
    11. Garry F. Barrett & Stephen G. Donald, 2003. "Consistent Tests for Stochastic Dominance," Econometrica, Econometric Society, vol. 71(1), pages 71-104, January.
    12. Victor Chernozhukov & Mert Demirer & Esther Duflo & Iván Fernández-Val, 2018. "Generic Machine Learning Inference on Heterogeneous Treatment Effects in Randomized Experiments, with an Application to Immunization in India," NBER Working Papers 24678, National Bureau of Economic Research, Inc.
    13. James J. Heckman & Jeffrey Smith & Nancy Clements, 1997. "Making The Most Out Of Programme Evaluations and Social Experiments: Accounting For Heterogeneity in Programme Impacts," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 64(4), pages 487-535.
    14. Shuai Chen & Lu Tian & Tianxi Cai & Menggang Yu, 2017. "A general statistical framework for subgroup identification and comparative treatment scoring," Biometrics, The International Biometric Society, vol. 73(4), pages 1199-1209, December.
    15. Michael C. Knaus & Michael Lechner & Anthony Strittmatter, 2022. "Heterogeneous Employment Effects of Job Search Programs: A Machine Learning Approach," Journal of Human Resources, University of Wisconsin Press, vol. 57(2), pages 597-636.
    16. Victor Chernozhukov & Mert Demirer & Esther Duflo & Iván Fernández-Val, 2017. "Fisher-Schultz Lecture: Generic Machine Learning Inference on Heterogenous Treatment Effects in Randomized Experiments, with an Application to Immunization in India," Papers 1712.04802, arXiv.org, revised Oct 2023.
    17. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    18. Anderson, Gordon, 1996. "Nonparametric Tests of Stochastic Dominance in Income Distributions," Econometrica, Econometric Society, vol. 64(5), pages 1183-1193, September.
    19. Xinkun Nie & Stefan Wager, 2017. "Quasi-Oracle Estimation of Heterogeneous Treatment Effects," Papers 1712.04912, arXiv.org, revised Aug 2020.
    20. Matt Taddy & Matt Gardner & Liyun Chen & David Draper, 2016. "A Nonparametric Bayesian Analysis of Heterogenous Treatment Effects in Digital Experimentation," Journal of Business & Economic Statistics, Taylor & Francis Journals, vol. 34(4), pages 661-672, October.
    21. Susan Athey, 2018. "The Impact of Machine Learning on Economics," NBER Chapters, in: The Economics of Artificial Intelligence: An Agenda, pages 507-547, National Bureau of Economic Research, Inc.
    22. Jonathan M.V. Davis & Sara B. Heller, 2017. "Using Causal Forests to Predict Treatment Heterogeneity: An Application to Summer Jobs," American Economic Review, American Economic Association, vol. 107(5), pages 546-550, May.

    Citations

    Citations are extracted by the CitEc Project.

    Cited by:

    1. Lechner, Michael, 2018. "Modified Causal Forests for Estimating Heterogeneous Causal Effects," IZA Discussion Papers 12040, Institute of Labor Economics (IZA).
    2. Michael C Knaus & Michael Lechner & Anthony Strittmatter, 2021. "Machine learning estimation of heterogeneous causal effects: Empirical Monte Carlo evidence," The Econometrics Journal, Royal Economic Society, vol. 24(1), pages 134-161.
    3. Anna Baiardi & Andrea A. Naghi, 2021. "The Value Added of Machine Learning to Causal Inference: Evidence from Revisited Studies," Tinbergen Institute Discussion Papers 21-001/V, Tinbergen Institute.
    4. Anna Baiardi & Andrea A. Naghi, 2021. "The Value Added of Machine Learning to Causal Inference: Evidence from Revisited Studies," Papers 2101.00878, arXiv.org.
    5. Quigley, David T. & Che, Yuyuan & Yasar, Mahmut & Rejesus, Roderick M., 2023. "Cover Crop Adoption and Climate Risks: An Application of Causal Random Forests," 2023 Annual Meeting, July 23-25, Washington D.C. 335586, Agricultural and Applied Economics Association.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Strittmatter, Anthony, 2023. "What is the value added by using causal machine learning methods in a welfare experiment evaluation?," Labour Economics, Elsevier, vol. 84(C).
    2. Michael C Knaus & Michael Lechner & Anthony Strittmatter, 2021. "Machine learning estimation of heterogeneous causal effects: Empirical Monte Carlo evidence," The Econometrics Journal, Royal Economic Society, vol. 24(1), pages 134-161.
    3. Lechner, Michael, 2018. "Modified Causal Forests for Estimating Heterogeneous Causal Effects," IZA Discussion Papers 12040, Institute of Labor Economics (IZA).
    4. Michael Lechner & Jana Mareckova, 2022. "Modified Causal Forest," Papers 2209.03744, arXiv.org.
    5. Michael C. Knaus, 2021. "A double machine learning approach to estimate the effects of musical practice on student’s skills," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 184(1), pages 282-300, January.
    6. Michael C Knaus, 2022. "Double machine learning-based programme evaluation under unconfoundedness [Econometric methods for program evaluation]," The Econometrics Journal, Royal Economic Society, vol. 25(3), pages 602-627.
    7. Goller, Daniel & Harrer, Tamara & Lechner, Michael & Wolff, Joachim, 2021. "Active labour market policies for the long-term unemployed: New evidence from causal machine learning," Economics Working Paper Series 2108, University of St. Gallen, School of Economics and Political Science.
    8. Nathan Kallus, 2022. "Treatment Effect Risk: Bounds and Inference," Papers 2201.05893, arXiv.org, revised Jul 2022.
    9. Jeffrey Smith, 2022. "Treatment Effect Heterogeneity," Evaluation Review, vol. 46(5), pages 652-677, October.
    10. Davide Viviano & Jelena Bradic, 2019. "Synthetic learner: model-free inference on treatments over time," Papers 1904.01490, arXiv.org, revised Aug 2022.
    11. Nathan Kallus, 2023. "Treatment Effect Risk: Bounds and Inference," Management Science, INFORMS, vol. 69(8), pages 4579-4590, August.
    12. Callaway, Brantly, 2021. "Bounds on distributional treatment effect parameters using panel data with an application on job displacement," Journal of Econometrics, Elsevier, vol. 222(2), pages 861-881.
    13. Sasaki, Yuya & Ura, Takuya, 2023. "Estimation and inference for policy relevant treatment effects," Journal of Econometrics, Elsevier, vol. 234(2), pages 394-450.
    14. Michael C. Knaus & Michael Lechner & Anthony Strittmatter, 2022. "Heterogeneous Employment Effects of Job Search Programs: A Machine Learning Approach," Journal of Human Resources, University of Wisconsin Press, vol. 57(2), pages 597-636.
    15. Sokbae Lee & Yoon-Jae Whang, 2009. "Nonparametric Tests of Conditional Treatment Effects," Cowles Foundation Discussion Papers 1740, Cowles Foundation for Research in Economics, Yale University.
    16. Marianne Bertrand & Bruno Crépon & Alicia Marguerie & Patrick Premand, 2021. "Do Workfare Programs Live Up to Their Promises? Experimental Evidence from Cote D’Ivoire," NBER Working Papers 28664, National Bureau of Economic Research, Inc.
    17. Nora Bearth & Michael Lechner, 2024. "Causal Machine Learning for Moderation Effects," Papers 2401.08290, arXiv.org, revised Apr 2024.
    18. Gabriel Okasa, 2022. "Meta-Learners for Estimation of Causal Effects: Finite Sample Cross-Fit Performance," Papers 2201.12692, arXiv.org.
    19. Alexandre Belloni & Victor Chernozhukov & Denis Chetverikov & Christian Hansen & Kengo Kato, 2018. "High-dimensional econometrics and regularized GMM," CeMMAP working papers CWP35/18, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
    20. Nicolaj N. Mühlbach, 2020. "Tree-based Synthetic Control Methods: Consequences of moving the US Embassy," CREATES Research Papers 2020-04, Department of Economics and Business Economics, Aarhus University.

    More about this item

    JEL classification:

    • H75 - Public Economics - - State and Local Government; Intergovernmental Relations - - - State and Local Government: Health, Education, and Welfare
    • I38 - Health, Education, and Welfare - - Welfare, Well-Being, and Poverty - - - Government Programs; Provision and Effects of Welfare Programs
    • J22 - Labor and Demographic Economics - - Demand and Supply of Labor - - - Time Allocation and Labor Supply
    • J31 - Labor and Demographic Economics - - Wages, Compensation, and Labor Costs - - - Wage Level and Structure; Wage Differentials
    • C21 - Mathematical and Quantitative Methods - - Single Equation Models; Single Variables - - - Cross-Sectional Models; Spatial Models; Treatment Effect Models

