
What Is the Value Added by Using Causal Machine Learning Methods in a Welfare Experiment Evaluation?

Author

Listed:
  • Anthony Strittmatter

Abstract

Recent studies have proposed causal machine learning (CML) methods to estimate conditional average treatment effects (CATEs). In this study, I investigate whether CML methods add value compared to conventional CATE estimators by re-evaluating Connecticut's Jobs First welfare experiment. This experiment entails a mix of positive and negative work incentives. Previous studies show that it is difficult to capture the effect heterogeneity of Jobs First with CATEs. I report evidence that CML methods can support the theoretical labor-supply predictions. Furthermore, I document reasons why some conventional CATE estimators fail and discuss the limitations of CML methods.
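For readers unfamiliar with the target parameter, the CATE is the average treatment effect conditional on covariates, tau(x) = E[Y(1) - Y(0) | X = x]. The sketch below contrasts a conventional CATE estimator (OLS with treatment-covariate interactions) with a simple machine-learning alternative (a T-learner built from two random forests) on simulated data. It is a minimal illustration only, not the estimators or data used in the paper; the data-generating process, the variable names, and the use of scikit-learn are assumptions introduced here.

```python
# Illustrative comparison of a conventional CATE estimator and a simple
# machine-learning alternative on simulated data from a randomized design.
# NOTE: the data-generating process, variable names, and the scikit-learn
# T-learner are assumptions for illustration; they are not the estimators
# evaluated in the paper.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Simulated stand-ins: covariates X, random treatment assignment D,
# and an outcome Y (e.g., earnings) with a heterogeneous effect tau(x).
n, p = 2000, 5
X = rng.normal(size=(n, p))
D = rng.integers(0, 2, size=n)
tau = 0.5 * X[:, 0]                          # true CATE depends on X[:, 0]
Y = X @ rng.normal(size=p) + D * tau + rng.normal(size=n)

# Conventional estimator: OLS with full treatment-covariate interactions.
# The implied CATE at x is the treatment coefficient plus x times the
# interaction coefficients.
Z = np.column_stack([D, X, D[:, None] * X])
ols = LinearRegression().fit(Z, Y)
cate_ols = ols.coef_[0] + X @ ols.coef_[1 + p:]

# Machine-learning alternative: a T-learner, i.e., separate outcome models
# for treated and control observations; the CATE is the difference in
# their predictions.
m1 = RandomForestRegressor(n_estimators=200, random_state=0).fit(X[D == 1], Y[D == 1])
m0 = RandomForestRegressor(n_estimators=200, random_state=0).fit(X[D == 0], Y[D == 0])
cate_t = m1.predict(X) - m0.predict(X)

print("corr(OLS CATE, true tau):      ", round(np.corrcoef(cate_ols, tau)[0, 1], 3))
print("corr(T-learner CATE, true tau):", round(np.corrcoef(cate_t, tau)[0, 1], 3))
```

Because treatment is randomly assigned in an experiment such as Jobs First, both approaches target the same tau(x); they differ in how flexibly they model the heterogeneity, which is the comparison the paper investigates.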

Suggested Citation

  • Anthony Strittmatter, 2018. "What Is the Value Added by Using Causal Machine Learning Methods in a Welfare Experiment Evaluation?," Papers 1812.06533, arXiv.org, revised Mar 2019.
  • Handle: RePEc:arx:papers:1812.06533

    Download full text from publisher

    File URL: http://arxiv.org/pdf/1812.06533
    File Function: Latest version
    Download Restriction: no


    References listed on IDEAS

    1. Victor Chernozhukov & Iván Fernández‐Val & Ye Luo, 2018. "The Sorted Effects Method: Discovering Heterogeneous Effects Beyond Their Averages," Econometrica, Econometric Society, vol. 86(6), pages 1911-1938, November.
    2. Sokbae Lee & Ryo Okui & Yoon-Jae Whang, 2017. "Doubly robust uniform confidence band for the conditional average treatment effect function," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 32(7), pages 1207-1225, November.
    3. Patrick Kline & Melissa Tartari, 2016. "Bounding the Labor Supply Responses to a Randomized Welfare Experiment: A Revealed Preference Approach," American Economic Review, American Economic Association, vol. 106(4), pages 972-1014, April.
    4. Victor Chernozhukov & Mert Demirer & Esther Duflo & Iván Fernández-Val, 2017. "Generic Machine Learning Inference on Heterogenous Treatment Effects in Randomized Experiments," Papers 1712.04802, arXiv.org, revised Dec 2020.
    5. Stefan Wager & Susan Athey, 2018. "Estimation and Inference of Heterogeneous Treatment Effects using Random Forests," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 113(523), pages 1228-1242, July.
    6. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    7. Xinkun Nie & Stefan Wager, 2017. "Quasi-Oracle Estimation of Heterogeneous Treatment Effects," Papers 1712.04912, arXiv.org, revised Aug 2020.
    8. Sergio Firpo, 2007. "Efficient Semiparametric Estimation of Quantile Treatment Effects," Econometrica, Econometric Society, vol. 75(1), pages 259-276, January.
    9. Victor Chernozhukov & Denis Chetverikov & Mert Demirer & Esther Duflo & Christian Hansen & Whitney Newey & James Robins, 2018. "Double/debiased machine learning for treatment and structural parameters," Econometrics Journal, Royal Economic Society, vol. 21(1), pages 1-68, February.
    10. Michael Lechner & Anthony Strittmatter, 2019. "Practical procedures to deal with common support problems in matching estimation," Econometric Reviews, Taylor & Francis Journals, vol. 38(2), pages 193-207, February.
    11. Alberto Abadie & Matthew M. Chingos & Martin R. West, 2018. "Endogenous Stratification in Randomized Experiments," The Review of Economics and Statistics, MIT Press, vol. 100(4), pages 567-580, October.
    12. Matt Taddy & Matt Gardner & Liyun Chen & David Draper, 2016. "A Nonparametric Bayesian Analysis of Heterogenous Treatment Effects in Digital Experimentation," Journal of Business & Economic Statistics, Taylor & Francis Journals, vol. 34(4), pages 661-672, October.
    13. Garry F. Barrett & Stephen G. Donald, 2003. "Consistent Tests for Stochastic Dominance," Econometrica, Econometric Society, vol. 71(1), pages 71-104, January.
    14. Victor Chernozhukov & Mert Demirer & Esther Duflo & Iván Fernández-Val, 2018. "Generic Machine Learning Inference on Heterogeneous Treatment Effects in Randomized Experiments, with an Application to Immunization in India," NBER Working Papers 24678, National Bureau of Economic Research, Inc.
    15. James J. Heckman & Jeffrey Smith & Nancy Clements, 1997. "Making The Most Out Of Programme Evaluations and Social Experiments: Accounting For Heterogeneity in Programme Impacts," Review of Economic Studies, Oxford University Press, vol. 64(4), pages 487-535.
    16. Shuai Chen & Lu Tian & Tianxi Cai & Menggang Yu, 2017. "A general statistical framework for subgroup identification and comparative treatment scoring," Biometrics, The International Biometric Society, vol. 73(4), pages 1199-1209, December.
    17. Michael Knaus & Michael Lechner & Anthony Strittmatter, 2017. "Heterogeneous Employment Effects of Job Search Programmes: A Machine Learning Approach," Papers 1709.10279, arXiv.org, revised May 2018.
    18. Jonathan M.V. Davis & Sara B. Heller, 2017. "Using Causal Forests to Predict Treatment Heterogeneity: An Application to Summer Jobs," American Economic Review, American Economic Association, vol. 107(5), pages 546-550, May.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Lechner, Michael, 2018. "Modified Causal Forests for Estimating Heterogeneous Causal Effects," IZA Discussion Papers 12040, Institute of Labor Economics (IZA).
    2. Michael C Knaus & Michael Lechner & Anthony Strittmatter, 2021. "Machine learning estimation of heterogeneous causal effects: Empirical Monte Carlo evidence," Econometrics Journal, Royal Economic Society, vol. 24(1), pages 134-161.
    3. Anna Baiardi & Andrea A. Naghi, 2021. "The Value Added of Machine Learning to Causal Inference: Evidence from Revisited Studies," Papers 2101.00878, arXiv.org.
    4. Anna Baiardi & Andrea A. Naghi, 2021. "The Value Added of Machine Learning to Causal Inference: Evidence from Revisited Studies," Tinbergen Institute Discussion Papers 21-001/V, Tinbergen Institute.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Lechner, Michael, 2018. "Modified Causal Forests for Estimating Heterogeneous Causal Effects," IZA Discussion Papers 12040, Institute of Labor Economics (IZA).
    2. Michael C Knaus & Michael Lechner & Anthony Strittmatter, 2021. "Machine learning estimation of heterogeneous causal effects: Empirical Monte Carlo evidence," Econometrics Journal, Royal Economic Society, vol. 24(1), pages 134-161.
    3. Knaus, Michael C., 2020. "Double Machine Learning based Program Evaluation under Unconfoundedness," Economics Working Paper Series 2004, University of St. Gallen, School of Economics and Political Science.
    4. Alexandre Belloni & Victor Chernozhukov & Denis Chetverikov & Christian Hansen & Kengo Kato, 2018. "High-dimensional econometrics and regularized GMM," CeMMAP working papers CWP35/18, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
    5. Nicolaj N. Mühlbach, 2020. "Tree-based Synthetic Control Methods: Consequences of moving the US Embassy," CREATES Research Papers 2020-04, Department of Economics and Business Economics, Aarhus University.
    6. Sant’Anna, Pedro H.C. & Zhao, Jun, 2020. "Doubly robust difference-in-differences estimators," Journal of Econometrics, Elsevier, vol. 219(1), pages 101-122.
    7. Michael Zimmert & Michael Lechner, 2019. "Nonparametric estimation of causal heterogeneity under high-dimensional confounding," Papers 1908.08779, arXiv.org.
    8. Qingliang Fan & Yu-Chin Hsu & Robert P. Lieli & Yichong Zhang, 2019. "Estimation of Conditional Average Treatment Effects with High-Dimensional Data," Papers 1908.02399, arXiv.org, revised Jul 2021.
    9. Zongwu Cai & Ying Fang & Ming Lin & Shengfang Tang, 2020. "Inferences for Partially Conditional Quantile Treatment Effect Model," WORKING PAPERS SERIES IN THEORETICAL AND APPLIED ECONOMICS 202005, University of Kansas, Department of Economics, revised Feb 2020.
    10. Callaway, Brantly, 2021. "Bounds on distributional treatment effect parameters using panel data with an application on job displacement," Journal of Econometrics, Elsevier, vol. 222(2), pages 861-881.
    11. Victor Chernozhukov & Iván Fernández‐Val & Blaise Melly, 2013. "Inference on Counterfactual Distributions," Econometrica, Econometric Society, vol. 81(6), pages 2205-2268, November.
    12. Goller, Daniel & Harrer, Tamara & Lechner, Michael & Wolff, Joachim, 2021. "Active labour market policies for the long-term unemployed: New evidence from causal machine learning," Economics Working Paper Series 2108, University of St. Gallen, School of Economics and Political Science.
    13. Miller, Steve, 2020. "Causal forest estimation of heterogeneous and time-varying environmental policy effects," Journal of Environmental Economics and Management, Elsevier, vol. 103(C).
    14. Sookyo Jeong & Hongseok Namkoong, 2020. "Robust Causal Inference Under Covariate Shift via Worst-Case Subpopulation Treatment Effects," Papers 2007.02411, arXiv.org, revised Jul 2020.
    15. Michael Knaus & Michael Lechner & Anthony Strittmatter, 2017. "Heterogeneous Employment Effects of Job Search Programmes: A Machine Learning Approach," Papers 1709.10279, arXiv.org, revised May 2018.
    16. Anna Baiardi & Andrea A. Naghi, 2021. "The Value Added of Machine Learning to Causal Inference: Evidence from Revisited Studies," Papers 2101.00878, arXiv.org.
    17. Anna Baiardi & Andrea A. Naghi, 2021. "The Value Added of Machine Learning to Causal Inference: Evidence from Revisited Studies," Tinbergen Institute Discussion Papers 21-001/V, Tinbergen Institute.
    18. Martin Huber, 2019. "An introduction to flexible methods for policy evaluation," Papers 1910.00641, arXiv.org.
    19. Davide Viviano & Jelena Bradic, 2019. "Synthetic learner: model-free inference on treatments over time," Papers 1904.01490, arXiv.org.
    20. Pablo Lavado & Gonzalo Rivera, 2016. "Identifying Treatment Effects with Data Combination and Unobserved Heterogeneity," Working Papers 2016-79, Peruvian Economic Association.

    More about this item

    JEL classification:

    • H75 - Public Economics - - State and Local Government; Intergovernmental Relations - - - State and Local Government: Health, Education, and Welfare
    • I38 - Health, Education, and Welfare - - Welfare, Well-Being, and Poverty - - - Government Programs; Provision and Effects of Welfare Programs
    • J22 - Labor and Demographic Economics - - Demand and Supply of Labor - - - Time Allocation and Labor Supply
    • J31 - Labor and Demographic Economics - - Wages, Compensation, and Labor Costs - - - Wage Level and Structure; Wage Differentials
    • C21 - Mathematical and Quantitative Methods - - Single Equation Models; Single Variables - - - Cross-Sectional Models; Spatial Models; Treatment Effect Models


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:arx:papers:1812.06533. See general information about how to correct material in RePEc.


    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: arXiv administrators (email available below). General contact details of provider: http://arxiv.org/.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis. RePEc uses bibliographic data supplied by the respective publishers.