
What Is the Value Added by Using Causal Machine Learning Methods in a Welfare Experiment Evaluation?

Author

  • Strittmatter, Anthony

Abstract

Recent studies have proposed causal machine learning (CML) methods to estimate conditional average treatment effects (CATEs). In this study, I investigate whether CML methods add value over conventional CATE estimators by re-evaluating Connecticut's Jobs First welfare experiment, which combines positive and negative work incentives. Previous studies show that the effect heterogeneity of Jobs First is difficult to capture with CATEs. I report evidence that CML methods can support the theoretical labor supply predictions. Furthermore, I document reasons why some conventional CATE estimators fail and discuss the limitations of CML methods.
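The paper's CML estimators build on causal forests and double/debiased machine learning (see the Wager and Athey, 2018, and Chernozhukov et al., 2018, references below). As a rough illustration of the general idea of estimating CATEs with machine learning, and not the paper's actual procedure, the following sketch fits a simple T-learner with random forests to simulated experimental data; all variable names and the data-generating process are hypothetical.

```python
# Minimal T-learner sketch for conditional average treatment effects (CATEs).
# Illustrative only: the data-generating process and variable names are made up
# and are not taken from the paper or the Jobs First data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Simulated experiment: X are pre-treatment covariates, D is random assignment,
# and the true treatment effect varies with the first covariate.
n, p = 4000, 5
X = rng.normal(size=(n, p))
D = rng.integers(0, 2, size=n)                  # randomized treatment indicator
tau = 1.0 + 0.5 * X[:, 0]                       # heterogeneous true effect
Y = X[:, 1] + D * tau + rng.normal(size=n)      # observed outcome

# T-learner: fit separate outcome models for treated and control units,
# then take the difference of the two predictions as the CATE estimate.
model_treated = RandomForestRegressor(n_estimators=500, min_samples_leaf=20, random_state=0)
model_control = RandomForestRegressor(n_estimators=500, min_samples_leaf=20, random_state=0)
model_treated.fit(X[D == 1], Y[D == 1])
model_control.fit(X[D == 0], Y[D == 0])

cate_hat = model_treated.predict(X) - model_control.predict(X)

print("mean CATE estimate:", round(cate_hat.mean(), 2))
print("correlation with true effect:", round(np.corrcoef(cate_hat, tau)[0, 1], 2))
```

The T-learner is one of the conventional CATE estimators the paper compares against; the causal-forest and double machine learning approaches cited in the references replace the naive difference of fitted outcome models with orthogonalized, honestly estimated effect functions.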

Suggested Citation

  • Strittmatter, Anthony, 2019. "What Is the Value Added by Using Causal Machine Learning Methods in a Welfare Experiment Evaluation?," GLO Discussion Paper Series 336, Global Labor Organization (GLO).
  • Handle: RePEc:zbw:glodps:336

    Download full text from publisher

    File URL: https://www.econstor.eu/bitstream/10419/194352/1/GLO-DP-0336.pdf
    Download Restriction: no


    References listed on IDEAS

    1. Sokbae Lee & Ryo Okui & Yoon-Jae Whang, 2017. "Doubly robust uniform confidence band for the conditional average treatment effect function," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 32(7), pages 1207-1225, November.
    2. Patrick Kline & Melissa Tartari, 2016. "Bounding the Labor Supply Responses to a Randomized Welfare Experiment: A Revealed Preference Approach," American Economic Review, American Economic Association, vol. 106(4), pages 972-1014, April.
    3. Sergio Firpo, 2007. "Efficient Semiparametric Estimation of Quantile Treatment Effects," Econometrica, Econometric Society, vol. 75(1), pages 259-276, January.
    4. Victor Chernozhukov & Denis Chetverikov & Mert Demirer & Esther Duflo & Christian Hansen & Whitney Newey & James Robins, 2018. "Double/debiased machine learning for treatment and structural parameters," Econometrics Journal, Royal Economic Society, vol. 21(1), pages 1-68, February.
    5. Garry F. Barrett & Stephen G. Donald, 2003. "Consistent Tests for Stochastic Dominance," Econometrica, Econometric Society, vol. 71(1), pages 71-104, January.
    6. Victor Chernozhukov & Mert Demirer & Esther Duflo & Ivan Fernandez-Val, 2017. "Generic machine learning inference on heterogenous treatment effects in randomized experiments," CeMMAP working papers CWP61/17, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
    7. James J. Heckman & Jeffrey Smith & Nancy Clements, 1997. "Making the Most Out of Programme Evaluations and Social Experiments: Accounting for Heterogeneity in Programme Impacts," Review of Economic Studies, Oxford University Press, vol. 64(4), pages 487-535.
    8. Shuai Chen & Lu Tian & Tianxi Cai & Menggang Yu, 2017. "A general statistical framework for subgroup identification and comparative treatment scoring," Biometrics, The International Biometric Society, vol. 73(4), pages 1199-1209, December.
    9. Victor Chernozhukov & Iván Fernández‐Val & Ye Luo, 2018. "The Sorted Effects Method: Discovering Heterogeneous Effects Beyond Their Averages," Econometrica, Econometric Society, vol. 86(6), pages 1911-1938, November.
    10. Stefan Wager & Susan Athey, 2018. "Estimation and Inference of Heterogeneous Treatment Effects using Random Forests," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 113(523), pages 1228-1242, July.
    11. Michael Lechner & Anthony Strittmatter, 2019. "Practical procedures to deal with common support problems in matching estimation," Econometric Reviews, Taylor & Francis Journals, vol. 38(2), pages 193-207, February.
    12. Xinkun Nie & Stefan Wager, 2017. "Quasi-Oracle Estimation of Heterogeneous Treatment Effects," Papers 1712.04912, arXiv.org, revised Feb 2019.
    13. Alberto Abadie & Matthew M. Chingos & Martin R. West, 2018. "Endogenous Stratification in Randomized Experiments," The Review of Economics and Statistics, MIT Press, vol. 100(4), pages 567-580, October.
    14. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    15. Michael Knaus & Michael Lechner & Anthony Strittmatter, 2017. "Heterogeneous Employment Effects of Job Search Programmes: A Machine Learning Approach," Papers 1709.10279, arXiv.org, revised May 2018.
    16. Jonathan M.V. Davis & Sara B. Heller, 2017. "Using Causal Forests to Predict Treatment Heterogeneity: An Application to Summer Jobs," American Economic Review, American Economic Association, vol. 107(5), pages 546-550, May.
    17. Matt Taddy & Matt Gardner & Liyun Chen & David Draper, 2016. "A Nonparametric Bayesian Analysis of Heterogenous Treatment Effects in Digital Experimentation," Journal of Business & Economic Statistics, Taylor & Francis Journals, vol. 34(4), pages 661-672, October.

    Citations

    Cited by:

    1. Lechner, Michael, 2018. "Modified Causal Forests for Estimating Heterogeneous Causal Effects," IZA Discussion Papers 12040, Institute of Labor Economics (IZA).
    2. Knaus, Michael C. & Lechner, Michael & Strittmatter, Anthony, 2018. "Machine Learning Estimation of Heterogeneous Causal Effects: Empirical Monte Carlo Evidence," IZA Discussion Papers 12039, Institute of Labor Economics (IZA).

    More about this item

    Keywords

Labor supply; individualized treatment effects; conditional average treatment effects; random forest

    JEL classification:

    • H75 - Public Economics - - State and Local Government; Intergovernmental Relations - - - State and Local Government: Health, Education, and Welfare
    • I38 - Health, Education, and Welfare - - Welfare, Well-Being, and Poverty - - - Government Programs; Provision and Effects of Welfare Programs
    • J22 - Labor and Demographic Economics - - Demand and Supply of Labor - - - Time Allocation and Labor Supply
    • J31 - Labor and Demographic Economics - - Wages, Compensation, and Labor Costs - - - Wage Level and Structure; Wage Differentials
    • C21 - Mathematical and Quantitative Methods - - Single Equation Models; Single Variables - - - Cross-Sectional Models; Spatial Models; Treatment Effect Models
