Printed from https://ideas.repec.org/p/hit/hitcei/2021-05.html

Algorithm is Experiment: Machine Learning, Market Design, and Policy Eligibility Rules

Authors

  • Narita, Yusuke
  • Yata, Kohei

Abstract

Algorithms produce a growing portion of decisions and recommendations in both policy and business. Such algorithmic decisions are natural experiments (conditionally quasi-randomly assigned instruments), since the algorithms make decisions based only on observable input variables. We use this observation to develop a treatment-effect estimator for a class of stochastic and deterministic decision-making algorithms. Our estimator is shown to be consistent and asymptotically normal for well-defined causal effects. A key special case of our estimator is a multidimensional regression discontinuity design. We apply our estimator to evaluate the effect of the Coronavirus Aid, Relief, and Economic Security (CARES) Act, in which hundreds of billions of dollars' worth of relief funding was allocated to hospitals via an algorithmic rule. Our estimates suggest that the relief funding had little effect on COVID-19-related hospital activity levels. Naive OLS and IV estimates exhibit substantial selection bias.
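The abstract's identification idea can be illustrated with a toy simulation. All names and numbers below are hypothetical; this is a stylized sketch of the instrument logic only, not the authors' estimator, which handles the general case (including outcomes that depend on the algorithm's inputs) by comparing observations near the decision boundary:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20_000

# Observable inputs fed to the algorithmic eligibility rule.
x = rng.uniform(-1, 1, size=(n, 2))

# Deterministic rule: recommend funding when a known score of the inputs
# crosses a cutoff (a stylized stand-in for an eligibility formula).
# Because the rule uses only observables, the recommendation z is
# independent of any unobserved confounder.
score = 0.7 * x[:, 0] + 0.3 * x[:, 1]
z = (score > 0.0).astype(float)

# Unobserved confounder u drives both take-up and the outcome;
# compliance is imperfect (eligibility shifts take-up, never forces it).
u = rng.normal(size=n)
d = ((u + z) > 0.5).astype(float)        # actual treatment take-up
y = 2.0 * d + u + rng.normal(size=n)     # true treatment effect = 2.0

# Naive treated-vs-untreated comparison is confounded by u.
naive = y[d == 1].mean() - y[d == 0].mean()

# Wald / IV estimate using the algorithmic recommendation as instrument.
late = (y[z == 1].mean() - y[z == 0].mean()) / \
       (d[z == 1].mean() - d[z == 0].mean())

print(f"naive difference: {naive:.2f}, IV estimate: {late:.2f}")
```

Because z depends only on observables (and the outcome here does not depend on them directly), the Wald ratio recovers the complier effect near 2.0, while the naive comparison is badly biased upward, mirroring the selection bias the abstract reports for naive OLS and IV.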

Suggested Citation

  • Narita, Yusuke & Yata, Kohei, 2022. "Algorithm is Experiment: Machine Learning, Market Design, and Policy Eligibility Rules," CEI Working Paper Series 2021-05, Center for Economic Institutions, Institute of Economic Research, Hitotsubashi University.
  • Handle: RePEc:hit:hitcei:2021-05
    Note: December 7, 2021

    Download full text from publisher

    File URL: https://hermes-ir.lib.hit-u.ac.jp/hermes/ir/re/72545/wp2021-05.pdf
    Download Restriction: no

    References listed on IDEAS

    1. Timothy B. Armstrong & Michal Kolesár, 2018. "Optimal Inference in a Class of Regression Models," Econometrica, Econometric Society, vol. 86(2), pages 655-683, March.
    2. Markus Frölich & Martin Huber, 2019. "Including Covariates in the Regression Discontinuity Design," Journal of Business & Economic Statistics, Taylor & Francis Journals, vol. 37(4), pages 736-748, October.
    3. Janet Currie & Jonathan Gruber, 1996. "Health Insurance Eligibility, Utilization of Medical Care, and Child Health," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 111(2), pages 431-466.
    4. Sebastian Calonico & Matias D. Cattaneo & Rocio Titiunik, 2014. "Robust Nonparametric Confidence Intervals for Regression‐Discontinuity Designs," Econometrica, Econometric Society, vol. 82, pages 2295-2326, November.
    5. Magdalena K Sobol & Sarah A Finkelstein, 2018. "Predictive pollen-based biome modeling using machine learning," PLOS ONE, Public Library of Science, vol. 13(8), pages 1-29, August.
    6. Atila Abdulkadiroğlu & Joshua D. Angrist & Yusuke Narita & Parag A. Pathak, 2017. "Research Design Meets Market Design: Using Centralized Assignment for Impact Evaluation," Econometrica, Econometric Society, vol. 85, pages 1373-1432, September.
    7. Yingying Dong, 2018. "Alternative Assumptions to Identify LATE in Fuzzy Regression Discontinuity Designs," Oxford Bulletin of Economics and Statistics, Department of Economics, University of Oxford, vol. 80(5), pages 1020-1027, October.
    8. Yusuke Narita & Shota Yasui & Kohei Yata, 2018. "Efficient Counterfactual Learning from Bandit Feedback," Cowles Foundation Discussion Papers 2155, Cowles Foundation for Research in Economics, Yale University.
    9. Neale Mahoney, 2015. "Bankruptcy as Implicit Health Insurance," American Economic Review, American Economic Association, vol. 105(2), pages 710-746, February.
    10. Borusyak, Kirill & Hull, Peter, 2020. "Non-Random Exposure to Exogenous Shocks: Theory and Applications," CEPR Discussion Papers 15319, C.E.P.R. Discussion Papers.
    11. David W Brown & Amanda E Kowalski & Ithai Z Lurie, 2020. "Long-Term Impacts of Childhood Medicaid Expansions on Outcomes in Adulthood," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 87(2), pages 792-821.
    12. Brigham R. Frandsen, 2017. "Party Bias in Union Representation Elections: Testing for Manipulation in the Regression Discontinuity Design when the Running Variable is Discrete," Advances in Econometrics, in: Regression Discontinuity Designs, volume 38, pages 281-315, Emerald Group Publishing Limited.
    13. Imbens, Guido W & Angrist, Joshua D, 1994. "Identification and Estimation of Local Average Treatment Effects," Econometrica, Econometric Society, vol. 62(2), pages 467-475, March.
    14. Abadie, Alberto, 2003. "Semiparametric instrumental variable estimation of treatment response models," Journal of Econometrics, Elsevier, vol. 113(2), pages 231-263, April.
    15. Guido Imbens & Karthik Kalyanaraman, 2012. "Optimal Bandwidth Choice for the Regression Discontinuity Estimator," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 79(3), pages 933-959.
    16. John J. Horton, 2017. "The Effects of Algorithmic Labor Market Recommendations: Evidence from a Field Experiment," Journal of Labor Economics, University of Chicago Press, vol. 35(2), pages 345-385.
    17. Jasjeet S. Sekhon & Rocío Titiunik, 2017. "On Interpreting the Regression Discontinuity Design as a Local Experiment," Advances in Econometrics, in: Regression Discontinuity Designs, volume 38, pages 1-28, Emerald Group Publishing Limited.
    18. Mark G. Duggan, 2000. "Hospital Ownership and Public Medical Spending," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 115(4), pages 1343-1373.
    19. Yusuke Narita, 2020. "A Theory of Quasi-Experimental Evaluation of School Quality," Working Papers 2020-085, Human Capital and Economic Opportunity Working Group.
    20. Susan Athey & Guido W. Imbens, 2017. "The State of Applied Econometrics: Causality and Policy Evaluation," Journal of Economic Perspectives, American Economic Association, vol. 31(2), pages 3-32, Spring.
    21. Sendhil Mullainathan & Jann Spiess, 2017. "Machine Learning: An Applied Econometric Approach," Journal of Economic Perspectives, American Economic Association, vol. 31(2), pages 87-106, Spring.
    22. Sylvain Chassang & Kei Kawai & Jun Nakabayashi & Juan Ortner, 2022. "Robust Screens for Noncompetitive Bidding in Procurement Auctions," Econometrica, Econometric Society, vol. 90(1), pages 315-346, January.
    23. Papay, John P. & Willett, John B. & Murnane, Richard J., 2011. "Extending the regression-discontinuity approach to multiple assignment variables," Journal of Econometrics, Elsevier, vol. 161(2), pages 203-207, April.
    24. A. Belloni & V. Chernozhukov & I. Fernández‐Val & C. Hansen, 2017. "Program Evaluation and Causal Inference With High‐Dimensional Data," Econometrica, Econometric Society, vol. 85, pages 233-298, January.
    25. Burt S. Barnow & Matias D. Cattaneo & Rocío Titiunik & Gonzalo Vazquez‐Bare, 2017. "Comparing Inference Approaches for RD Designs: A Reexamination of the Effect of Head Start on Child Mortality," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 36(3), pages 643-681, June.
    26. Hahn, Jinyong & Todd, Petra & Van der Klaauw, Wilbert, 2001. "Identification and Estimation of Treatment Effects with a Regression-Discontinuity Design," Econometrica, Econometric Society, vol. 69(1), pages 201-209, January.
    27. M. Kate Bundorf & Maria Polyakova & Ming Tai-Seale, 2019. "How do Humans Interact with Algorithms? Experimental Evidence from Health Insurance," NBER Working Papers 25976, National Bureau of Economic Research, Inc.
    28. Peter Cohen & Robert Hahn & Jonathan Hall & Steven Levitt & Robert Metcalfe, 2016. "Using Big Data to Estimate Consumer Surplus: The Case of Uber," NBER Working Papers 22627, National Bureau of Economic Research, Inc.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Yusuke Narita & Kyohei Okumura & Akihiro Shimizu & Kohei Yata, 2022. "Counterfactual Learning with General Data-generating Policies," Papers 2212.01925, arXiv.org.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Narita, Yusuke & Yata, Kohei, 2022. "Algorithm is Experiment: Machine Learning, Market Design, and Policy Eligibility Rules," Discussion Paper Series 730, Institute of Economic Research, Hitotsubashi University.
    2. Yusuke Narita & Kohei Yata, 2021. "Algorithm is Experiment: Machine Learning, Market Design, and Policy Eligibility Rules," Working Papers 2021-022, Human Capital and Economic Opportunity Working Group.
    3. Atı̇la Abdulkadı̇roğlu & Joshua D. Angrist & Yusuke Narita & Parag Pathak, 2022. "Breaking Ties: Regression Discontinuity Design Meets Market Design," Econometrica, Econometric Society, vol. 90(1), pages 117-151, January.
    4. Joshua D. Angrist, 2022. "Empirical Strategies in Economics: Illuminating the Path From Cause to Effect," Econometrica, Econometric Society, vol. 90(6), pages 2509-2539, November.
    5. Crespo Cristian, 2020. "Beyond Manipulation: Administrative Sorting in Regression Discontinuity Designs," Journal of Causal Inference, De Gruyter, vol. 8(1), pages 164-181, January.
    6. Huber, Martin, 2019. "An introduction to flexible methods for policy evaluation," FSES Working Papers 504, Faculty of Economics and Social Sciences, University of Freiburg/Fribourg Switzerland.
    7. Matias D. Cattaneo & Rocío Titiunik, 2022. "Regression Discontinuity Designs," Annual Review of Economics, Annual Reviews, vol. 14(1), pages 821-851, August.
    8. Guido Imbens & Stefan Wager, 2019. "Optimized Regression Discontinuity Designs," The Review of Economics and Statistics, MIT Press, vol. 101(2), pages 264-278, May.
    9. Sebastian Calonico & Matias D Cattaneo & Max H Farrell, 2020. "Optimal bandwidth choice for robust bias-corrected inference in regression discontinuity designs [Econometric methods for program evaluation]," The Econometrics Journal, Royal Economic Society, vol. 23(2), pages 192-210.
    10. Peter Hull & Michal Kolesár & Christopher Walters, 2022. "Labor by design: contributions of David Card, Joshua Angrist, and Guido Imbens," Scandinavian Journal of Economics, Wiley Blackwell, vol. 124(3), pages 603-645, July.
    11. Tohari, Achmad & Parsons, Christopher & Rammohan, Anu, 2017. "Does Information Empower the Poor? Evidence from Indonesia's Social Security Card," IZA Discussion Papers 11137, Institute of Labor Economics (IZA).
    12. Blaise Melly & Rafael Lalive, 2020. "Estimation, Inference, and Interpretation in the Regression Discontinuity Design," Diskussionsschriften dp2016, Universitaet Bern, Departement Volkswirtschaft.
    13. Yoichi Arai & Yu‐Chin Hsu & Toru Kitagawa & Ismael Mourifié & Yuanyuan Wan, 2022. "Testing identifying assumptions in fuzzy regression discontinuity designs," Quantitative Economics, Econometric Society, vol. 13(1), pages 1-28, January.
    14. Myung Hwan Seo & Yoichi Arai & Taisuke Otsu, 2021. "Regression Discontinuity Design with Potentially Many Covariates," Working Paper Series no142, Institute of Economic Research, Seoul National University.
    15. Jin-young Choi & Myoung-jae Lee, 2017. "Regression discontinuity: review with extensions," Statistical Papers, Springer, vol. 58(4), pages 1217-1246, December.
    16. Yingying Dong & Michal Kolesár, 2023. "When can we ignore measurement error in the running variable?," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 38(5), pages 735-750, August.
    17. Matias D. Cattaneo & Luke Keele & Rocio Titiunik, 2021. "Covariate Adjustment in Regression Discontinuity Designs," Papers 2110.08410, arXiv.org, revised Aug 2022.
    18. Yiqi Liu & Yuan Qi, 2023. "Using Forests in Multivariate Regression Discontinuity Designs," Papers 2303.11721, arXiv.org.
    19. Onda, Masayuki & Seyler, Edward, 2020. "English learners reclassification and academic achievement: Evidence from Minnesota," Economics of Education Review, Elsevier, vol. 79(C).

    More about this item

    NEP fields

    This paper has been announced in the following NEP Reports:


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:hit:hitcei:2021-05. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Reiko Suzuki (email available below). General contact details of provider: https://edirc.repec.org/data/cehitjp.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.