
Randomization bias in field trials to evaluate targeting methods

Author

  • Potash, Eric

Abstract

This paper studies the evaluation of methods for targeting the allocation of limited resources to a high-risk subpopulation. We consider a randomized controlled trial to measure the difference in efficiency between two targeting methods and show that it is biased. An alternative, survey-based design is shown to be unbiased. Both designs are simulated for the evaluation of a policy to target lead hazard investigations using a predictive model. Based on our findings, we advised the Chicago Department of Public Health to use the survey design for their field trial. Our work anticipates further developments in economics that will be important as predictive modeling becomes an increasingly common policy tool.
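The abstract's core claim (that a trial's estimate of the efficiency difference between two targeting methods need not equal the difference at full deployment) can be made concrete with a toy Monte Carlo. The sketch below is my own stylized illustration, not the paper's model: it assumes "efficiency" is the hit rate among inspected units, that one method ranks units by a noisy predictive risk score while the comparison method inspects at random, and that the RCT splits both the population and the inspection budget evenly between arms.

```python
import random

random.seed(0)

N = 1000      # population size (e.g., homes eligible for inspection)
BUDGET = 100  # total number of inspections
SIGMA = 1.0   # noise in the predictive model's risk score
TRIALS = 500  # Monte Carlo replications


def population():
    """Draw a population of (true hazard probability, noisy model score) pairs."""
    units = []
    for _ in range(N):
        risk = random.random()
        score = risk + random.gauss(0.0, SIGMA)
        units.append((risk, score))
    return units


def hit_rate(selected):
    """Expected fraction of inspected units that have a hazard."""
    return sum(risk for risk, _ in selected) / len(selected)


def full_deployment_diff(units):
    """Estimand: each method targets the FULL population with the full budget."""
    by_score = sorted(units, key=lambda u: u[1], reverse=True)
    model = hit_rate(by_score[:BUDGET])
    baseline = hit_rate(random.sample(units, BUDGET))
    return model - baseline


def rct_diff(units):
    """RCT design: units are randomized between two arms, and each method
    targets only within its own half-sized arm, with half the budget."""
    shuffled = units[:]
    random.shuffle(shuffled)
    arm_a, arm_b = shuffled[:N // 2], shuffled[N // 2:]
    by_score = sorted(arm_a, key=lambda u: u[1], reverse=True)
    model = hit_rate(by_score[:BUDGET // 2])
    baseline = hit_rate(random.sample(arm_b, BUDGET // 2))
    return model - baseline


estimand = sum(full_deployment_diff(population()) for _ in range(TRIALS)) / TRIALS
rct = sum(rct_diff(population()) for _ in range(TRIALS)) / TRIALS
print(f"full-deployment efficiency difference: {estimand:.3f}")
print(f"RCT estimate of the difference:        {rct:.3f}")
```

Because each method targets a half-sized pool under the trial, its selected set differs from what it would select at full deployment, so the two printed quantities can be compared directly in this stylized setup. Potash (2018) characterizes conditions under which the trial's estimator is biased for the deployment-scale difference and proposes a survey-based design that is not.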

Suggested Citation

  • Potash, Eric, 2018. "Randomization bias in field trials to evaluate targeting methods," Economics Letters, Elsevier, vol. 167(C), pages 131-135.
  • Handle: RePEc:eee:ecolet:v:167:y:2018:i:c:p:131-135
    DOI: 10.1016/j.econlet.2018.03.012

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0165176518301034
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.econlet.2018.03.012?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can access this item via your library subscription

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Jon Kleinberg & Jens Ludwig & Sendhil Mullainathan & Ziad Obermeyer, 2015. "Prediction Policy Problems," American Economic Review, American Economic Association, vol. 105(5), pages 491-495, May.
    2. James J. Heckman & Jeffrey A. Smith, 1995. "Assessing the Case for Social Experiments," Journal of Economic Perspectives, American Economic Association, vol. 9(2), pages 85-110, Spring.
    3. Sianesi, Barbara, 2017. "Evidence of randomisation bias in a large-scale social experiment: The case of ERA," Journal of Econometrics, Elsevier, vol. 198(1), pages 41-64.
    4. Angus Deaton & Nancy Cartwright, 2016. "Understanding and Misunderstanding Randomized Controlled Trials," Working Papers august_25.pdf, Princeton University, Woodrow Wilson School of Public and International Affairs, Research Program in Development Studies.
    5. Dana Chandler & Steven D. Levitt & John A. List, 2011. "Predicting and Preventing Shootings among At-Risk Youth," American Economic Review, American Economic Association, vol. 101(3), pages 288-292, May.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Michael Allan Ribers & Hannes Ullrich, 2024. "Complementarities between algorithmic and human decision-making: The case of antibiotic prescribing," Quantitative Marketing and Economics (QME), Springer, vol. 22(4), pages 445-483, December.
    2. Battiston, Pietro & Gamba, Simona & Santoro, Alessandro, 2024. "Machine learning and the optimization of prediction-based policies," Technological Forecasting and Social Change, Elsevier, vol. 199(C).
    3. Donald Moynihan, 2018. "A great schism approaching? Towards a micro and macro public administration," Journal of Behavioral Public Administration, Center for Experimental and Behavioral Public Administration, vol. 1(1).
    4. Alexander Ruder, 2019. "What Works at Scale? A Framework to Scale Up Workforce Development Programs," FRB Atlanta Community and Economic Development Discussion Paper 2019-1, Federal Reserve Bank of Atlanta.
    5. de Blasio, Guido & D'Ignazio, Alessio & Letta, Marco, 2022. "Gotham city. Predicting ‘corrupted’ municipalities with machine learning," Technological Forecasting and Social Change, Elsevier, vol. 184(C).
    6. Margaret Dalziel, 2018. "Why are there (almost) no randomised controlled trial-based evaluations of business support programmes?," Humanities and Social Sciences Communications, Palgrave Macmillan, vol. 4(1), pages 1-9, December.
    7. Monica Andini & Emanuele Ciani & Guido de Blasio & Alessio D'Ignazio & Viola Salvestrini, 2017. "Targeting policy-compliers with machine learning: an application to a tax rebate programme in Italy," Temi di discussione (Economic working papers) 1158, Bank of Italy, Economic Research and International Relations Area.
    8. Michael Allan Ribers & Hannes Ullrich, 2023. "Machine learning and physician prescribing: a path to reduced antibiotic use," Berlin School of Economics Discussion Papers 0019, Berlin School of Economics.
    9. Ginevra Buratti & Alessio D'Ignazio, 2024. "Improving the effectiveness of financial education programs. A targeting approach," Journal of Consumer Affairs, Wiley Blackwell, vol. 58(2), pages 451-485, June.
    10. Andini, Monica & Ciani, Emanuele & de Blasio, Guido & D'Ignazio, Alessio & Salvestrini, Viola, 2018. "Targeting with machine learning: An application to a tax rebate program in Italy," Journal of Economic Behavior & Organization, Elsevier, vol. 156(C), pages 86-102.
    11. Pietro Battiston & Simona Gamba & Alessandro Santoro, 2020. "Optimizing Tax Administration Policies with Machine Learning," Working Papers 436, University of Milano-Bicocca, Department of Economics, revised Mar 2020.
    12. Guido de Blasio & Alessio D'Ignazio & Marco Letta, 2020. "Predicting Corruption Crimes with Machine Learning. A Study for the Italian Municipalities," Working Papers 16/20, Sapienza University of Rome, DISS.
    13. Jeffrey A. Smith, 2018. "The usefulness of experiments," IZA World of Labor, Institute of Labor Economics (IZA), pages 436-436, May.
    14. Sophie-Charlotte Klose & Johannes Lederer, 2020. "A Pipeline for Variable Selection and False Discovery Rate Control With an Application in Labor Economics," Papers 2006.12296, arXiv.org, revised Jun 2020.
    15. Maria Cancian & Daniel R. Meyer & Robert G. Wood, 2022. "Do Carrots Work Better than Sticks? Results from the National Child Support Noncustodial Parent Employment Demonstration," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 41(2), pages 552-578, March.
    16. Baird, Matthew D. & Engberg, John & Gutierrez, Italo A., 2022. "RCT evidence on differential impact of US job training programmes by pre-training employment status," Labour Economics, Elsevier, vol. 75(C).
    17. Nicolaj N. Mühlbach, 2020. "Tree-based Synthetic Control Methods: Consequences of moving the US Embassy," CREATES Research Papers 2020-04, Department of Economics and Business Economics, Aarhus University.
    18. Sylvain Chassang & Erik Snowberg & Ben Seymour & Cayley Bowles, 2015. "Accounting for Behavior in Treatment Effects: New Applications for Blind Trials," PLOS ONE, Public Library of Science, vol. 10(6), pages 1-13, June.
    19. Teresa Molina Millán & Karen Macours, 2017. "Attrition in randomized control trials: Using tracking information to correct bias," FEUNL Working Paper Series novaf:wp1702, Universidade Nova de Lisboa, Faculdade de Economia.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:ecolet:v:167:y:2018:i:c:p:131-135. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu. General contact details of provider: http://www.elsevier.com/locate/ecolet.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.