
Randomization bias in field trials to evaluate targeting methods

Author

Listed:
  • Potash, Eric

Abstract

This paper studies the evaluation of methods for targeting the allocation of limited resources to a high-risk subpopulation. We consider a randomized controlled trial to measure the difference in efficiency between two targeting methods and show that it is biased. An alternative, survey-based design is shown to be unbiased. Both designs are simulated for the evaluation of a policy to target lead hazard investigations using a predictive model. Based on our findings, we advised the Chicago Department of Public Health to use the survey design for their field trial. Our work anticipates further developments in economics that will be important as predictive modeling becomes an increasingly common policy tool.
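
The abstract describes the two evaluation designs only at a high level. The Python sketch below illustrates, under assumed mechanics, one way a head-to-head trial of targeting methods can estimate a different quantity than the one relevant for deployment; it is an illustration, not a reconstruction of the paper's model, and every distribution, parameter, and mechanism in it is hypothetical. The assumptions: two methods rank the same population with noisy risk scores, a method's efficiency is the mean true risk among the units it targets, each method has a fixed targeting capacity, and in the trial each method selects from only its randomly assigned half of the population, whereas in deployment (and in a survey-style evaluation that scores everyone and measures outcomes directly) it selects from the full population.

```python
# Hypothetical illustration of randomization bias in a targeting-method trial.
# None of the numbers or mechanics below come from the paper.
import numpy as np

rng = np.random.default_rng(0)
N = 10_000        # population size (hypothetical)
K = 500           # investigations each method can carry out (hypothetical)
SPLITS = 200      # Monte Carlo repetitions of the random split

true_risk = rng.beta(2, 8, size=N)             # latent hazard probability per unit
score_a = true_risk + rng.normal(0, 0.05, N)   # method A: accurate predictive score
score_b = true_risk + rng.normal(0, 0.15, N)   # method B: noisier scoring rule

def efficiency(scores, risk, k):
    """Mean true risk among the k highest-scored units (the 'hit rate')."""
    targeted = np.argsort(scores)[-k:]
    return risk[targeted].mean()

# Deployment comparison, which a survey-style design can estimate directly:
# each method ranks the full population and targets its top K units.
deploy_diff = efficiency(score_a, true_risk, K) - efficiency(score_b, true_risk, K)

# Trial comparison: the population is randomly split and each method targets
# its top K units within its assigned half only.
trial_diffs = []
for _ in range(SPLITS):
    perm = rng.permutation(N)
    half_a, half_b = perm[: N // 2], perm[N // 2 :]
    eff_a = efficiency(score_a[half_a], true_risk[half_a], K)
    eff_b = efficiency(score_b[half_b], true_risk[half_b], K)
    trial_diffs.append(eff_a - eff_b)

print(f"deployment difference in efficiency (A - B): {deploy_diff:.4f}")
print(f"average trial difference over {SPLITS} splits: {np.mean(trial_diffs):.4f}")
```

In this toy setup the gap between the two printed numbers, which persists when averaged over many random splits, is a randomization bias: the trial evaluates each method on a halved pool and therefore at a different effective selection depth than deployment. Whether this particular mechanism matches the one formalized in the paper cannot be determined from the abstract alone.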

Suggested Citation

  • Potash, Eric, 2018. "Randomization bias in field trials to evaluate targeting methods," Economics Letters, Elsevier, vol. 167(C), pages 131-135.
  • Handle: RePEc:eee:ecolet:v:167:y:2018:i:c:p:131-135
    DOI: 10.1016/j.econlet.2018.03.012

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0165176518301034
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.econlet.2018.03.012?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a version of this item that you can access through your library subscription

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Jon Kleinberg & Jens Ludwig & Sendhil Mullainathan & Ziad Obermeyer, 2015. "Prediction Policy Problems," American Economic Review, American Economic Association, vol. 105(5), pages 491-495, May.
    2. James J. Heckman & Jeffrey A. Smith, 1995. "Assessing the Case for Social Experiments," Journal of Economic Perspectives, American Economic Association, vol. 9(2), pages 85-110, Spring.
    3. Sianesi, Barbara, 2017. "Evidence of randomisation bias in a large-scale social experiment: The case of ERA," Journal of Econometrics, Elsevier, vol. 198(1), pages 41-64.
    4. Deaton, Angus & Cartwright, Nancy, 2018. "Understanding and misunderstanding randomized controlled trials," Social Science & Medicine, Elsevier, vol. 210(C), pages 2-21.
    5. Angus Deaton & Nancy Cartwright, 2016. "Understanding and Misunderstanding Randomized Controlled Trials," Working Papers august_25.pdf, Princeton University, Woodrow Wilson School of Public and International Affairs, Research Program in Development Studies.
    6. Dana Chandler & Steven D. Levitt & John A. List, 2011. "Predicting and Preventing Shootings among At-Risk Youth," American Economic Review, American Economic Association, vol. 101(3), pages 288-292, May.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Donald Moynihan, 2018. "A great schism approaching? Towards a micro and macro public administration," Journal of Behavioral Public Administration, Center for Experimental and Behavioral Public Administration, vol. 1(1).
    2. Alexander Ruder, 2019. "What Works at Scale? A Framework to Scale Up Workforce Development Programs," FRB Atlanta Community and Economic Development Discussion Paper 2019-1, Federal Reserve Bank of Atlanta.
    3. Margaret Dalziel, 2018. "Why are there (almost) no randomised controlled trial-based evaluations of business support programmes?," Palgrave Communications, Palgrave Macmillan, vol. 4(1), pages 1-9, December.
    4. Christopher J. Ruhm, 2019. "Shackling the Identification Police?," Southern Economic Journal, John Wiley & Sons, vol. 85(4), pages 1016-1026, April.
    5. Vellore Arthi & James Fenske, 2018. "Polygamy and child mortality: Historical and modern evidence from Nigeria’s Igbo," Review of Economics of the Household, Springer, vol. 16(1), pages 97-141, March.
    6. Ravallion, Martin, 2020. "Highly prized experiments," World Development, Elsevier, vol. 127(C).
    7. Andreas C Drichoutis & Rodolfo M Nayga, 2020. "Economic Rationality under Cognitive Load," The Economic Journal, Royal Economic Society, vol. 130(632), pages 2382-2409.
    8. Cristina Corduneanu-Huci & Michael T. Dorsch & Paul Maarek, 2017. "Learning to constrain: Political competition and randomized controlled trials in development," THEMA Working Papers 2017-24, THEMA (THéorie Economique, Modélisation et Applications), Université de Cergy-Pontoise.
    9. Jörg Peters & Jörg Langbein & Gareth Roberts, 2018. "Generalization in the Tropics – Development Policy, Randomized Controlled Trials, and External Validity," The World Bank Research Observer, World Bank, vol. 33(1), pages 34-64.
    10. Maurizio Canavari & Andreas C. Drichoutis & Jayson L. Lusk & Rodolfo M. Nayga, Jr., 2018. "How to run an experimental auction: A review of recent advances," Working Papers 2018-5, Agricultural University of Athens, Department Of Agricultural Economics.
    11. Aufenanger, Tobias, 2018. "Treatment allocation for linear models," FAU Discussion Papers in Economics 14/2017, Friedrich-Alexander University Erlangen-Nuremberg, Institute for Economics, revised 2018.
    12. Vicky Chemutai & Hubert Escaith, 2017. "Measuring World Trade Organization (WTO) Accession Commitments and their Economic Effects," Journal of International Commerce, Economics and Policy (JICEP), World Scientific Publishing Co. Pte. Ltd., vol. 8(02), pages 1-27, June.
    13. Ashkan Pakseresht & Brandon R McFadden & Carl Johan Lagerkvist, 2017. "Consumer acceptance of food biotechnology based on policy context and upstream acceptance: evidence from an artefactual field experiment," European Review of Agricultural Economics, Oxford University Press and the European Agricultural and Applied Economics Publications Foundation, vol. 44(5), pages 757-780.
    14. Sutherland, Alex & Ariel, Barak & Farrar, William & De Anda, Randy, 2017. "Post-experimental follow-ups—Fade-out versus persistence effects: The Rialto police body-worn camera experiment four years on," Journal of Criminal Justice, Elsevier, vol. 53(C), pages 110-116.
    15. Hanushek, Eric A., 2021. "Addressing cross-national generalizability in educational impact evaluation," International Journal of Educational Development, Elsevier, vol. 80(C).
    16. Robin Maialeh, 2019. "Generalization of results and neoclassical rationality: unresolved controversies of behavioural economics methodology," Quality & Quantity: International Journal of Methodology, Springer, vol. 53(4), pages 1743-1761, July.
    17. Huber, Martin & Steinmayr, Andreas, 2017. "A framework for separating individual treatment effects from spillover, interaction, and general equilibrium effects," FSES Working Papers 481, Faculty of Economics and Social Sciences, University of Freiburg/Fribourg Switzerland.
    18. Gustavo Canavire-Bacarreza & Luis Castro Peñarrieta & Darwin Ugarte Ontiveros, 2021. "Outliers in Semi-Parametric Estimation of Treatment Effects," Econometrics, MDPI, vol. 9(2), pages 1-32, April.
    19. Ashkan Pakseresht & Anna Kristina Edenbrandt & Carl Johan Lagerkvist, 2021. "Genetically modified food and consumer risk responsibility: The effect of regulatory design and risk type on cognitive information processing," PLOS ONE, Public Library of Science, vol. 16(6), pages 1-21, June.
    20. Florent Bédécarrats & Isabelle Guérin & François Roubaud, 2019. "All that Glitters is not Gold. The Political Economy of Randomized Evaluations in Development," Development and Change, International Institute of Social Studies, vol. 50(3), pages 735-762, May.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:ecolet:v:167:y:2018:i:c:p:131-135. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link it to an item in RePEc, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/ecolet.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.