
On minimizing the risk of bias in randomized controlled trials in economics

Authors

  • Eble, Alex
  • Boone, Peter
  • Elbourne, Diana

Abstract

Estimation of empirical relationships is prone to bias. Economists have carefully studied sources of bias in structural and quasi-experimental approaches, but the randomized controlled trial (RCT) has only begun to receive such scrutiny. This paper argues that several lessons from medicine, derived from the analysis of thousands of RCTs that establish a clear link between certain practices and biased estimates, can be used to reduce the risk of bias in economics RCTs. It identifies the subset of these lessons applicable to economics and uses them to assess the risk of bias in estimates from economics RCTs published between 2001 and 2011. Compared with medical studies, most of the economics studies examined do not report the study-design details necessary to assess risk of bias. Many report practices that suggest a risk of bias, though this does not necessarily mean that bias resulted. The paper concludes with suggestions on how to remedy these issues.

Suggested Citation

  • Eble, Alex & Boone, Peter & Elbourne, Diana, 2016. "On minimizing the risk of bias in randomized controlled trials in economics," Policy Research Working Paper Series 7746, The World Bank.
  • Handle: RePEc:wbk:wbrwps:7746

    Download full text from publisher

    File URL: http://documents.worldbank.org/curated/en/230271468862629679/pdf/WPS7746.pdf
    Download Restriction: no


    References listed on IDEAS

    1. Frijters, Paul & Kong, Tao Sherry & Liu, Elaine M., 2015. "Who is coming to the artefactual field experiment? Participation bias among Chinese rural migrants," Journal of Economic Behavior & Organization, Elsevier, vol. 114(C), pages 62-74.
    2. Joshua D. Angrist & Jörn-Steffen Pischke, 2010. "The Credibility Revolution in Empirical Economics: How Better Research Design Is Taking the Con out of Econometrics," Journal of Economic Perspectives, American Economic Association, vol. 24(2), pages 3-30, Spring.
    3. DiNardo, John & Lee, David S., 2011. "Program Evaluation and Research Designs," Handbook of Labor Economics, in: O. Ashenfelter & D. Card (ed.), Handbook of Labor Economics, edition 1, volume 4, chapter 5, pages 463-536, Elsevier.
    4. Duflo, Esther & Glennerster, Rachel & Kremer, Michael, 2008. "Using Randomization in Development Economics Research: A Toolkit," Handbook of Development Economics, in: T. Paul Schultz & John A. Strauss (ed.), Handbook of Development Economics, edition 1, volume 4, chapter 61, pages 3895-3962, Elsevier.
    5. Gordon H Guyatt & Edward J Mills & Diana Elbourne, 2008. "In the Era of Systematic Reviews, Does the Size of an Individual Trial Still Matter?," PLOS Medicine, Public Library of Science, vol. 5(1), pages 1-3, January.
    6. Jeffrey R Kling & Jeffrey B Liebman & Lawrence F Katz, 2007. "Experimental Analysis of Neighborhood Effects," Econometrica, Econometric Society, vol. 75(1), pages 83-119, January.
    7. Leonard, Kenneth & Masatu, Melkiory C., 2006. "Outpatient process quality evaluation and the Hawthorne Effect," Social Science & Medicine, Elsevier, vol. 63(9), pages 2330-2340, November.
    8. Miriam Bruhn & David McKenzie, 2009. "In Pursuit of Balance: Randomization in Practice in Development Field Experiments," American Economic Journal: Applied Economics, American Economic Association, vol. 1(4), pages 200-232, October.
    9. Emily Oster, 2013. "Unobservable Selection and Coefficient Stability: Theory and Validation," NBER Working Papers 19054, National Bureau of Economic Research, Inc.
    10. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    11. Ashenfelter, Orley & Harmon, Colm & Oosterbeek, Hessel, 1999. "A review of estimates of the schooling/earnings relationship, with tests for publication bias," Labour Economics, Elsevier, vol. 6(4), pages 453-470, November.
    12. Heckman, James, 2013. "Sample selection bias as a specification error," Applied Econometrics, Russian Presidential Academy of National Economy and Public Administration (RANEPA), vol. 31(3), pages 129-137.
    13. Jeffrey M Wooldridge, 2010. "Econometric Analysis of Cross Section and Panel Data," MIT Press Books, The MIT Press, edition 2, volume 1, number 0262232588, December.
    14. Kodrzycki Yolanda K. & Yu Pingkang, 2006. "New Approaches to Ranking Economics Journals," The B.E. Journal of Economic Analysis & Policy, De Gruyter, vol. 5(1), pages 1-44, August.
    15. Kenneth F Schulz & Douglas G Altman & David Moher & for the CONSORT Group, 2010. "CONSORT 2010 Statement: Updated Guidelines for Reporting Parallel Group Randomised Trials," PLOS Medicine, Public Library of Science, vol. 7(3), pages 1-7, March.
    16. Rachel Glennerster & Kudzai Takavarasha, 2013. "Running Randomized Evaluations: A Practical Guide," Economics Books, Princeton University Press, edition 1, number 10085.
    18. Orley Ashenfelter & Colm Harmon & Hessel Oosterbeek, 1999. "A Review of Estimates of the Schooling/Earnings Relationship, with Tests for Publication Bias," Working Papers 804, Princeton University, Department of Economics, Industrial Relations Section.
    19. Katherine Casey & Rachel Glennerster & Edward Miguel, 2012. "Reshaping Institutions: Evidence on Aid Impacts Using a Preanalysis Plan," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 127(4), pages 1755-1812.
    20. repec:fth:prinin:425 is not listed on IDEAS
    21. Akresh, Richard & de Walque, Damien & Kazianga, Harounan, 2013. "Cash transfers and child schooling : evidence from a randomized evaluation of the role of conditionality," Policy Research Working Paper Series 6340, The World Bank.
    22. Eva Vivalt, 2020. "How Much Can We Generalize From Impact Evaluations?," Journal of the European Economic Association, European Economic Association, vol. 18(6), pages 3045-3089.
    23. Manski, Charles F., 2013. "Public Policy in an Uncertain World: Analysis and Decisions," Economics Books, Harvard University Press, number 9780674066892, Spring.
    24. Charles F. Manski, 2013. "Response to the Review of ‘Public Policy in an Uncertain World’," Economic Journal, Royal Economic Society, vol. 0, pages 412-415, August.
    25. Angus Deaton, 2010. "Instruments, Randomization, and Learning about Development," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 424-455, June.
    26. Benjamin A. Olken, 2015. "Promises and Perils of Pre-analysis Plans," Journal of Economic Perspectives, American Economic Association, vol. 29(3), pages 61-80, Summer.
    27. Kerry Dwan & Douglas G Altman & Juan A Arnaiz & Jill Bloom & An-Wen Chan & Eugenia Cronin & Evelyne Decullier & Philippa J Easterbrook & Erik Von Elm & Carrol Gamble & Davina Ghersi & John P A Ioannidis, et al., 2008. "Systematic Review of the Empirical Evidence of Study Publication Bias and Outcome Reporting Bias," PLOS ONE, Public Library of Science, vol. 3(8), pages 1-31, August.
    28. Danzon, Patricia M. & Nicholson, Sean & Pereira, Nuno Sousa, 2005. "Productivity in pharmaceutical-biotechnology R&D: the role of experience and alliances," Journal of Health Economics, Elsevier, vol. 24(2), pages 317-339, March.
    29. Hunt Allcott, 2015. "Site Selection Bias in Program Evaluation," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 130(3), pages 1117-1165.
    30. J. L. Hutton & Paula R. Williamson, 2000. "Bias in meta‐analysis due to outcome variable selection within studies," Journal of the Royal Statistical Society Series C, Royal Statistical Society, vol. 49(3), pages 359-370.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Garret Christensen & Edward Miguel, 2018. "Transparency, Reproducibility, and the Credibility of Economics Research," Journal of Economic Literature, American Economic Association, vol. 56(3), pages 920-980, September.
    2. Orla Doyle, 2017. "The First 2,000 Days and Child Skills: Evidence from a Randomized Experiment of Home Visiting," Working Papers 201715, School of Economics, University College Dublin.
    3. Pascaline Dupas & Edward Miguel, 2016. "Impacts and Determinants of Health Levels in Low-Income Countries," NBER Working Papers 22235, National Bureau of Economic Research, Inc.
    4. Jörg Peters & Jörg Langbein & Gareth Roberts, 2018. "Generalization in the Tropics – Development Policy, Randomized Controlled Trials, and External Validity," The World Bank Research Observer, World Bank, vol. 33(1), pages 34-64.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Susan Athey & Guido Imbens, 2016. "The Econometrics of Randomized Experiments," Papers 1607.00698, arXiv.org.
    2. Garret Christensen & Edward Miguel, 2018. "Transparency, Reproducibility, and the Credibility of Economics Research," Journal of Economic Literature, American Economic Association, vol. 56(3), pages 920-980, September.
    3. Christopher Snyder & Ran Zhuo, 2018. "Sniff Tests as a Screen in the Publication Process: Throwing out the Wheat with the Chaff," NBER Working Papers 25058, National Bureau of Economic Research, Inc.
    4. Susan Athey & Raj Chetty & Guido Imbens, 2020. "Combining Experimental and Observational Data to Estimate Treatment Effects on Long Term Outcomes," Papers 2006.09676, arXiv.org.
    5. Jörg Peters & Jörg Langbein & Gareth Roberts, 2018. "Generalization in the Tropics – Development Policy, Randomized Controlled Trials, and External Validity," The World Bank Research Observer, World Bank, vol. 33(1), pages 34-64.
    6. Fernando Hoces de la Guardia & Sean Grant & Edward Miguel, 2021. "A framework for open policy analysis," Science and Public Policy, Oxford University Press, vol. 48(2), pages 154-163.
    7. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    8. Jason T. Kerwin & Rebecca L. Thornton, 2021. "Making the Grade: The Sensitivity of Education Program Effectiveness to Input Choices and Outcome Measures," The Review of Economics and Statistics, MIT Press, vol. 103(2), pages 251-264, May.
    9. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    10. Alfredo Di Tillio & Marco Ottaviani & Peter Norman Sørensen, 2017. "Persuasion Bias in Science: Can Economics Help?," Economic Journal, Royal Economic Society, vol. 127(605), pages 266-304, October.
    11. Snyder, Christopher & Zhuo, Ran, 2018. "Sniff Tests in Economics: Aggregate Distribution of Their Probability Values and Implications for Publication Bias," MetaArXiv 8vdrh, Center for Open Science.
    12. Baum-Snow, Nathaniel & Ferreira, Fernando, 2015. "Causal Inference in Urban and Regional Economics," Handbook of Regional and Urban Economics, in: Gilles Duranton & J. V. Henderson & William C. Strange (ed.), Handbook of Regional and Urban Economics, edition 1, volume 5, chapter 0, pages 3-68, Elsevier.
    13. Naik, Gopal & Chitre, Chetan & Bhalla, Manaswini & Rajan, Jothsna, 2020. "Impact of use of technology on student learning outcomes: Evidence from a large-scale experiment in India," World Development, Elsevier, vol. 127(C).
    14. Esterling, Kevin & Brady, David & Schwitzgebel, Eric, 2021. "The Necessity of Construct and External Validity for Generalized Causal Claims," OSF Preprints 2s8w5, Center for Open Science.
    15. Peter Boone & Alex Eble & Diana Elbourne, 2013. "Risk and Evidence of Bias in Randomized Controlled Trials in Economics," CEP Discussion Papers dp1240, Centre for Economic Performance, LSE.
    16. Furukawa, Chishio, 2019. "Publication Bias under Aggregation Frictions: Theory, Evidence, and a New Correction Method," EconStor Preprints 194798, ZBW - Leibniz Information Centre for Economics.
    17. Opoku-Agyemang, Kweku A., 2017. "A Human-Computer Interaction Approach for Integrity in Economics," SocArXiv ra3cs, Center for Open Science.
    18. Stefano DellaVigna & Elizabeth Linos, 2022. "RCTs to Scale: Comprehensive Evidence From Two Nudge Units," Econometrica, Econometric Society, vol. 90(1), pages 81-116, January.
    19. Roman Horvath & Ali Elminejad & Tomas Havranek, 2020. "Publication and Identification Biases in Measuring the Intertemporal Substitution of Labor Supply," Working Papers IES 2020/32, Charles University Prague, Faculty of Social Sciences, Institute of Economic Studies, revised Sep 2020.
    20. Burlig, Fiona, 2018. "Improving transparency in observational social science research: A pre-analysis plan approach," Economics Letters, Elsevier, vol. 168(C), pages 56-60.

    More about this item

    Keywords

    Industrial Economics; Economic Growth; Economic Theory & Research

    JEL classification:

    • C90 - Mathematical and Quantitative Methods - - Design of Experiments - - - General
    • C93 - Mathematical and Quantitative Methods - - Design of Experiments - - - Field Experiments



    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:wbk:wbrwps:7746. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Roula I. Yazigi. General contact details of provider: https://edirc.repec.org/data/dvewbus.html.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.