
On Minimizing the Risk of Bias in Randomized Controlled Trials in Economics

Authors
  • Alex Eble
  • Peter Boone
  • Diana Elbourne

Abstract

Estimation of empirical relationships is prone to bias. Economists have carefully studied sources of bias in structural and quasi-experimental approaches, but the randomized controlled trial (RCT) has only begun to receive such scrutiny. In this paper, we argue that several lessons from medicine, derived from the analysis of thousands of RCTs establishing a clear link between certain practices and biased estimates, can be used to reduce the risk of bias in economics RCTs. We identify the subset of these lessons applicable to economics and use them to assess the risk of bias in estimates from economics RCTs published between 2001 and 2011. Compared with medical studies, we find that most economics studies do not report the details of study design necessary to assess the risk of bias. Many report practices that suggest a risk of bias, though this does not necessarily mean that bias resulted. We conclude with suggestions on how to remedy these issues.

Suggested Citation

  • Alex Eble & Peter Boone & Diana Elbourne, 2017. "On Minimizing the Risk of Bias in Randomized Controlled Trials in Economics," The World Bank Economic Review, World Bank, vol. 31(3), pages 687-707.
  • Handle: RePEc:oup:wbecrv:v:31:y:2017:i:3:p:687-707.

    Download full text from publisher

    File URL: http://hdl.handle.net/10.1093/wber/lhw034
    Download Restriction: Access to full text is restricted to subscribers.

Because access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS
    1. Charles F. Manski, 2013. "Response to the Review of ‘Public Policy in an Uncertain World’," Economic Journal, Royal Economic Society, pages 412-415, August.
    2. Miriam Bruhn & David McKenzie, 2009. "In Pursuit of Balance: Randomization in Practice in Development Field Experiments," American Economic Journal: Applied Economics, American Economic Association, vol. 1(4), pages 200-232, October.
    3. Joshua D. Angrist & Jörn-Steffen Pischke, 2010. "The Credibility Revolution in Empirical Economics: How Better Research Design Is Taking the Con out of Econometrics," Journal of Economic Perspectives, American Economic Association, vol. 24(2), pages 3-30, Spring.
    4. Angus Deaton, 2010. "Instruments, Randomization, and Learning about Development," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 424-455, June.
    5. Yolanda K. Kodrzycki & Pingkang Yu, 2006. "New Approaches to Ranking Economics Journals," The B.E. Journal of Economic Analysis & Policy, De Gruyter, vol. 5(1), pages 1-44, August.
    6. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    7. Kenneth F Schulz & Douglas G Altman & David Moher & for the CONSORT Group, 2010. "CONSORT 2010 Statement: Updated Guidelines for Reporting Parallel Group Randomised Trials," PLOS Medicine, Public Library of Science, vol. 7(3), pages 1-7, March.
    8. Duflo, Esther & Glennerster, Rachel & Kremer, Michael, 2008. "Using Randomization in Development Economics Research: A Toolkit," Handbook of Development Economics, in: T. Paul Schultz & John A. Strauss (ed.), Handbook of Development Economics, edition 1, volume 4, chapter 61, pages 3895-3962, Elsevier.
    9. Kerry Dwan & Douglas G Altman & Juan A Arnaiz & Jill Bloom & An-Wen Chan & Eugenia Cronin & Evelyne Decullier & Philippa J Easterbrook & Erik von Elm & Carrol Gamble & Davina Ghersi & John P A Ioannidis, et al., 2008. "Systematic Review of the Empirical Evidence of Study Publication Bias and Outcome Reporting Bias," PLOS ONE, Public Library of Science, vol. 3(8), pages 1-31, August.
    10. Eva Vivalt, 2020. "How Much Can We Generalize From Impact Evaluations?," Journal of the European Economic Association, European Economic Association, vol. 18(6), pages 3045-3089.
    11. Frijters, Paul & Kong, Tao Sherry & Liu, Elaine M., 2015. "Who is coming to the artefactual field experiment? Participation bias among Chinese rural migrants," Journal of Economic Behavior & Organization, Elsevier, vol. 114(C), pages 62-74.
    12. Rachel Glennerster & Kudzai Takavarasha, 2013. "Running Randomized Evaluations: A Practical Guide," Economics Books, Princeton University Press, edition 1, number 10085.
    13. Manski, Charles F., 2013. "Public Policy in an Uncertain World: Analysis and Decisions," Economics Books, Harvard University Press, number 9780674066892, Spring.
    14. Danzon, Patricia M. & Nicholson, Sean & Pereira, Nuno Sousa, 2005. "Productivity in pharmaceutical-biotechnology R&D: the role of experience and alliances," Journal of Health Economics, Elsevier, vol. 24(2), pages 317-339, March.
    15. Hunt Allcott, 2015. "Site Selection Bias in Program Evaluation," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 130(3), pages 1117-1165.
    16. Katherine Casey & Rachel Glennerster & Edward Miguel, 2012. "Reshaping Institutions: Evidence on Aid Impacts Using a Preanalysis Plan," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 127(4), pages 1755-1812.
    17. Heckman, James, 2013. "Sample selection bias as a specification error," Applied Econometrics, Russian Presidential Academy of National Economy and Public Administration (RANEPA), vol. 31(3), pages 129-137.
    18. J. L. Hutton & Paula R. Williamson, 2000. "Bias in meta‐analysis due to outcome variable selection within studies," Journal of the Royal Statistical Society Series C, Royal Statistical Society, vol. 49(3), pages 359-370.
    19. DiNardo, John & Lee, David S., 2011. "Program Evaluation and Research Designs," Handbook of Labor Economics, in: O. Ashenfelter & D. Card (ed.), Handbook of Labor Economics, edition 1, volume 4, chapter 5, pages 463-536, Elsevier.
    20. Jeffrey M Wooldridge, 2010. "Econometric Analysis of Cross Section and Panel Data," MIT Press Books, The MIT Press, edition 2, volume 1, number 0262232588, December.
    21. Emily Oster, 2013. "Unobservable Selection and Coefficient Stability: Theory and Validation," NBER Working Papers 19054, National Bureau of Economic Research, Inc.
    22. Orley Ashenfelter & Colm Harmon & Hessel Oosterbeek, 1999. "A Review of Estimates of the Schooling/Earnings Relationship, with Tests for Publication Bias," Working Papers 804, Princeton University, Department of Economics, Industrial Relations Section.
    23. Benjamin A. Olken, 2015. "Promises and Perils of Pre-analysis Plans," Journal of Economic Perspectives, American Economic Association, vol. 29(3), pages 61-80, Summer.
    24. Ashenfelter, Orley & Harmon, Colm & Oosterbeek, Hessel, 1999. "A review of estimates of the schooling/earnings relationship, with tests for publication bias," Labour Economics, Elsevier, vol. 6(4), pages 453-470, November.
    25. Gordon H Guyatt & Edward J Mills & Diana Elbourne, 2008. "In the Era of Systematic Reviews, Does the Size of an Individual Trial Still Matter?," PLOS Medicine, Public Library of Science, vol. 5(1), pages 1-3, January.
    26. Jeffrey R Kling & Jeffrey B Liebman & Lawrence F Katz, 2007. "Experimental Analysis of Neighborhood Effects," Econometrica, Econometric Society, vol. 75(1), pages 83-119, January.
    27. Leonard, Kenneth & Masatu, Melkiory C., 2006. "Outpatient process quality evaluation and the Hawthorne Effect," Social Science & Medicine, Elsevier, vol. 63(9), pages 2330-2340, November.
    28. Richard Akresh & Damien de Walque & Harounan Kazianga, 2013. "Cash Transfers and Child Schooling: Evidence from a Randomized Evaluation of the Role of Conditionality," Economics Working Paper Series 1301, Oklahoma State University, Department of Economics and Legal Studies in Business.

    Citations

    Citations are extracted by the CitEc Project.

    Cited by:

    1. Garret Christensen & Edward Miguel, 2018. "Transparency, Reproducibility, and the Credibility of Economics Research," Journal of Economic Literature, American Economic Association, vol. 56(3), pages 920-980, September.
    2. Jörg Peters & Jörg Langbein & Gareth Roberts, 2018. "Generalization in the Tropics – Development Policy, Randomized Controlled Trials, and External Validity," The World Bank Research Observer, World Bank, vol. 33(1), pages 34-64.
    3. Orla Doyle, 2017. "The First 2,000 Days and Child Skills: Evidence from a Randomized Experiment of Home Visiting," Working Papers 201715, School of Economics, University College Dublin.
    4. Pascaline Dupas & Edward Miguel, 2016. "Impacts and Determinants of Health Levels in Low-Income Countries," NBER Working Papers 22235, National Bureau of Economic Research, Inc.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Susan Athey & Guido Imbens, 2016. "The Econometrics of Randomized Experiments," Papers 1607.00698, arXiv.org.
    2. Garret Christensen & Edward Miguel, 2018. "Transparency, Reproducibility, and the Credibility of Economics Research," Journal of Economic Literature, American Economic Association, vol. 56(3), pages 920-980, September.
    3. Christopher Snyder & Ran Zhuo, 2018. "Sniff Tests as a Screen in the Publication Process: Throwing out the Wheat with the Chaff," NBER Working Papers 25058, National Bureau of Economic Research, Inc.
    4. Susan Athey & Raj Chetty & Guido Imbens, 2020. "Combining Experimental and Observational Data to Estimate Treatment Effects on Long Term Outcomes," Papers 2006.09676, arXiv.org.
    5. Jörg Peters & Jörg Langbein & Gareth Roberts, 2018. "Generalization in the Tropics – Development Policy, Randomized Controlled Trials, and External Validity," The World Bank Research Observer, World Bank, vol. 33(1), pages 34-64.
    6. Fernando Hoces de la Guardia & Sean Grant & Edward Miguel, 2021. "A framework for open policy analysis," Science and Public Policy, Oxford University Press, vol. 48(2), pages 154-163.
    7. Jason T. Kerwin & Rebecca L. Thornton, 2021. "Making the Grade: The Sensitivity of Education Program Effectiveness to Input Choices and Outcome Measures," The Review of Economics and Statistics, MIT Press, vol. 103(2), pages 251-264, May.
    8. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    9. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    10. Esterling, Kevin & Brady, David & Schwitzgebel, Eric, 2021. "The Necessity of Construct and External Validity for Generalized Causal Claims," OSF Preprints 2s8w5, Center for Open Science.
    11. Alfredo Di Tillio & Marco Ottaviani & Peter Norman Sørensen, 2017. "Persuasion Bias in Science: Can Economics Help?," Economic Journal, Royal Economic Society, vol. 127(605), pages 266-304, October.
    12. Snyder, Christopher & Zhuo, Ran, 2018. "Sniff Tests in Economics: Aggregate Distribution of Their Probability Values and Implications for Publication Bias," MetaArXiv 8vdrh, Center for Open Science.
    13. Baum-Snow, Nathaniel & Ferreira, Fernando, 2015. "Causal Inference in Urban and Regional Economics," Handbook of Regional and Urban Economics, in: Gilles Duranton & J. V. Henderson & William C. Strange (ed.), Handbook of Regional and Urban Economics, edition 1, volume 5, pages 3-68, Elsevier.
    14. Naik, Gopal & Chitre, Chetan & Bhalla, Manaswini & Rajan, Jothsna, 2020. "Impact of use of technology on student learning outcomes: Evidence from a large-scale experiment in India," World Development, Elsevier, vol. 127(C).
    15. Peter Boone & Alex Eble & Diana Elbourne, 2013. "Risk and Evidence of Bias in Randomized Controlled Trials in Economics," CEP Discussion Papers dp1240, Centre for Economic Performance, LSE.
    16. Furukawa, Chishio, 2019. "Publication Bias under Aggregation Frictions: Theory, Evidence, and a New Correction Method," EconStor Preprints 194798, ZBW - Leibniz Information Centre for Economics.
    17. Opoku-Agyemang, Kweku A., 2017. "A Human-Computer Interaction Approach for Integrity in Economics," SocArXiv ra3cs, Center for Open Science.
    18. Stefano DellaVigna & Elizabeth Linos, 2022. "RCTs to Scale: Comprehensive Evidence From Two Nudge Units," Econometrica, Econometric Society, vol. 90(1), pages 81-116, January.
    19. Abhijit Banerjee & Rukmini Banerji & James Berry & Esther Duflo & Harini Kannan & Shobhini Mukerji & Marc Shotland & Michael Walton, 2017. "From Proof of Concept to Scalable Policies: Challenges and Solutions, with an Application," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 73-102, Fall.
    20. Roman Horvath & Ali Elminejad & Tomas Havranek, 2020. "Publication and Identification Biases in Measuring the Intertemporal Substitution of Labor Supply," Working Papers IES 2020/32, Charles University Prague, Faculty of Social Sciences, Institute of Economic Studies, revised Sep 2020.

    More about this item

    JEL classification:

    • C90 - Mathematical and Quantitative Methods - Design of Experiments - General
    • C93 - Mathematical and Quantitative Methods - Design of Experiments - Field Experiments



    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.