Printed from https://ideas.repec.org/p/cep/cepdps/dp1240.html

Risk and Evidence of Bias in Randomized Controlled Trials in Economics

Authors
  • Peter Boone
  • Alex Eble
  • Diana Elbourne

Abstract

The randomized controlled trial (RCT) has been a widely used research tool in medicine for over 60 years. Since the early 2000s, large-scale RCTs have been used in growing numbers in the social sciences to evaluate questions of both policy and theory. The early economics literature on RCTs invokes the medical literature, but largely ignores the branch of that literature which studies the past mistakes of medical trialists and links poor trial design, conduct, and reporting to exaggerated estimates of treatment effects. Drawing on consensus documents on these issues from the medical literature, we design a tool to evaluate adequacy of reporting and risk of bias in RCT reports. We then use this tool to evaluate 54 reports of RCTs published in a set of 52 major economics journals between 2001 and 2011, alongside a sample of 54 reports of RCTs published in medical journals over the same period. We find that economics RCTs fall far short of the recommendations for reporting and conduct put forth in the medical literature, while medical trials adhere fairly closely to them, suggesting a risk of exaggerated treatment effects in the economics literature.

Suggested Citation

  • Peter Boone & Alex Eble & Diana Elbourne, 2013. "Risk and Evidence of Bias in Randomized Controlled Trials in Economics," CEP Discussion Papers dp1240, Centre for Economic Performance, LSE.
  • Handle: RePEc:cep:cepdps:dp1240
    Download full text from publisher

    File URL: https://cep.lse.ac.uk/pubs/download/dp1240.pdf
    Download Restriction: no

    References listed on IDEAS

    1. Abhijit Vinayak Banerjee & Alice H. Amsden & Robert H. Bates & Jagdish Bhagwati & Angus Deaton & Nicholas Stern, 2007. "Making Aid Work," MIT Press Books, The MIT Press, edition 1, volume 1, number 0262026155, December.
    2. Angus Deaton, 2009. "Instruments of development: Randomization in the tropics, and the search for the elusive keys to economic development," Working Papers 1128, Princeton University, Woodrow Wilson School of Public and International Affairs, Center for Health and Wellbeing.
    3. Guido W. Imbens, 2010. "Better LATE Than Nothing: Some Comments on Deaton (2009) and Heckman and Urzua (2009)," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 399-423, June.
    4. Yolanda K. Kodrzycki & Pingkang Yu, 2006. "New Approaches to Ranking Economics Journals," The B.E. Journal of Economic Analysis & Policy, De Gruyter, vol. 5(1), pages 1-44, August.
    5. Kenneth F. Schulz & Douglas G. Altman & David Moher & the CONSORT Group, 2010. "CONSORT 2010 Statement: Updated Guidelines for Reporting Parallel Group Randomised Trials," PLOS Medicine, Public Library of Science, vol. 7(3), pages 1-7, March.
    6. Danzon, Patricia M. & Nicholson, Sean & Pereira, Nuno Sousa, 2005. "Productivity in pharmaceutical-biotechnology R&D: the role of experience and alliances," Journal of Health Economics, Elsevier, vol. 24(2), pages 317-339, March.
    7. Katherine Casey & Rachel Glennerster & Edward Miguel, 2012. "Reshaping Institutions: Evidence on Aid Impacts Using a Preanalysis Plan," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 127(4), pages 1755-1812.
    8. J. L. Hutton & Paula R. Williamson, 2000. "Bias in meta‐analysis due to outcome variable selection within studies," Journal of the Royal Statistical Society Series C, Royal Statistical Society, vol. 49(3), pages 359-370.
    9. Yolanda Kodrzycki & Pingkang David Yu, 2005. "New approaches to ranking economics journals," Working Papers 05-12, Federal Reserve Bank of Boston.

    Citations

    Citations are extracted by the CitEc Project.

    Cited by:

    1. Florent Bédécarrats & Isabelle Guérin & François Roubaud, 2019. "All that Glitters is not Gold. The Political Economy of Randomized Evaluations in Development," Development and Change, International Institute of Social Studies, vol. 50(3), pages 735-762, May.
    2. Florent Bédécarrats & Isabelle Guérin & François Roubaud, 2017. "L'étalon-or des évaluations randomisées : économie politique des expérimentations aléatoires dans le domaine du développement" [The gold standard of randomized evaluations: the political economy of randomized experiments in development], Working Paper 753120cd-506f-4c5f-80ed-7, Agence française de développement.
    3. McHugh, Neil & Biosca, Olga & Donaldson, Cam, 2015. "Microfinance, health and randomised trials," Health Economics Working Paper Series 201501, Glasgow Caledonian University, Yunus Centre.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Alex Eble & Peter Boone & Diana Elbourne, 2017. "On Minimizing the Risk of Bias in Randomized Controlled Trials in Economics," The World Bank Economic Review, World Bank, vol. 31(3), pages 687-707.
    2. Sophie Webber, 2015. "Randomising Development: Geography, Economics and the Search for Scientific Rigour," Tijdschrift voor Economische en Sociale Geografie, Royal Dutch Geographical Society KNAG, vol. 106(1), pages 36-52, February.
    3. Jörg Peters & Jörg Langbein & Gareth Roberts, 2018. "Generalization in the Tropics – Development Policy, Randomized Controlled Trials, and External Validity," The World Bank Research Observer, World Bank, vol. 33(1), pages 34-64.
    4. Steven F. Lehrer & R. Vincent Pohl & Kyungchul Song, 2016. "Targeting Policies: Multiple Testing and Distributional Treatment Effects," NBER Working Papers 22950, National Bureau of Economic Research, Inc.
    5. Sophie van Huellen & Duo Qin, 2019. "Compulsory Schooling and Returns to Education: A Re-Examination," Econometrics, MDPI, vol. 7(3), pages 1-20, September.
    6. David L. Anderson & John Tressler, 2009. "The Excellence in Research for Australia Scheme: An Evaluation of the Draft Journal Weights for Economics," Working Papers in Economics 09/07, University of Waikato.
    7. Joshua D. Angrist & Jörn-Steffen Pischke, 2010. "The Credibility Revolution in Empirical Economics: How Better Research Design Is Taking the Con out of Econometrics," Journal of Economic Perspectives, American Economic Association, vol. 24(2), pages 3-30, Spring.
    8. Gisselquist, Rachel & Niño-Zarazúa, Miguel, 2013. "What can experiments tell us about how to improve governance?," MPRA Paper 49300, University Library of Munich, Germany.
    9. Peters, Jörg & Langbein, Jörg & Roberts, Gareth, 2016. "Policy evaluation, randomized controlled trials, and external validity—A systematic review," Economics Letters, Elsevier, vol. 147(C), pages 51-54.
    10. Sara Nadel & Lant Pritchett, 2016. "Searching for the Devil in the Details: Learning about Development Program Design," Working Papers 434, Center for Global Development.
    11. Nancy Qian, 2015. "Making Progress on Foreign Aid," Annual Review of Economics, Annual Reviews, vol. 7(1), pages 277-308, August.
    12. Victor Ginsburgh & Olivier Gergaud, 2013. "Measuring the effect of cultural events with special emphasis on music festivals," ULB Institutional Repository 2013/152437, ULB -- Universite Libre de Bruxelles.
    13. James H. Stock, 2010. "The Other Transformation in Econometric Practice: Robust Tools for Inference," Journal of Economic Perspectives, American Economic Association, vol. 24(2), pages 83-94, Spring.
    14. Dan Ben-David, 2010. "Ranking Israel’s economists," Scientometrics, Springer;Akadémiai Kiadó, vol. 82(2), pages 351-364, February.
    15. María Alzúa & Guillermo Cruces & Laura Ripani, 2013. "Welfare programs and labor supply in developing countries: experimental evidence from Latin America," Journal of Population Economics, Springer;European Society for Population Economics, vol. 26(4), pages 1255-1284, October.
    16. Raj Chetty, 2009. "Sufficient Statistics for Welfare Analysis: A Bridge Between Structural and Reduced-Form Methods," Annual Review of Economics, Annual Reviews, vol. 1(1), pages 451-488, May.
    17. Ana Rute Cardoso & Paulo Guimarães & Klaus F. Zimmermann, 2010. "Trends in Economic Research: An International Perspective," Kyklos, Wiley Blackwell, vol. 63(4), pages 479-494, November.
    18. Corduneanu-Huci, Cristina & Dorsch, Michael T. & Maarek, Paul, 2021. "The politics of experimentation: Political competition and randomized controlled trials," Journal of Comparative Economics, Elsevier, vol. 49(1), pages 1-21.
    19. Stefanie Behncke, 2009. "How Does Retirement Affect Health?," University of St. Gallen Department of Economics working paper series 2009 2009-13, Department of Economics, University of St. Gallen.

    More about this item

    Keywords

    randomized controlled trials; field experiments; bias; treatment effect estimates

    JEL classification:

    • C9 - Mathematical and Quantitative Methods - - Design of Experiments
    • C90 - Mathematical and Quantitative Methods - - Design of Experiments - - - General
    • C93 - Mathematical and Quantitative Methods - - Design of Experiments - - - Field Experiments
    • C10 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - General
    • C12 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Hypothesis Testing: General
    • C18 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Methodological Issues: General

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:cep:cepdps:dp1240. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact the person in charge (email available below). General contact details of provider: https://cep.lse.ac.uk/_new/publications/discussion-papers/.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.