Risk and Evidence of Bias in Randomized Controlled Trials in Economics
Abstract: The randomized controlled trial (RCT) has been a heavily utilized research tool in medicine for over 60 years. Since the early 2000s, large-scale RCTs have been used in increasingly large numbers in the social sciences to evaluate questions of both policy and theory. The early economics literature on RCTs invokes the medical literature but largely ignores the substantial body of work that studies the past mistakes of medical trialists and links poor trial design, conduct, and reporting to exaggerated estimates of treatment effects. Drawing on consensus documents on these issues from the medical literature, we design a tool to evaluate adequacy of reporting and risk of bias in RCT reports. We then use this tool to evaluate 54 reports of RCTs published in a set of 52 major economics journals between 2001 and 2011, alongside a sample of 54 RCT reports published in medical journals over the same period. We find that economics RCTs fall far short of the reporting and conduct recommendations put forth in the medical literature, while medical trials adhere fairly closely to them, suggesting a risk of exaggerated treatment effects in the economics literature.
Bibliographic Info: Paper provided by the Centre for Economic Performance, LSE, in its series CEP Discussion Papers, number dp1240.
Date of creation: Sep 2013
Web page: http://cep.lse.ac.uk/_new/publications/series.asp?prog=CEP
Keywords: randomized controlled trials; field experiments; bias; treatment effect estimates
JEL classification:
- C9 - Mathematical and Quantitative Methods - - Design of Experiments
- C90 - Mathematical and Quantitative Methods - - Design of Experiments - - - General
- C93 - Mathematical and Quantitative Methods - - Design of Experiments - - - Field Experiments
- C10 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - General
- C12 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Hypothesis Testing: General
- C18 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Methodological Issues: General
References:
- Abhijit Vinayak Banerjee & Alice H. Amsden & Robert H. Bates & Jagdish Bhagwati & Angus Deaton & Nicholas Stern, 2007. "Making Aid Work," MIT Press Books, The MIT Press, edition 1, volume 1, number 0262026155, June.
- Kodrzycki Yolanda K. & Yu Pingkang, 2006. "New Approaches to Ranking Economics Journals," The B.E. Journal of Economic Analysis & Policy, De Gruyter, vol. 5(1), pages 1-44, August.
- Katherine Casey & Rachel Glennerster & Edward Miguel, 2011. "Reshaping Institutions: Evidence on Aid Impacts Using a Pre-Analysis Plan," NBER Working Papers 17012, National Bureau of Economic Research, Inc.
- Katherine Casey & Rachel Glennerster & Edward Miguel, 2012. "Reshaping Institutions: Evidence on Aid Impacts Using a Preanalysis Plan," The Quarterly Journal of Economics, Oxford University Press, vol. 127(4), pages 1755-1812.
- J. L. Hutton & Paula R. Williamson, 2000. "Bias in meta-analysis due to outcome variable selection within studies," Journal of the Royal Statistical Society Series C, Royal Statistical Society, vol. 49(3), pages 359-370.