
Quantifying 'promising trials bias' in randomized controlled trials in education

Author

Listed:
  • Sam Sims

(Centre for Education Policy and Equalising Opportunities, UCL Institute of Education, University College London)

  • Jake Anders

(Centre for Education Policy and Equalising Opportunities, UCL Institute of Education, University College London)

  • Matthew Inglis

    (Centre for Mathematical Cognition, Loughborough University)

  • Hugues Lortie-Forgues

    (Centre for Mathematical Cognition, Loughborough University)

Abstract

Randomized controlled trials have proliferated in education, in part because they provide an unbiased estimator for the causal impact of interventions. It is increasingly recognized that many such trials in education have low power to detect an effect, if indeed there is one. However, it is less well known that low-powered trials tend to systematically exaggerate effect sizes among the subset of interventions that show promising results. We conduct a retrospective design analysis to quantify this bias across 23 promising trials, finding that the estimated effect sizes are exaggerated by an average of 52% or more. Promising trials bias can be reduced ex-ante by increasing the power of the trials that are commissioned and guarded against ex-post by including estimates of the exaggeration ratio when reporting trial findings. Our results also suggest that challenges around implementation fidelity are not the only reason that apparently successful interventions often fail to subsequently scale up. Instead, the findings from the initial promising trial may simply have been exaggerated.

Length: 19 pages
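As a rough illustration of the exaggeration-ratio logic behind a retrospective design analysis (in the spirit of Gelman and Carlin's "retrodesign" calculation), the Python sketch below simulates how much statistically significant estimates overstate an assumed true effect. This is not the authors' code, and the true effect (0.05 SD) and standard error (0.04) are illustrative assumptions rather than values taken from the paper.

```python
import numpy as np
from scipy import stats

def retrodesign(true_effect, se, alpha=0.05, n_sims=1_000_000, seed=0):
    """Simulate power and the exaggeration ratio for a trial whose estimate
    is approximately N(true_effect, se), conditioning on two-sided
    statistical significance at level alpha."""
    rng = np.random.default_rng(seed)
    estimates = rng.normal(true_effect, se, n_sims)   # hypothetical replications
    z_crit = stats.norm.ppf(1 - alpha / 2)            # e.g. 1.96 for alpha = 0.05
    significant = np.abs(estimates) > z_crit * se     # which replications look 'promising'
    power = significant.mean()
    # Among significant results, the average absolute estimate relative to
    # the assumed true effect is the exaggeration ratio.
    exaggeration = np.abs(estimates[significant]).mean() / true_effect
    return power, exaggeration

# Illustrative values only: a true effect of 0.05 SD estimated with SE = 0.04.
power, exaggeration = retrodesign(true_effect=0.05, se=0.04)
print(f"power = {power:.2f}, exaggeration ratio = {exaggeration:.1f}")
```

Under these assumed numbers the simulation gives power of roughly 0.24 and an exaggeration ratio of about 2, i.e. significant estimates average around twice the assumed true effect; the paper's "52% or more" figure corresponds to the analogous quantity computed across the 23 promising trials.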

Suggested Citation

  • Sam Sims & Jake Anders & Matthew Inglis & Hugues Lortie-Forgues, 2020. "Quantifying 'promising trials bias' in randomized controlled trials in education," CEPEO Working Paper Series 20-16, UCL Centre for Education Policy and Equalising Opportunities, revised Nov 2020.
  • Handle: RePEc:ucl:cepeow:20-16

    Download full text from publisher

    File URL: https://repec-cepeo.ucl.ac.uk/cepeow/cepeowp20-16.pdf
    File Function: First version, 2020
    Download Restriction: no


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Damgaard, Mette Trier & Nielsen, Helena Skyt, 2018. "Nudging in education," Economics of Education Review, Elsevier, vol. 64(C), pages 313-342.
    2. Marco Caliendo & Stefan Tübbicke, 2020. "New evidence on long-term effects of start-up subsidies: matching estimates and their robustness," Empirical Economics, Springer, vol. 59(4), pages 1605-1631, October.
    3. Wendy Chan, 2018. "Applications of Small Area Estimation to Generalization With Subclassification by Propensity Scores," Journal of Educational and Behavioral Statistics, , vol. 43(2), pages 182-224, April.
    4. Emily Beam & Priya Mukherjee & Laia Navarro-Sola, 2022. "Lowering Barriers to Remote Education: Experimental Impacts on Parental Responses and Learning," Working Papers 2022-030, Human Capital and Economic Opportunity Working Group.
    5. Jasjeet Singh Sekhon & Richard D. Grieve, 2012. "A matching method for improving covariate balance in cost‐effectiveness analyses," Health Economics, John Wiley & Sons, Ltd., vol. 21(6), pages 695-714, June.
    6. Solomon Asfaw & Silvio Daidone & Benjamin Davis & Josh Dewbre & Alessandro Romeo & Paul Winters & Katia Covarrubias & Habiba Djebbari, 2012. "Analytical Framework for Evaluating the Productive Impact of Cash Transfer Programmes on Household Behaviour – Methodological Guidelines for the From Protection to Production Project," Working Papers 101, International Policy Centre for Inclusive Growth.
    7. Weneyam Hippolyte Balima & Jean-Louis Combes & Alexandru Minea, 2015. "Sovereign Debt Risk in Emerging Countries: Does Inflation Targeting Adoption Make Any Difference?," CERDI Working papers halshs-01128239, HAL.
    8. Shen, Chung-Hua & Wu, Meng-Wen & Chen, Ting-Hsuan & Fang, Hao, 2016. "To engage or not to engage in corporate social responsibility: Empirical evidence from global banking sector," Economic Modelling, Elsevier, vol. 55(C), pages 207-225.
    9. Mareike Heimeshoff & Jonas Schreyögg & Oliver Tiemann, 2014. "Employment effects of hospital privatization in Germany," The European Journal of Health Economics, Springer;Deutsche Gesellschaft für Gesundheitsökonomie (DGGÖ), vol. 15(7), pages 747-757, September.
    10. James B. Kau & Lu Fang & Henry J. Munneke, 2019. "An Unintended Consequence of Mortgage Financing Regulation – a Racial Disparity," The Journal of Real Estate Finance and Economics, Springer, vol. 59(4), pages 549-588, November.
    11. Srhoj, Stjepan & Walde, Janette, 2020. "Getting ready for EU Single Market: The effect of export-oriented grant schemes on firm performance," Structural Change and Economic Dynamics, Elsevier, vol. 52(C), pages 279-293.
    12. Berta, Paolo & Callea, Giuditta & Martini, Gianmaria & Vittadini, Giorgio, 2010. "The effects of upcoding, cream skimming and readmissions on the Italian hospitals efficiency: A population-based investigation," Economic Modelling, Elsevier, vol. 27(4), pages 812-821, July.
    13. Lisa van der Sande & Ilona Wildeman & Adriana G. Bus & Roel van Steensel, 2023. "Nudging to Stimulate Reading in Primary and Secondary Education," SAGE Open, , vol. 13(2), pages 21582440231, April.
    14. Denis Fougère & Nicolas Jacquemet, 2020. "Policy Evaluation Using Causal Inference Methods," SciencePo Working papers Main hal-03455978, HAL.
    15. Maurizio Canavari & Andreas C. Drichoutis & Jayson L. Lusk & Rodolfo M. Nayga, Jr., 2018. "How to run an experimental auction: A review of recent advances," Working Papers 2018-5, Agricultural University of Athens, Department Of Agricultural Economics.
    16. Moreno-Serra R, "undated". "Health Programme Evaluation by Propensity Score Matching: Accounting for Treatment Intensity and Health Externalities with an Application to Brazil," Health, Econometrics and Data Group (HEDG) Working Papers 09/05, HEDG, c/o Department of Economics, University of York.
    17. Tingting Zhou & Michael R. Elliott & Roderick J. A. Little, 2022. "Addressing Disparities in the Propensity Score Distributions for Treatment Comparisons from Observational Studies," Stats, MDPI, vol. 5(4), pages 1-17, December.
    18. Matthew Blackwell & Stefano Iacus & Gary King & Giuseppe Porro, 2009. "cem: Coarsened exact matching in Stata," Stata Journal, StataCorp LP, vol. 9(4), pages 524-546, December.
    19. Arndt R. Reichert, 2015. "Obesity, Weight Loss, and Employment Prospects: Evidence from a Randomized Trial," Journal of Human Resources, University of Wisconsin Press, vol. 50(3), pages 759-810.
    20. Luca Zanin & Rosalba Radice & Giampiero Marra, 2013. "Estimating the Effect of Perceived Risk of Crime on Social Trust in the Presence of Endogeneity Bias," Social Indicators Research: An International and Interdisciplinary Journal for Quality-of-Life Measurement, Springer, vol. 114(2), pages 523-547, November.

    More about this item

    Keywords

    randomized controlled trials; education; promising trials bias;

    JEL classification:

    • I20 - Health, Education, and Welfare - - Education - - - General
    • I21 - Health, Education, and Welfare - - Education - - - Analysis of Education
    • C90 - Mathematical and Quantitative Methods - - Design of Experiments - - - General
    • C93 - Mathematical and Quantitative Methods - - Design of Experiments - - - Field Experiments


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:ucl:cepeow:20-16. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Jake Anders (email available below). General contact details of provider: https://edirc.repec.org/data/epucluk.html.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.