
Experimental education research: rethinking why, how and when to use random assignment

Author

Listed:
  • Sam Sims

    (UCL Centre for Education Policy and Equalising Opportunities, University College London)

  • Jake Anders

    (UCL Centre for Education Policy and Equalising Opportunities, University College London)

  • Matthew Inglis

    (Centre for Mathematical Cognition, Loughborough University)

  • Hugues Lortie-Forgues

    (Centre for Mathematical Cognition, Loughborough University)

  • Ben Styles

    (NFER)

  • Ben Weidmann

    (Skills Lab, Harvard University)

Abstract

Over the last twenty years, education researchers have increasingly conducted randomised experiments with the goal of informing the decisions of educators and policymakers. Such experiments have generally employed broad, consequential, standardised outcome measures in the hope that this would allow decision-makers to compare the effectiveness of different approaches. However, a combination of small effect sizes, wide confidence intervals, and treatment effect heterogeneity means that researchers have largely failed to achieve this goal. We argue that quasi-experimental methods and multi-site trials will often be superior for informing educators' decisions, on the grounds that they can achieve greater precision and better address heterogeneity. Experimental research remains valuable in applied education research. However, it should primarily be used to test theoretical models, which can in turn inform educators' mental models, rather than attempting to inform decision making directly. Since comparable effect size estimates are not of interest when testing educational theory, researchers can and should improve the power of theory-informing experiments by using more closely aligned (i.e., valid) outcome measures. We argue that this approach would reduce wasteful research spending and make the research that does go ahead more statistically informative, thus improving the return on investment in educational research.
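The abstract's power argument can be made concrete with a back-of-the-envelope calculation. The sketch below is illustrative only and is not taken from the paper: the two-arm z-test design, the sample sizes, and the assumed standardised effects (a small effect on a broad, distal outcome versus a larger effect on a closely aligned outcome) are all assumptions chosen to show how outcome-measure alignment changes statistical power at a fixed trial size.

```python
# Illustrative power calculation for a simple two-arm randomised trial
# (normal approximation; a sketch of the abstract's argument, not the paper's analysis).
# Assumed effect sizes: d ~ 0.05 for a broad, distal standardised outcome;
# d ~ 0.30 for a more closely aligned (valid) outcome. These are hypothetical.
from math import erf, sqrt


def normal_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))


def power_two_arm(d: float, n_per_arm: int) -> float:
    """Approximate power of a two-sided test at alpha = 0.05 comparing two means (SD = 1)."""
    z_crit = 1.96                        # critical value, two-sided alpha = 0.05
    se = sqrt(2.0 / n_per_arm)           # standard error of the difference in means
    return normal_cdf(d / se - z_crit)   # neglects the (tiny) opposite-tail probability


if __name__ == "__main__":
    for label, d in [("broad, distal outcome (d = 0.05)", 0.05),
                     ("closely aligned outcome (d = 0.30)", 0.30)]:
        for n in (100, 400, 1600):
            print(f"{label}, n per arm = {n:5d}: power ~ {power_two_arm(d, n):.2f}")
```

Under these assumed numbers, a trial of 100 pupils per arm is severely underpowered for the distal outcome but has reasonable power for the aligned one, which is the intuition behind the paper's recommendation to use more closely aligned outcome measures when testing theory.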

Suggested Citation

  • Sam Sims & Jake Anders & Matthew Inglis & Hugues Lortie-Forgues & Ben Styles & Ben Weidmann, 2023. "Experimental education research: rethinking why, how and when to use random assignment," CEPEO Working Paper Series 23-07, UCL Centre for Education Policy and Equalising Opportunities, revised Aug 2023.
  • Handle: RePEc:ucl:cepeow:23-07

    Download full text from publisher

    File URL: https://repec-cepeo.ucl.ac.uk/cepeow/cepeowp23-07r1.pdf
    File Function: Revised version, 2023
    Download Restriction: no

    References listed on IDEAS

    1. Sam Sims & Harry Fletcher-Wood & Alison O'Mara-Eves & Sarah Cottingham & Claire Stansfield & Josh Goodrich & Jo Van Herwegen & Jake Anders, 2022. "Effective teacher professional development: new theory and a meta-analytic test," CEPEO Working Paper Series 22-02, UCL Centre for Education Policy and Equalising Opportunities, revised Jan 2022.
    2. Brodeur, Abel & Cook, Nikolai & Hartley, Jonathan & Heyes, Anthony, 2022. "Do Pre-Registration and Pre-analysis Plans Reduce p-Hacking and Publication Bias?," MetaArXiv uxf39, Center for Open Science.
    3. repec:mpr:mprres:2511 is not listed on IDEAS
    4. Duncan D. Chaplin & Thomas D. Cook & Jelena Zurovac & Jared S. Coopersmith & Mariel M. Finucane & Lauren N. Vollmer & Rebecca E. Morris, 2018. "The Internal And External Validity Of The Regression Discontinuity Design: A Meta‐Analysis Of 15 Within‐Study Comparisons," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 37(2), pages 403-429, March.
    5. Ben Ost & Anuj Gangopadhyaya & Jeffrey C. Schiman, 2017. "Comparing standard deviation effects across contexts," Education Economics, Taylor & Francis Journals, vol. 25(3), pages 251-265, May.
    6. Deaton, Angus & Cartwright, Nancy, 2018. "Understanding and misunderstanding randomized controlled trials," Social Science & Medicine, Elsevier, vol. 210(C), pages 2-21.
    7. Luke Keele & Corrine McConnaughy & Ismail White, 2012. "Strengthening the Experimenter’s Toolbox: Statistical Estimation of Internal Validity," American Journal of Political Science, John Wiley & Sons, vol. 56(2), pages 484-499, April.
    8. Alberto Abadie & Susan Athey & Guido W. Imbens & Jeffrey M. Wooldridge, 2020. "Sampling‐Based versus Design‐Based Uncertainty in Regression Analysis," Econometrica, Econometric Society, vol. 88(1), pages 265-296, January.
    9. Ben Weidmann & Luke Miratrix, 2021. "Lurking Inferential Monsters? Quantifying Selection Bias In Evaluations Of School Programs," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 40(3), pages 964-986, June.
    10. Jared Coopersmith & Thomas D. Cook & Jelena Zurovac & Duncan Chaplin & Lauren V. Forrow, 2022. "Internal And External Validity Of The Comparative Interrupted Time‐Series Design: A Meta‐Analysis," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 41(1), pages 252-277, January.
    11. Yew-Kwang Ng, 2003. "From preference to happiness: Towards a more complete welfare economics," Social Choice and Welfare, Springer;The Society for Social Choice and Welfare, vol. 20(2), pages 307-350, March.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Sam Sims & Harry Fletcher-Wood & Thomas Godfrey-Faussett & Peps Mccrea & Stefanie Meliss, 2023. "Modelling evidence-based practice in initial teacher training: causal effects on teachers' skills, knowledge and self-efficacy," CEPEO Working Paper Series 23-09, UCL Centre for Education Policy and Equalising Opportunities, revised Aug 2023.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Denis Fougère & Nicolas Jacquemet, 2020. "Policy Evaluation Using Causal Inference Methods," SciencePo Working papers Main hal-03455978, HAL.
    2. Perrotta, Manuela & Geampana, Alina, 2020. "The trouble with IVF and randomised control trials: Professional legitimation narratives on time-lapse imaging and evidence-informed care," Social Science & Medicine, Elsevier, vol. 258(C).
    3. Christopher J. Ruhm, 2019. "Shackling the Identification Police?," Southern Economic Journal, John Wiley & Sons, vol. 85(4), pages 1016-1026, April.
    4. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
    5. Praveen Ranjan Srivastava & Prajwal Eachempati & Ajay Kumar & Ashish Kumar Jha & Lalitha Dhamotharan, 2023. "Best strategy to win a match: an analytical approach using hybrid machine learning-clustering-association rule framework," Annals of Operations Research, Springer, vol. 325(1), pages 319-361, June.
    6. Martin, Will, 2021. "Tools for measuring the full impacts of agricultural interventions," IFPRI-MCC technical papers 2, International Food Policy Research Institute (IFPRI).
    7. Alberini, Anna & Bezhanishvili, Levan & Ščasný, Milan, 2022. "“Wild” tariff schemes: Evidence from the Republic of Georgia," Energy Economics, Elsevier, vol. 110(C).
    8. Kabeer, Naila, 2020. "‘Misbehaving’ RCTs: The confounding problem of human agency," World Development, Elsevier, vol. 127(C).
    9. Bellés Obrero, Cristina & Lombardi, María, 2019. "Teacher Performance Pay and Student Learning: Evidence from a Nationwide Program in Peru," IZA Discussion Papers 12600, Institute of Labor Economics (IZA).
    10. Sophie van Huellen & Duo Qin, 2019. "Compulsory Schooling and Returns to Education: A Re-Examination," Econometrics, MDPI, vol. 7(3), pages 1-20, September.
    11. Gabriel Leite Mota, 2022. "Unsatisfying ordinalism: The breach through which happiness (re)entered economics," Regional Science Policy & Practice, Wiley Blackwell, vol. 14(3), pages 513-528, June.
    12. Bellocca, Gian Pietro Enzo & Alessi, Lucia & Poncela Blanco, Maria Pilar & Ruiz Ortega, Esther, 2023. "Effects of extreme temperature on the European equity market," DES - Working Papers. Statistics and Econometrics. WS 37973, Universidad Carlos III de Madrid. Departamento de Estadística.
    13. Sloczynski, Tymon, 2020. "Interpreting OLS Estimands When Treatment Effects Are Heterogeneous: Smaller Groups Get Larger Weights," IZA Discussion Papers 13283, Institute of Labor Economics (IZA).
    14. Marchionni, Caterina & Reijula, Samuli, 2018. "What is mechanistic evidence, and why do we need it for evidence-based policy?," SocArXiv 4ufbm, Center for Open Science.
    15. Alex Hollingsworth & Mike Huang & Ivan J. Rudik & Nicholas J. Sanders, 2020. "A Thousand Cuts: Cumulative Lead Exposure Reduces Academic Achievement," NBER Working Papers 28250, National Bureau of Economic Research, Inc.
    16. Ingebjørg Kristoffersen, 2010. "The Metrics of Subjective Wellbeing: Cardinality, Neutrality and Additivity," The Economic Record, The Economic Society of Australia, vol. 86(272), pages 98-123, March.
    17. Benno Torgler & Sascha L. Schmidt & Bruno S. Frey, 2006. "The Power of Positional Concerns: A Panel Analysis," CREMA Working Paper Series 2006-19, Center for Research in Economics, Management and the Arts (CREMA).
    18. Magne Mogstad & Joseph P. Romano & Azeem Shaikh & Daniel Wilhelm, 2020. "Inference for Ranks with Applications to Mobility across Neighborhoods and Academic Achievement across Countries," NBER Working Papers 26883, National Bureau of Economic Research, Inc.
    19. Ashesh Rambachan & Jonathan Roth, 2020. "Design-Based Uncertainty for Quasi-Experiments," Papers 2008.00602, arXiv.org, revised Feb 2024.
    20. Bayer, Patrick & Kennedy, Ryan & Yang, Joonseok & Urpelainen, Johannes, 2020. "The need for impact evaluation in electricity access research," Energy Policy, Elsevier, vol. 137(C).

    More about this item

    Keywords

    randomized controlled trials; education; research; experiments; policy

    JEL classification:

    • I20 - Health, Education, and Welfare - - Education - - - General
    • I21 - Health, Education, and Welfare - - Education - - - Analysis of Education
    • C90 - Mathematical and Quantitative Methods - - Design of Experiments - - - General
    • C93 - Mathematical and Quantitative Methods - - Design of Experiments - - - Field Experiments


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:ucl:cepeow:23-07. See general information about how to correct material in RePEc.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Jake Anders (email available below). General contact details of provider: https://edirc.repec.org/data/epucluk.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.