Printed from https://ideas.repec.org/p/bri/cmpowp/15-337.html

“Powered to Detect Small Effect Sizes”: You keep saying that. I do not think it means what you think it means

Author

Listed:
  • Michael Sanders
  • Aisling Ní Chonaire

Abstract

Randomised trials in education research are a valuable and increasingly common part of the research landscape. Choosing a sample size large enough to detect an effect, yet small enough to keep the trial workable, is a vital component of trial design. In the absence of a crystal ball, researchers often rely on rules of thumb. In this paper, we criticise commonly used rules of thumb and show that the effect sizes that can realistically be expected in education research are much more modest than those studies are powered to detect. This has important implications for future trials, which should arguably be larger, and for the interpretation of prior, underpowered research.
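The trade-off the abstract describes can be made concrete with the standard two-sample power calculation. The sketch below is illustrative only (it is not taken from the paper): it uses the normal approximation for the per-arm sample size needed to detect a standardised effect size d at a given significance level and power, and the function name and defaults are assumptions.

```python
from math import ceil
from statistics import NormalDist

def required_n_per_arm(effect_size, alpha=0.05, power=0.80):
    """Approximate sample size per arm for a two-arm trial
    (two-sided test, normal approximation), for a standardised
    effect size d = effect_size."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # critical value, two-sided
    z_power = NormalDist().inv_cdf(power)          # quantile for target power
    return ceil(2 * ((z_alpha + z_power) / effect_size) ** 2)

# A "small" effect of d = 0.2 already needs roughly 393 participants
# per arm; halving the detectable effect roughly quadruples that.
print(required_n_per_arm(0.2))  # 393
print(required_n_per_arm(0.1))  # 1570
```

The quadratic dependence on 1/d is the crux of the paper's argument: if realistic effects are much smaller than the ones trials are powered for, the required samples grow very quickly.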

Suggested Citation

  • Michael Sanders & Aisling Ní Chonaire, 2015. "“Powered to Detect Small Effect Sizes”: You keep saying that. I do not think it means what you think it means," The Centre for Market and Public Organisation 15/337, The Centre for Market and Public Organisation, University of Bristol, UK.
  • Handle: RePEc:bri:cmpowp:15/337

    Download full text from publisher

    File URL: http://www.bristol.ac.uk/media-library/sites/cmpo/documents/WP15337_Web_Version.pdf
    Download Restriction: no
<---">

    References listed on IDEAS

    1. John P A Ioannidis, 2005. "Why Most Published Research Findings Are False," PLOS Medicine, Public Library of Science, vol. 2(8), pages 1-1, August.
    2. Belot, Michèle & James, Jonathan, 2016. "Partner selection into policy relevant field experiments," Journal of Economic Behavior & Organization, Elsevier, vol. 123(C), pages 31-56.
    3. Belot, Michèle & James, Jonathan, 2014. "A new perspective on the issue of selection bias in randomized controlled field experiments," Economics Letters, Elsevier, vol. 124(3), pages 326-328.
    4. John A. List, 2011. "Why Economists Should Conduct Field Experiments and 14 Tips for Pulling One Off," Journal of Economic Perspectives, American Economic Association, vol. 25(3), pages 3-16, Summer.
    5. repec:feb:artefa:0110 is not listed on IDEAS
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Cotofan, Maria, 2021. "Learning from praise: Evidence from a field experiment with teachers," Journal of Public Economics, Elsevier, vol. 204(C).
    2. van Lent, Max & Souverijn, Michiel, 2020. "Goal setting and raising the bar: A field experiment," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 87(C).
    3. Max van Lent & Michiel Souverijn, 2017. "Goal Setting and Raising the Bar: A Field Experiment," Tinbergen Institute Discussion Papers 17-001/VII, Tinbergen Institute.
    4. Maria Cotofan, 2019. "Learning from Praise: Evidence from a Field Experiment with Teachers," Tinbergen Institute Discussion Papers 19-082/V, Tinbergen Institute.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    2. Belot, Michèle & James, Jonathan, 2016. "Partner selection into policy relevant field experiments," Journal of Economic Behavior & Organization, Elsevier, vol. 123(C), pages 31-56.
    3. Goda, Gopi Shah & Manchester, Colleen Flaherty & Sojourner, Aaron J., 2014. "What will my account really be worth? Experimental evidence on how retirement income projections affect saving," Journal of Public Economics, Elsevier, vol. 119(C), pages 80-92.
    4. Belot, Michèle & James, Jonathan, 2014. "A new perspective on the issue of selection bias in randomized controlled field experiments," Economics Letters, Elsevier, vol. 124(3), pages 326-328.
    5. Guido Friebel & Matthias Heinz & Miriam Krueger & Nikolay Zubanov, 2017. "Team Incentives and Performance: Evidence from a Retail Chain," American Economic Review, American Economic Association, vol. 107(8), pages 2168-2203, August.
    6. Alexander Frankel & Maximilian Kasy, 2022. "Which Findings Should Be Published?," American Economic Journal: Microeconomics, American Economic Association, vol. 14(1), pages 1-38, February.
    7. Jyotirmoy Sarkar, 2018. "Will P-Value Triumph over Abuses and Attacks?," Biostatistics and Biometrics Open Access Journal, Juniper Publishers Inc., vol. 7(4), pages 66-71, July.
    8. Pedro Carneiro & Sokbae Lee & Daniel Wilhelm, 2020. "Optimal data collection for randomized control trials [Microcredit impacts: Evidence from a randomized microcredit program placement experiment by Compartamos Banco]," The Econometrics Journal, Royal Economic Society, vol. 23(1), pages 1-31.
    9. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
    10. Naphtal Habiyaremye & Nadhem Mtimet & Emily A. Ouma & Gideon A. Obare, 2023. "Consumers' willingness to pay for safe and quality milk: Evidence from experimental auctions in Rwanda," Agribusiness, John Wiley & Sons, Ltd., vol. 39(4), pages 1049-1074, October.
    11. Stanley, T. D. & Doucouliagos, Chris, 2019. "Practical Significance, Meta-Analysis and the Credibility of Economics," IZA Discussion Papers 12458, Institute of Labor Economics (IZA).
    12. Karin Langenkamp & Bodo Rödel & Kerstin Taufenbach & Meike Weiland, 2018. "Open Access in Vocational Education and Training Research," Publications, MDPI, vol. 6(3), pages 1-12, July.
    13. Kevin J. Boyle & Mark Morrison & Darla Hatton MacDonald & Roderick Duncan & John Rose, 2016. "Investigating Internet and Mail Implementation of Stated-Preference Surveys While Controlling for Differences in Sample Frames," Environmental & Resource Economics, Springer;European Association of Environmental and Resource Economists, vol. 64(3), pages 401-419, July.
    14. Jelte M Wicherts & Marjan Bakker & Dylan Molenaar, 2011. "Willingness to Share Research Data Is Related to the Strength of the Evidence and the Quality of Reporting of Statistical Results," PLOS ONE, Public Library of Science, vol. 6(11), pages 1-7, November.
    15. Valentine, Kathrene D & Buchanan, Erin Michelle & Scofield, John E. & Beauchamp, Marshall T., 2017. "Beyond p-values: Utilizing Multiple Estimates to Evaluate Evidence," OSF Preprints 9hp7y, Center for Open Science.
    16. Anton, Roman, 2014. "Sustainable Intrapreneurship - The GSI Concept and Strategy - Unfolding Competitive Advantage via Fair Entrepreneurship," MPRA Paper 69713, University Library of Munich, Germany, revised 01 Feb 2015.
    17. Dudek, Thomas & Brenøe, Anne Ardila & Feld, Jan & Rohrer, Julia, 2022. "No Evidence That Siblings' Gender Affects Personality across Nine Countries," IZA Discussion Papers 15137, Institute of Labor Economics (IZA).
    18. Omar Al-Ubaydli & John List, 2016. "Field Experiments in Markets," Artefactual Field Experiments j0002, The Field Experiments Website.
    19. Uwe Hassler & Marc‐Oliver Pohle, 2022. "Unlucky Number 13? Manipulating Evidence Subject to Snooping," International Statistical Review, International Statistical Institute, vol. 90(2), pages 397-410, August.
    20. Frederique Bordignon, 2020. "Self-correction of science: a comparative study of negative citations and post-publication peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 124(2), pages 1225-1239, August.



    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.