IDEAS home Printed from https://ideas.repec.org/a/plo/pone00/0236079.html

Recommendations in pre-registrations and internal review board proposals promote formal power analyses but do not increase sample size

Authors
  • Marjan Bakker
  • Coosje L S Veldkamp
  • Olmo R van den Akker
  • Marcel A L M van Assen
  • Elise Crompvoets
  • How Hwee Ong
  • Jelte M Wicherts

Abstract

In this preregistered study, we investigated whether the statistical power of a study is higher when researchers are asked to perform a formal power analysis before collecting data. We compared the sample size descriptions from two sources: (i) a sample of pre-registrations created according to the guidelines of the Center for Open Science Preregistration Challenge (PCRs) and a sample of institutional review board (IRB) proposals from the Tilburg School of Social and Behavioral Sciences, both of which include a recommendation to do a formal power analysis, and (ii) a sample of pre-registrations created according to the guidelines for Open Science Framework Standard Pre-Data Collection Registrations (SPRs), in which no guidance on sample size planning is given. We found that the PCRs and IRB proposals (72%) more often included sample size decisions based on power analyses than did the SPRs (45%). However, this did not result in larger planned sample sizes. The determined sample size of the PCRs and IRB proposals (Md = 90.50) was not higher than the determined sample size of the SPRs (Md = 126.00; W = 3389.5, p = 0.936). Typically, power analyses in the registrations were conducted with G*Power, assuming a medium effect size, α = .05, and a power of .80. Only 20% of the power analyses contained enough information to fully reproduce the results, and only 62% of these power analyses pertained to the main hypothesis test in the pre-registration. We therefore see ample room for improvement in the quality of the registrations and offer several recommendations to that end.
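The "typical" power analysis the abstract describes (a medium effect size, α = .05, power = .80) can be reproduced without G*Power. The sketch below uses Python's statsmodels instead of G*Power and assumes the common case of a two-sided independent-samples t-test with Cohen's d = 0.5 as the medium effect; it is an illustration of the calculation, not part of the original study's materials.

```python
# Illustrative a priori power analysis, matching the parameters the abstract
# reports as typical: medium effect (assumed here to be Cohen's d = 0.5 for a
# two-sided independent-samples t-test), alpha = .05, desired power = .80.
# statsmodels stands in for G*Power and yields the same required sample size.
import math
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(
    effect_size=0.5,          # Cohen's d (medium, by convention)
    alpha=0.05,               # significance level
    power=0.80,               # desired power
    ratio=1.0,                # equal group sizes
    alternative="two-sided",
)
print(math.ceil(n_per_group))      # required participants per group
print(2 * math.ceil(n_per_group))  # total planned sample size
```

Reporting all four inputs (test, effect size, α, power) is exactly what makes a power analysis reproducible, which only 20% of the registrations in the study achieved.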

Suggested Citation

  • Marjan Bakker & Coosje L S Veldkamp & Olmo R van den Akker & Marcel A L M van Assen & Elise Crompvoets & How Hwee Ong & Jelte M Wicherts, 2020. "Recommendations in pre-registrations and internal review board proposals promote formal power analyses but do not increase sample size," PLOS ONE, Public Library of Science, vol. 15(7), pages 1-15, July.
  • Handle: RePEc:plo:pone00:0236079
    DOI: 10.1371/journal.pone.0236079

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0236079
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0236079&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0236079?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Daniele Fanelli, 2010. "“Positive” Results Increase Down the Hierarchy of the Sciences," PLOS ONE, Public Library of Science, vol. 5(4), pages 1-10, April.
    2. Colin F. Camerer & Anna Dreber & Felix Holzmeister & Teck-Hua Ho & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Gideon Nave & Brian A. Nosek & Thomas Pfeiffer & Adam Altmejd & Nick Buttrick , 2018. "Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015," Nature Human Behaviour, Nature, vol. 2(9), pages 637-644, September.
    3. Nosek, Brian A. & Ebersole, Charles R. & DeHaven, Alexander Carl & Mellor, David Thomas, 2018. "The Preregistration Revolution," OSF Preprints 2dxu5, Center for Open Science.

    Citations

Citations are extracted by the CitEc Project.


    Cited by:

    1. Giorgi, Francesca, 2021. "A new method to explore inferential risks associated with each study in a meta-analysis: An approach based on Design Analysis," Thesis Commons n5y8b, Center for Open Science.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    2. Shaw, Steven D. & Nave, Gideon, 2023. "Don't hate the player, hate the game: Realigning incentive structures to promote robust science and better scientific practices in marketing," Journal of Business Research, Elsevier, vol. 167(C).
    3. Chin, Jason & Zeiler, Kathryn, 2021. "Replicability in Empirical Legal Research," LawArXiv 2b5k4, Center for Open Science.
    4. Brinkerink, Jasper & De Massis, Alfredo & Kellermanns, Franz, 2022. "One finding is no finding: Toward a replication culture in family business research," Journal of Family Business Strategy, Elsevier, vol. 13(4).
    5. Tierney, Warren & Hardy, Jay H. & Ebersole, Charles R. & Leavitt, Keith & Viganola, Domenico & Clemente, Elena Giulia & Gordon, Michael & Dreber, Anna & Johannesson, Magnus & Pfeiffer, Thomas & Uhlman, 2020. "Creative destruction in science," Organizational Behavior and Human Decision Processes, Elsevier, vol. 161(C), pages 291-309.
6. Sébastien Duchêne & Adrien Nguyen-Huu & Dimitri Dubois & Marc Willinger, 2022. "Risk-return trade-offs in the context of environmental impact: a lab-in-the-field experiment with finance professionals," CEE-M Working Papers hal-03883121, CEE-M, University of Montpellier, CNRS, INRA, Montpellier SupAgro.
    7. Strømland, Eirik, 2019. "Preregistration and reproducibility," Journal of Economic Psychology, Elsevier, vol. 75(PA).
    8. Nosek, Brian A. & Errington, Timothy M., 2019. "What is replication?," MetaArXiv u4g6t, Center for Open Science.
    9. Cantone, Giulio Giacomo, 2023. "The multiversal methodology as a remedy of the replication crisis," MetaArXiv kuhmz, Center for Open Science.
    10. Logg, Jennifer M. & Dorison, Charles A., 2021. "Pre-registration: Weighing costs and benefits for researchers," Organizational Behavior and Human Decision Processes, Elsevier, vol. 167(C), pages 18-27.
    11. Adler, Susanne Jana & Röseler, Lukas & Schöniger, Martina Katharina, 2023. "A toolbox to evaluate the trustworthiness of published findings," Journal of Business Research, Elsevier, vol. 167(C).
    12. Alós-Ferrer, Carlos & Garagnani, Michele, 2020. "The cognitive foundations of cooperation," Journal of Economic Behavior & Organization, Elsevier, vol. 175(C), pages 71-85.
    13. Brice Corgnet & Cary Deck & Mark Desantis & Kyle Hampton & Erik O Kimbrough, 2019. "Reconsidering Rational Expectations and the Aggregation of Diverse Information in Laboratory Security Markets," Working Papers halshs-02146611, HAL.
    14. Fabo, Brian & Jančoková, Martina & Kempf, Elisabeth & Pástor, Ľuboš, 2024. "Fifty shades of QE: Robust evidence," Journal of Banking & Finance, Elsevier, vol. 159(C).
    15. Stanley, T. D. & Doucouliagos, Chris, 2019. "Practical Significance, Meta-Analysis and the Credibility of Economics," IZA Discussion Papers 12458, Institute of Labor Economics (IZA).
    16. Amanda Kvarven & Eirik Strømland & Conny Wollbrant & David Andersson & Magnus Johannesson & Gustav Tinghög & Daniel Västfjäll & Kristian Ove R. Myrseth, 2020. "The intuitive cooperation hypothesis revisited: a meta-analytic examination of effect size and between-study heterogeneity," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 6(1), pages 26-42, June.
    17. Vigren, Andreas & Pyddoke, Roger, 2020. "The impact on bus ridership of passenger incentive contracts in public transport," Transportation Research Part A: Policy and Practice, Elsevier, vol. 135(C), pages 144-159.
    18. Jasper Brinkerink, 2023. "When Shooting for the Stars Becomes Aiming for Asterisks: P-Hacking in Family Business Research," Entrepreneurship Theory and Practice, , vol. 47(2), pages 304-343, March.
    19. Laura Hueber & Rene Schwaiger, 2021. "Debiasing Through Experience Sampling: The Case of Myopic Loss Aversion," Working Papers 2021-01, Faculty of Economics and Statistics, Universität Innsbruck.
    20. Brice Corgnet & Cary Deck & Mark DeSantis & Kyle Hampton & Erik O. Kimbrough, 2023. "When Do Security Markets Aggregate Dispersed Information?," Management Science, INFORMS, vol. 69(6), pages 3697-3729, June.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0236079. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.