Printed from https://ideas.repec.org/p/osf/osfxxx/wr93f.html

Z-Curve

Author

Listed:
  • Schimmack, Ulrich
  • Brunner, Jerry

Abstract

In recent years, the replicability of original findings published in psychology journals has been questioned. A key concern is that selection for significance inflates observed effect sizes and observed power. If selection bias is severe, replication studies are unlikely to reproduce a significant result. We introduce z-curve as a new method that can estimate the average true power for sets of studies that are selected for significance. We compare this method with p-curve, which has been used for the same purpose. Simulation studies show that both methods perform well when all studies have the same power, but p-curve overestimates power if power varies across studies. Based on these findings, we recommend z-curve to estimate power for sets of studies that are heterogeneous and selected for significance. Application of z-curve to various datasets suggests that the average replicability of published results in psychology is approximately 50%, but there is substantial heterogeneity and many psychological studies remain underpowered and are likely to produce false negative results. To increase replicability and credibility of published results it is important to reduce selection bias and to increase statistical power.
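The full z-curve method fits a mixture model to the distribution of observed z-scores, which is beyond a short example. As a minimal illustration of the underlying idea only (not the authors' algorithm), the sketch below converts a two-sided p-value to an absolute z-score and computes the power of a two-sided z-test when the true effect is taken at a given z — the quantity z-curve averages across studies. Function names are hypothetical; using the observed z directly, as here, is exactly the naive, selection-biased estimate the paper warns against.

```python
from statistics import NormalDist

def z_from_p(p):
    """Convert a two-sided p-value to an absolute z-score."""
    return NormalDist().inv_cdf(1 - p / 2)

def power_given_z(z, alpha=0.05):
    """Power of a two-sided z-test when the true (noncentrality)
    parameter equals z: probability that a replication crosses
    the significance threshold in either direction."""
    nd = NormalDist()
    crit = nd.inv_cdf(1 - alpha / 2)
    return (1 - nd.cdf(crit - z)) + nd.cdf(-crit - z)

# A just-significant result (p = .05, z ~ 1.96) has only ~50% power
# if its observed effect were the true effect -- a coin flip in
# replication, matching the ~50% average replicability cited above.
print(round(power_given_z(z_from_p(0.05)), 2))
```

Because selection for significance truncates the z-distribution at the critical value, plugging observed z-scores in directly overestimates true power; z-curve's contribution is correcting for that truncation across a heterogeneous set of studies.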

Suggested Citation

  • Schimmack, Ulrich & Brunner, Jerry, 2017. "Z-Curve," OSF Preprints wr93f, Center for Open Science.
  • Handle: RePEc:osf:osfxxx:wr93f
    DOI: 10.31219/osf.io/wr93f

    Download full text from publisher

    File URL: https://osf.io/download/5a0e461cb83f69027512c849/
    Download Restriction: no

    File URL: https://libkey.io/10.31219/osf.io/wr93f?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Hoenig J. M. & Heisey D. M., 2001. "The Abuse of Power: The Pervasive Fallacy of Power Calculations for Data Analysis," The American Statistician, American Statistical Association, vol. 55, pages 19-24, February.

    Most related items

    These are the items that most often cite the same works as this one, and that are cited by the same works that cite this one.
    1. Tom Coupé & W. Robert Reed, 2021. "Do Negative Replications Affect Citations?," Working Papers in Economics 21/14, University of Canterbury, Department of Economics and Finance.
    2. Weili Ding, 2020. "Laboratory experiments can pre-design to address power and selection issues," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 6(2), pages 125-138, December.
    3. Jiarui Tian, 2021. "A Replication of “The effect of the conservation reserve program on rural economies: Deriving a statistical verdict from a null finding” (American Journal of Agricultural Economics, 2019)," Working Papers in Economics 21/12, University of Canterbury, Department of Economics and Finance.
    4. Jason W. Beckstead, 2007. "A note on determining the number of cues used in judgment analysis studies: The issue of type II error," Judgment and Decision Making, Society for Judgment and Decision Making, vol. 2, pages 317-325, October.
    5. Kimball Chapman & Michael Drake & Joseph H. Schroeder & Timothy Seidel, 2023. "Earnings announcement delays and implications for the auditor-client relationship," Review of Accounting Studies, Springer, vol. 28(1), pages 45-90, March.
    6. Jeffrey C. Valentine & Therese D. Pigott & Hannah R. Rothstein, 2010. "How Many Studies Do You Need?," Journal of Educational and Behavioral Statistics, , vol. 35(2), pages 215-247, April.
    7. Nancy Elizabeth Doyle & Almuth McDowall & Raymond Randall & Kate Knight, 2022. "Does it work? Using a Meta-Impact score to examine global effects in quasi-experimental intervention studies," PLOS ONE, Public Library of Science, vol. 17(3), pages 1-21, March.
    8. van Koten, Silvester, 2021. "The forward premium in electricity markets: An experimental study," Energy Economics, Elsevier, vol. 94(C).
    9. Markku Maula & Wouter Stam, 2020. "Enhancing Rigor in Quantitative Entrepreneurship Research," Entrepreneurship Theory and Practice, , vol. 44(6), pages 1059-1090, November.
    10. Irina Surdu & Kamel Mellahi & Keith Glaister, 2017. "Once bitten, not necessarily shy? Organisational learning prior experience effects on foreign market re-entry commitment decisions," John H Dunning Centre for International Business Discussion Papers jhd-dp2017-04, Henley Business School, University of Reading.
    11. Kathryn N. Vasilaky & J. Michelle Brock, 2020. "Power(ful) guidelines for experimental economists," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 6(2), pages 189-212, December.
    12. Tian Jiarui (Alex), 2023. "A Replication of “The Effect of the Conservation Reserve Program on Rural Economies: Deriving a Statistical Verdict from a Null Finding” (American Journal of Agricultural Economics, 2019)," Economics - The Open-Access, Open-Assessment Journal, De Gruyter, vol. 17(1), pages 1-7, January.
    14. Irina Surdu & Kamel Mellahi & Keith W Glaister, 2019. "Once bitten, not necessarily shy? Determinants of foreign market re-entry commitment strategies," Journal of International Business Studies, Palgrave Macmillan;Academy of International Business, vol. 50(3), pages 393-422, April.
    15. Cleary, Rebecca & Liu, Yizao & Carlson, Andrea C., 2022. "Differences in the Distribution of Nutrition Between Households Above and Below Poverty," 2022 Annual Meeting, July 31-August 2, Anaheim, California 322267, Agricultural and Applied Economics Association.
    16. Michal Ovádek, 2019. "The apolitical lawyer: experimental evidence of a framing effect," European Journal of Law and Economics, Springer, vol. 48(3), pages 385-415, December.
    17. Johanna Catherine Maclean & John Buckell, 2021. "Information and sin goods: Experimental evidence on cigarettes," Health Economics, John Wiley & Sons, Ltd., vol. 30(2), pages 289-310, February.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:osf:osfxxx:wr93f. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: OSF (email available below). General contact details of provider: https://osf.io/preprints/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.