Printed from https://ideas.repec.org/a/oup/biomet/v93y2006i3p509-524.html

False discovery control with p-value weighting

Authors

Listed:
  • Christopher R. Genovese
  • Kathryn Roeder
  • Larry Wasserman

Abstract

We present a method for multiple hypothesis testing that maintains control of the false discovery rate while incorporating prior information about the hypotheses. The prior information takes the form of p-value weights. If the assignment of weights is positively associated with the null hypotheses being false, the procedure improves power, except in cases where power is already near one. Even if the assignment of weights is poor, power is only reduced slightly, as long as the weights are not too large. We also provide a similar method for controlling false discovery exceedance. Copyright 2006, Oxford University Press.
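The weighting idea in the abstract can be illustrated with the standard weighted Benjamini–Hochberg recipe: divide each p-value by its weight (weights nonnegative and averaging one, so unit weights recover plain BH) and run the usual step-up procedure on the result. The sketch below is illustrative, not the paper's exact algorithm; the function name and the example data are invented for demonstration.

```python
import numpy as np

def weighted_bh(pvals, weights, alpha=0.05):
    """Weighted Benjamini-Hochberg step-up procedure (sketch).

    Applies the standard BH procedure to the weighted p-values
    p_i / w_i, where the weights w_i are positive and average to 1
    (so that unit weights reduce to plain BH).
    Returns a boolean array: True means reject the i-th null hypothesis.
    """
    pvals = np.asarray(pvals, dtype=float)
    weights = np.asarray(weights, dtype=float)
    m = pvals.size
    q = pvals / weights                     # weighted p-values
    order = np.argsort(q)
    q_sorted = q[order]
    # find the largest k (1-based) with q_(k) <= k * alpha / m
    thresholds = alpha * np.arange(1, m + 1) / m
    below = q_sorted <= thresholds
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])    # 0-based index of last pass
        reject[order[: k + 1]] = True
    return reject
```

With well-chosen weights (larger on hypotheses likely to be false), more hypotheses can clear the step-up thresholds than under unweighted BH, which is the power gain the abstract describes; mild misweighting costs little because the thresholds only shift by the (bounded) weights.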

Suggested Citation

  • Christopher R. Genovese & Kathryn Roeder & Larry Wasserman, 2006. "False discovery control with p-value weighting," Biometrika, Biometrika Trust, vol. 93(3), pages 509-524, September.
  • Handle: RePEc:oup:biomet:v:93:y:2006:i:3:p:509-524

    Download full text from publisher

    File URL: http://hdl.handle.net/10.1093/biomet/93.3.509
    Download Restriction: Access to full text is restricted to subscribers.

    As the access to this document is restricted, you may want to search for a different version of it.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Tommaso Proietti, 2016. "On the Selection of Common Factors for Macroeconomic Forecasting," Advances in Econometrics, in: Dynamic Factor Models, volume 35, pages 593-628, Emerald Group Publishing Limited.
    2. Djalel-Eddine Meskaldji & Dimitri Van De Ville & Jean-Philippe Thiran & Stephan Morgenthaler, 2020. "A comprehensive error rate for multiple testing," Statistical Papers, Springer, vol. 61(5), pages 1859-1874, October.
    3. Otília Menyhart & Boglárka Weltz & Balázs Győrffy, 2021. "MultipleTesting.com: A tool for life science researchers for multiple hypothesis testing correction," PLOS ONE, Public Library of Science, vol. 16(6), pages 1-12, June.
    4. Kang Guolian & Ye Keying & Liu Nianjun & Allison David B. & Gao Guimin, 2009. "Weighted Multiple Hypothesis Testing Procedures," Statistical Applications in Genetics and Molecular Biology, De Gruyter, vol. 8(1), pages 1-22, April.
    5. Yoav Benjamini & Ruth Heller & Abba Krieger & Saharon Rosset, 2023. "Discussion on “Optimal test procedures for multiple hypotheses controlling the familywise expected loss” by Willi Maurer, Frank Bretz, and Xiaolei Xun," Biometrics, The International Biometric Society, vol. 79(4), pages 2794-2797, December.
    6. A. Farcomeni & L. Finos, 2013. "FDR Control with Pseudo-Gatekeeping Based on a Possibly Data Driven Order of the Hypotheses," Biometrics, The International Biometric Society, vol. 69(3), pages 606-613, September.
    7. Lin, Wan-Yu & Lee, Wen-Chung, 2011. "Floating prioritized subset analysis: A powerful method to detect differentially expressed genes," Computational Statistics & Data Analysis, Elsevier, vol. 55(1), pages 903-913, January.
    8. Wenguang Sun & T. Tony Cai, 2009. "Large‐scale multiple testing under dependence," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 71(2), pages 393-424, April.
    9. He, Li & Sarkar, Sanat K. & Zhao, Zhigen, 2015. "Capturing the severity of type II errors in high-dimensional multiple testing," Journal of Multivariate Analysis, Elsevier, vol. 142(C), pages 106-116.
    10. Nikolaos Ignatiadis & Wolfgang Huber, 2021. "Covariate powered cross‐weighted multiple testing," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 83(4), pages 720-751, September.
    11. Habiger, Joshua D. & Peña, Edsel A., 2014. "Compound p-value statistics for multiple testing procedures," Journal of Multivariate Analysis, Elsevier, vol. 126(C), pages 153-166.
    12. Andrew Y. Chen, 2022. "Most claimed statistical findings in cross-sectional return predictability are likely true," Papers 2206.15365, arXiv.org, revised Mar 2024.
    13. Ang Li & Rina Foygel Barber, 2017. "Accumulation Tests for FDR Control in Ordered Hypothesis Testing," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 112(518), pages 837-849, April.
    14. Michael A. Langston & Robert S. Levine & Barbara J. Kilbourne & Gary L. Rogers & Anne D. Kershenbaum & Suzanne H. Baktash & Steven S. Coughlin & Arnold M. Saxton & Vincent K. Agboto & Darryl B. Hood &, 2014. "Scalable Combinatorial Tools for Health Disparities Research," IJERPH, MDPI, vol. 11(10), pages 1-25, October.
    15. Edsel Peña & Joshua Habiger & Wensong Wu, 2015. "Classes of multiple decision functions strongly controlling FWER and FDR," Metrika: International Journal for Theoretical and Applied Statistics, Springer, vol. 78(5), pages 563-595, July.
    16. Zhao, Haibing & Fung, Wing Kam, 2016. "A powerful FDR control procedure for multiple hypotheses," Computational Statistics & Data Analysis, Elsevier, vol. 98(C), pages 60-70.
    17. Andrew Y. Chen & Tom Zimmermann, 2022. "Publication Bias in Asset Pricing Research," Papers 2209.13623, arXiv.org, revised Sep 2023.
    18. Zhao, Haibing, 2014. "Adaptive FWER control procedure for grouped hypotheses," Statistics & Probability Letters, Elsevier, vol. 95(C), pages 63-70.
    19. Haibing Zhao & Wing Kam Fung, 2018. "Controlling mixed directional false discovery rate in multidimensional decisions with applications to microarray studies," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 27(2), pages 316-337, June.
    20. Anat Reiner-Benaim, 2016. "Scan Statistic Tail Probability Assessment Based on Process Covariance and Window Size," Methodology and Computing in Applied Probability, Springer, vol. 18(3), pages 717-745, September.
    21. Yoav Benjamini, 2010. "Discovering the false discovery rate," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 72(4), pages 405-416, September.
    22. Clara Bicalho & Adam Bouyamourn & Thad Dunning, 2022. "Conditional Balance Tests: Increasing Sensitivity and Specificity With Prognostic Covariates," Papers 2205.10478, arXiv.org.
    23. Remo Monti & Pia Rautenstrauch & Mahsa Ghanbari & Alva Rani James & Matthias Kirchler & Uwe Ohler & Stefan Konigorski & Christoph Lippert, 2022. "Identifying interpretable gene-biomarker associations with functionally informed kernel-based tests in 190,000 exomes," Nature Communications, Nature, vol. 13(1), pages 1-16, December.



    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.