
Efficient computation of adjusted p-values for resampling-based stepdown multiple testing

Author

Listed:
  • Joseph P. Romano
  • Michael Wolf

Abstract

There has recently been interest in reporting p-values adjusted for the resampling-based stepdown multiple testing procedures proposed in Romano and Wolf (2005a,b). The original papers only describe how to carry out multiple testing at a fixed significance level; computing adjusted p-values instead, in an efficient manner, is not entirely trivial. This paper therefore fills an apparent gap by detailing such an algorithm.
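
As a concrete illustration (a sketch, not the authors' published algorithm or code), the following Python function computes stepdown-adjusted p-values in the spirit of Romano and Wolf (2005a,b). It assumes the user supplies a vector of observed (e.g. studentized) test statistics and a B x S matrix of statistics recomputed on B null resamples; the function name, variable names, and the (count + 1)/(B + 1) p-value convention are illustrative assumptions. The efficiency point is that all stepwise maxima over the remaining hypotheses can be obtained from a single backwards running-maximum pass over the sorted resampling matrix, after which monotonicity of the adjusted p-values is enforced.

    import numpy as np

    def stepdown_adjusted_pvalues(t_obs, t_boot):
        """Adjusted p-values for a resampling-based stepdown procedure (sketch).

        t_obs  : array of shape (S,) with observed test statistics; larger
                 values indicate stronger evidence against the null.
        t_boot : array of shape (B, S) with statistics recomputed on B
                 resamples, centered so each column satisfies its null.
        """
        t_obs = np.asarray(t_obs, dtype=float)
        t_boot = np.asarray(t_boot, dtype=float)
        B, S = t_boot.shape

        # Order hypotheses from most to least significant (stepdown order).
        order = np.argsort(t_obs)[::-1]
        boot_sorted = t_boot[:, order]

        # At step s we need, in every resample, the maximum over the
        # hypotheses not yet rejected (columns s, ..., S-1). A single
        # backwards running maximum delivers all S of these at once.
        step_max = np.maximum.accumulate(boot_sorted[:, ::-1], axis=1)[:, ::-1]

        # Raw p-value at each step, using the (count + 1) / (B + 1) convention.
        exceed = step_max >= t_obs[order][np.newaxis, :]
        p_raw = (exceed.sum(axis=0) + 1) / (B + 1)

        # Enforce monotonicity down the steps, then return to original order.
        p_sorted = np.maximum.accumulate(p_raw)
        p_adj = np.empty(S)
        p_adj[order] = p_sorted
        return p_adj

For example, with t_obs of length S and t_boot of shape (B, S), the call stepdown_adjusted_pvalues(t_obs, t_boot) returns S adjusted p-values in the original hypothesis order, at a cost of one pass over the B x S resampling matrix.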

Suggested Citation

  • Joseph P. Romano & Michael Wolf, 2016. "Efficient computation of adjusted p-values for resampling-based stepdown multiple testing," ECON - Working Papers 219, Department of Economics - University of Zurich.
  • Handle: RePEc:zur:econwp:219

    Download full text from publisher

    File URL: https://www.zora.uzh.ch/id/eprint/123047/1/econwp219.pdf
    Download Restriction: no


    References listed on IDEAS

    1. Joseph P. Romano & Michael Wolf, 2005. "Stepwise Multiple Testing as Formalized Data Snooping," Econometrica, Econometric Society, vol. 73(4), pages 1237-1282, July.
    2. James Heckman & Seong Hyeok Moon & Rodrigo Pinto & Peter Savelyev & Adam Yavitz, 2010. "Analyzing social experiments as implemented: A reexamination of the evidence from the HighScope Perry Preschool Program," Quantitative Economics, Econometric Society, vol. 1(1), pages 1-46, July.
    3. Romano, Joseph P. & Shaikh, Azeem M. & Wolf, Michael, 2008. "Formalized Data Snooping Based On Generalized Error Rates," Econometric Theory, Cambridge University Press, vol. 24(2), pages 404-447, April.
    4. Joseph P. Romano & Michael Wolf, 2005. "Exact and Approximate Stepdown Methods for Multiple Hypothesis Testing," Journal of the American Statistical Association, American Statistical Association, vol. 100, pages 94-108, March.
    5. James Heckman & Seong Hyeok Moon & Rodrigo Pinto & Peter Savelyev & Adam Yavitz, 2010. "Analyzing social experiments as implemented: evidence from the HighScope Perry Preschool Program," CeMMAP working papers CWP22/10, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
    6. Will Dobbie & Roland G. Fryer Jr., 2015. "The Medium-Term Impacts of High-Achieving Charter Schools," Journal of Political Economy, University of Chicago Press, vol. 123(5), pages 985-1037.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. John A. List & Azeem M. Shaikh & Yang Xu, 2019. "Multiple hypothesis testing in experimental economics," Experimental Economics, Springer;Economic Science Association, vol. 22(4), pages 773-793, December.
    2. John A. List & Azeem M. Shaikh & Atom Vayalinkal, 2023. "Multiple testing with covariate adjustment in experimental economics," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 38(6), pages 920-939, September.
    3. Sandner, Malte & Cornelissen, Thomas & Jungmann, Tanja & Herrmann, Peggy, 2018. "Evaluating the effects of a targeted home visiting program on maternal and child health outcomes," Journal of Health Economics, Elsevier, vol. 58(C), pages 269-283.
    4. Alison Andrew & Orazio Attanasio & Britta Augsburg & Lina Cardona-Sosa & Monimalika Day & Michele Giannola & Sally Grantham-McGregor & Pamela Jervis & Costas Meghir & Marta Rubio-Codina, 2024. "Early Childhood Intervention for the Poor: Long Term Outcomes," NBER Working Papers 32165, National Bureau of Economic Research, Inc.
    5. Orazio Attanasio & Sarah Cattan & Emla Fitzsimons & Costas Meghir & Marta Rubio-Codina, 2020. "Estimating the Production Function for Human Capital: Results from a Randomized Controlled Trial in Colombia," American Economic Review, American Economic Association, vol. 110(1), pages 48-85, January.
    6. Fabian Kosse & Thomas Deckers & Pia Pinger & Hannah Schildberg-Hörisch & Armin Falk, 2020. "The Formation of Prosociality: Causal Evidence on the Role of Social Environment," Journal of Political Economy, University of Chicago Press, vol. 128(2), pages 434-467.
    7. Orla Doyle & Nick Fitzpatrick & Judy Lovett & Caroline Rawdon, 2015. "Early intervention and child health: Evidence from a Dublin-based randomized controlled trial," Working Papers 201505, Geary Institute, University College Dublin.
    8. Berger, Eva M. & Fehr, Ernst & Hermes, Henning & Schunk, Daniel & Winkel, Kirsten, 2020. "The Impact of Working Memory Training on Children's Cognitive and Noncognitive Skills," IZA Discussion Papers 13338, Institute of Labor Economics (IZA).
    9. Doyle, Orla & Harmon, Colm & Heckman, James J. & Logue, Caitriona & Moon, Seong Hyeok, 2017. "Early skill formation and the efficiency of parental investment: A randomized controlled trial of home visiting," Labour Economics, Elsevier, vol. 45(C), pages 40-58.
    10. Orazio Attanasio & Sarah Cattan & Emla Fitzsimons & Costas Meghir & Marta Rubio-Codina, 2015. "Estimating the Production Function for Human Capital: Results from a Randomized Control Trial in Colombia," Cowles Foundation Discussion Papers 1987, Cowles Foundation for Research in Economics, Yale University.
    11. Doyle, O. & Harmon, C. & Heckman, J.J. & Logue, C. & Moon, S.H., 2013. "Measuring Investment in Human Capital Formation: An Experimental Analysis of Early Life Outcomes," Health, Econometrics and Data Group (HEDG) Working Papers 13/18, HEDG, c/o Department of Economics, University of York.
    12. Gabriella Conti & Christopher Hansman & James J. Heckman & Matthew F. X. Novak & Angela Ruggiero & Stephen J. Suomi, 2012. "Primate Evidence on the Late Health Effects of Early Life Adversity," Working Papers 2012-008, Human Capital and Economic Opportunity Working Group.
    13. Dan Wunderli, 2012. "Controlling the danger of false discoveries in estimating multiple treatment effects," ECON - Working Papers 060, Department of Economics - University of Zurich.
    14. Pedro Carneiro & Oswald Koussihouèdé & Nathalie Lahire & Costas Meghir & Corina Mommaerts, 2020. "School Grants and Education Quality: Experimental Evidence from Senegal," Economica, London School of Economics and Political Science, vol. 87(345), pages 28-51, January.
    15. Orla Doyle, 2017. "The First 2,000 Days and Child Skills: Evidence from a Randomized Experiment of Home Visiting," Working Papers 2017-054, Human Capital and Economic Opportunity Working Group.
    16. Cortés, Darwin & Maldonado, Darío & Gallego, Juan & Charpak, Nathalie & Tessier, Rejean & Ruiz, Juan Gabriel & Hernandez, José Tiberio & Uriza, Felipe & Pico, Julieth, 2022. "Comparing long-term educational effects of two early childhood health interventions," Journal of Health Economics, Elsevier, vol. 86(C).
    17. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    18. Peter A. Savelyev & Kegon T. K. Tan, 2019. "Socioemotional Skills, Education, and Health-Related Outcomes of High-Ability Individuals," American Journal of Health Economics, MIT Press, vol. 5(2), pages 250-280, Spring.
    19. Orla Doyle, 2020. "Can Early Intervention have a Sustained Effect on Human Capital?," Working Papers 202001, Geary Institute, University College Dublin.
    20. Hideo Akabayashi & Tim Ruberg & Chizuru Shikishima & Jun Yamashita, 2023. "Education-Oriented and Care-Oriented Preschools: Implications on Child Development," Keio-IES Discussion Paper Series 2023-009, Institute for Economics Studies, Keio University.

    More about this item

    Keywords

    Adjusted p-values; multiple testing; resampling; stepdown procedure;

    JEL classification:

    • C12 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Hypothesis Testing: General

    NEP fields

    This paper has been announced in the following NEP Reports:

    Statistics


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:zur:econwp:219. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Severin Oswald (email available below). General contact details of provider: https://edirc.repec.org/data/seizhch.html.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.