
Multiple testing with covariate adjustment in experimental economics

Author

Listed:
  • John A. List
  • Azeem M. Shaikh
  • Atom Vayalinkal

Abstract

This paper provides a framework for testing multiple null hypotheses simultaneously using experimental data in which simple random sampling is used to assign treatment status to units. Using general results from the multiple testing literature, we develop, under weak assumptions, a procedure that (i) asymptotically controls the familywise error rate—the probability of one or more false rejections—and (ii) is asymptotically balanced in that the marginal probability of rejecting any true null hypothesis is approximately equal in large samples. Our procedure improves upon classical methods by incorporating information about the joint dependence structure of the test statistics when determining which null hypotheses to reject, leading to gains in power. An important point of departure from prior work is that we exploit observed, baseline covariates to obtain further gains in power. The precise way in which we incorporate these covariates is based on recent results from the statistics literature in order to ensure that inferences are typically more powerful in large samples.
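The two ingredients the abstract describes can be illustrated with a minimal sketch: a linear covariate adjustment of per-outcome treatment-effect t-statistics, followed by a step-down max-|t| rule in the spirit of Romano and Wolf (2005), which controls the familywise error rate while exploiting the joint (here, bootstrap) distribution of the statistics. This is an assumption-laden stand-in, not the paper's exact procedure; the function names `covariate_adjusted_stats` and `stepdown_maxT` are hypothetical.

```python
import numpy as np

def covariate_adjusted_stats(y, d, x):
    """t-statistics for the treatment coefficient in an OLS regression of
    each outcome on treatment and baseline covariates (illustrative linear
    adjustment; not the paper's exact estimator).

    y : (n, S) outcomes, d : (n,) 0/1 treatment, x : (n, p) covariates."""
    n = d.shape[0]
    Z = np.column_stack([np.ones(n), d, x])   # intercept, treatment, covariates
    k = Z.shape[1]
    ZtZ_inv = np.linalg.inv(Z.T @ Z)
    stats = []
    for yk in y.T:
        beta = ZtZ_inv @ (Z.T @ yk)           # OLS coefficients
        resid = yk - Z @ beta
        sigma2 = resid @ resid / (n - k)      # homoskedastic error variance
        stats.append(beta[1] / np.sqrt(sigma2 * ZtZ_inv[1, 1]))
    return np.array(stats)

def stepdown_maxT(t_obs, t_boot, alpha=0.05):
    """Step-down max-|t| procedure: repeatedly reject every hypothesis whose
    |t| exceeds the 1 - alpha quantile of the bootstrap max-|t| over the
    hypotheses still in play. Using the joint distribution of the statistics
    is what yields the power gains over, e.g., a Bonferroni correction.

    t_obs : (S,) observed statistics, t_boot : (B, S) bootstrap draws."""
    t_obs = np.abs(np.asarray(t_obs, dtype=float))
    t_boot = np.abs(np.asarray(t_boot, dtype=float))
    reject = np.zeros(t_obs.shape[0], dtype=bool)
    while not reject.all():
        # critical value from the max over the not-yet-rejected hypotheses
        crit = np.quantile(t_boot[:, ~reject].max(axis=1), 1 - alpha)
        newly = ~reject & (t_obs > crit)
        if not newly.any():
            break                              # no further rejections: stop
        reject |= newly                        # step down and recompute
    return reject
```

In the step-down loop, each round of rejections shrinks the set over which the max is taken, so the critical value weakly decreases and further rejections may follow; this is the source of the power gain over single-step methods.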

Suggested Citation

  • John A. List & Azeem M. Shaikh & Atom Vayalinkal, 2023. "Multiple testing with covariate adjustment in experimental economics," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 38(6), pages 920-939, September.
  • Handle: RePEc:wly:japmet:v:38:y:2023:i:6:p:920-939
    DOI: 10.1002/jae.2985

    Download full text from publisher

    File URL: https://doi.org/10.1002/jae.2985
    Download Restriction: no

    File URL: https://libkey.io/10.1002/jae.2985?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.


    References listed on IDEAS

    1. Joseph P. Romano & Michael Wolf, 2005. "Stepwise Multiple Testing as Formalized Data Snooping," Econometrica, Econometric Society, vol. 73(4), pages 1237-1282, July.
    2. Federico A. Bugni & Ivan A. Canay & Azeem M. Shaikh, 2018. "Inference Under Covariate-Adaptive Randomization," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 113(524), pages 1784-1796, October.
    3. Yang, L. & Tsiatis, A. A., 2001. "Efficiency Study of Estimators for a Treatment Effect in a Pretest-Posttest Trial," The American Statistician, American Statistical Association, vol. 55, pages 314-321, November.
    4. Rodrigo Pinto & Azeem Shaikh & Adam Yavitz & James Heckman, 2010. "Inference with Imperfect Randomization: The Case of the Perry Preschool Program," 2010 Meeting Papers 1336, Society for Economic Dynamics.
    5. Dean Karlan & John A. List, 2007. "Does Price Matter in Charitable Giving? Evidence from a Large-Scale Natural Field Experiment," American Economic Review, American Economic Association, vol. 97(5), pages 1774-1793, December.
    6. Luigi Butera & Philip Grossman & Daniel Houser & John List & Marie-Claire Villeval, 2020. "A New Mechanism to Alleviate the Crises of Confidence in Science - With an Application to the Public Goods Game," Artefactual Field Experiments 00684, The Field Experiments Website.
    7. Joseph Romano & Azeem Shaikh & Michael Wolf, 2008. "Control of the false discovery rate under dependence using the bootstrap and subsampling," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 17(3), pages 417-442, November.
    8. Zacharias Maniadis & Fabio Tufano & John A. List, 2014. "One Swallow Doesn't Make a Summer: Reply to Kataria," Econ Journal Watch, Econ Journal Watch, vol. 11(1), pages 11-16, January.
    9. James Heckman & Seong Hyeok Moon & Rodrigo Pinto & Peter Savelyev & Adam Yavitz, 2010. "Analyzing social experiments as implemented: A reexamination of the evidence from the HighScope Perry Preschool Program," Quantitative Economics, Econometric Society, vol. 1(1), pages 1-46, July.
    10. James Heckman & Rodrigo Pinto & Peter Savelyev, 2013. "Understanding the Mechanisms through Which an Influential Early Childhood Program Boosted Adult Outcomes," American Economic Review, American Economic Association, vol. 103(6), pages 2052-2086, October.
    11. Romano, Joseph P. & Shaikh, Azeem M. & Wolf, Michael, 2008. "Formalized Data Snooping Based On Generalized Error Rates," Econometric Theory, Cambridge University Press, vol. 24(2), pages 404-447, April.
    12. Federico A. Bugni & Ivan A. Canay & Azeem M. Shaikh, 2019. "Inference under covariate‐adaptive randomization with multiple treatments," Quantitative Economics, Econometric Society, vol. 10(4), pages 1747-1785, November.
    13. Joseph Romano & Azeem Shaikh & Michael Wolf, 2008. "Rejoinder on: Control of the false discovery rate under dependence using the bootstrap and subsampling," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 17(3), pages 461-471, November.
    14. John A. List & Azeem M. Shaikh & Yang Xu, 2019. "Multiple hypothesis testing in experimental economics," Experimental Economics, Springer;Economic Science Association, vol. 22(4), pages 773-793, December.
    15. Bhattacharya, Jay & Shaikh, Azeem M. & Vytlacil, Edward, 2012. "Treatment effect bounds: An application to Swan–Ganz catheterization," Journal of Econometrics, Elsevier, vol. 168(2), pages 223-243.
    16. Zacharias Maniadis & Fabio Tufano & John A. List, 2014. "One Swallow Doesn't Make a Summer: New Evidence on Anchoring Effects," American Economic Review, American Economic Association, vol. 104(1), pages 277-290, January.
    17. James Heckman & Seong Hyeok Moon & Rodrigo Pinto & Peter Savelyev & Adam Yavitz, 2010. "Analyzing social experiments as implemented: evidence from the HighScope Perry Preschool Program," CeMMAP working papers CWP22/10, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
    18. Machado, Cecilia & Shaikh, Azeem M. & Vytlacil, Edward J., 2019. "Instrumental variables and the sign of the average treatment effect," Journal of Econometrics, Elsevier, vol. 212(2), pages 522-555.
    19. Soohyung Lee & Azeem M. Shaikh, 2014. "Multiple Testing And Heterogeneous Treatment Effects: Re‐Evaluating The Effect Of Progresa On School Enrollment," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 29(4), pages 612-626, June.
    20. Joseph P. Romano & Azeem M. Shaikh & Michael Wolf, 2010. "Hypothesis Testing in Econometrics," Annual Review of Economics, Annual Reviews, vol. 2(1), pages 75-104, September.
    21. Luigi Butera & Philip J. Grossman & Daniel Houser & John A. List & Marie-Claire Villeval, 2020. "A New Mechanism to Alleviate the Crises of Confidence in Science - With an Application to the Public Goods Game," Working Papers halshs-02512932, HAL.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Chowdhury, Shyamal & Hasan, Syed & Sharma, Uttam, 2024. "The Role of Trainee Selection in the Effectiveness of Vocational Training: Evidence from a Randomized Controlled Trial in Nepal," IZA Discussion Papers 16705, Institute of Labor Economics (IZA).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. John A. List & Azeem M. Shaikh & Yang Xu, 2019. "Multiple hypothesis testing in experimental economics," Experimental Economics, Springer;Economic Science Association, vol. 22(4), pages 773-793, December.
    2. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    3. Jeffrey D. Michler & Anna Josephson, 2022. "Recent developments in inference: practicalities for applied economics," Chapters, in: A Modern Guide to Food Economics, chapter 11, pages 235-268, Edward Elgar Publishing.
    4. Arouna, Aminou & Michler, Jeffrey D. & Lokossou, Jourdain C., 2021. "Contract farming and rural transformation: Evidence from a field experiment in Benin," Journal of Development Economics, Elsevier, vol. 151(C).
    5. Alexandre Belloni & Victor Chernozhukov & Denis Chetverikov & Christian Hansen & Kengo Kato, 2018. "High-dimensional econometrics and regularized GMM," CeMMAP working papers CWP35/18, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
    6. Orazio Attanasio & Sarah Cattan & Emla Fitzsimons & Costas Meghir & Marta Rubio-Codina, 2020. "Estimating the Production Function for Human Capital: Results from a Randomized Controlled Trial in Colombia," American Economic Review, American Economic Association, vol. 110(1), pages 48-85, January.
    7. Bajgrowicz, Pierre & Scaillet, Olivier, 2012. "Technical trading revisited: False discoveries, persistence tests, and transaction costs," Journal of Financial Economics, Elsevier, vol. 106(3), pages 473-491.
    8. Berger, Eva M. & Fehr, Ernst & Hermes, Henning & Schunk, Daniel & Winkel, Kirsten, 2020. "The Impact of Working Memory Training on Children's Cognitive and Noncognitive Skills," IZA Discussion Papers 13338, Institute of Labor Economics (IZA).
    9. Doyle, Orla & Harmon, Colm & Heckman, James J. & Logue, Caitriona & Moon, Seong Hyeok, 2017. "Early skill formation and the efficiency of parental investment: A randomized controlled trial of home visiting," Labour Economics, Elsevier, vol. 45(C), pages 40-58.
    10. Orazio Attanasio & Sarah Cattan & Emla Fitzsimons & Costas Meghir & Marta Rubio-Codina, 2015. "Estimating the Production Function for Human Capital: Results from a Randomized Control Trial in Colombia," Cowles Foundation Discussion Papers 1987, Cowles Foundation for Research in Economics, Yale University.
    11. Matteo M. Galizzi & Daniel Navarro-Martinez, 2019. "On the External Validity of Social Preference Games: A Systematic Lab-Field Study," Management Science, INFORMS, vol. 65(3), pages 976-1002, March.
    12. Cortés, Darwin & Maldonado, Darío & Gallego, Juan & Charpak, Nathalie & Tessier, Rejean & Ruiz, Juan Gabriel & Hernandez, José Tiberio & Uriza, Felipe & Pico, Julieth, 2022. "Comparing long-term educational effects of two early childhood health interventions," Journal of Health Economics, Elsevier, vol. 86(C).
    13. Brennan S Thompson & Matthew D Webb, 2019. "A simple, graphical approach to comparing multiple treatments," The Econometrics Journal, Royal Economic Society, vol. 22(2), pages 188-205.
    14. Chung, EunYi & Olivares, Mauricio, 2021. "Permutation test for heterogeneous treatment effects with a nuisance parameter," Journal of Econometrics, Elsevier, vol. 225(2), pages 148-174.
    15. Yuehao Bai & Joseph P. Romano & Azeem M. Shaikh, 2022. "Inference in Experiments With Matched Pairs," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 117(540), pages 1726-1737, October.
    16. James J. Heckman & Rodrigo Pinto & Azeem Shaikh, 2023. "Dealing with Imperfect Randomization: Inference for the HighScope Perry Preschool Program," Working Papers 2023-031, Human Capital and Economic Opportunity Working Group.
    17. Hideo Akabayashi & Tim Ruberg & Chizuru Shikishima & Jun Yamashita, 2023. "Education-Oriented and Care-Oriented Preschools: Implications on Child Development," Keio-IES Discussion Paper Series 2023-009, Institute for Economic Studies, Keio University.
    18. Rodrigo Pinto & Azeem Shaikh & Adam Yavitz & James Heckman, 2010. "Inference with Imperfect Randomization: The Case of the Perry Preschool Program," 2010 Meeting Papers 1336, Society for Economic Dynamics.
    19. Zeng-Hua Lu, 2019. "Extended MinP Tests of Multiple Hypotheses," Papers 1911.04696, arXiv.org.
    20. James J Heckman & Ganesh Karapakula, 2021. "Using a satisficing model of experimenter decision-making to guide finite-sample inference for compromised experiments," The Econometrics Journal, Royal Economic Society, vol. 24(2), pages 1-39.

    More about this item


    Corrections


    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Wiley Content Delivery (email available below). General contact details of provider: http://www.interscience.wiley.com/jpages/0883-7252/ .


    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.