Printed from https://ideas.repec.org/p/feb/natura/00732.html

Multiple Testing with Covariate Adjustment in Experimental Economics

Author

Listed:
  • John List
  • Azeem Shaikh
  • Atom Vayalinkal

Abstract

List et al. (2019) provides a framework for testing multiple null hypotheses simultaneously using experimental data in which simple random sampling is used to assign treatment status to units. As in List et al. (2019), we rely on general results in Romano and Wolf (2010) to develop, under weak assumptions, a procedure that (i) asymptotically controls the familywise error rate (the probability of one or more false rejections) and (ii) is asymptotically balanced, in that the marginal probability of rejecting any true null hypothesis is approximately equal across hypotheses in large samples. Our analysis departs from List et al. (2019) in that it further exploits observed baseline covariates. The precise way in which these covariates are incorporated follows results in Lin (2013), ensuring that inferences are typically more powerful in large samples.
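The ingredients the abstract describes can be sketched in a few lines: a Lin (2013)-style fully interacted regression (treatment, demeaned covariates, and their interactions) supplies a studentized treatment-effect estimate per outcome, and a bootstrap stepdown procedure based on the max-|t| statistic controls the familywise error rate. This is a simplified illustration under stylized assumptions (a pairs bootstrap, HC0 standard errors, one hypothesis per outcome column), not the authors' implementation; all function names and simulation parameters are hypothetical.

```python
import numpy as np

def lin_adjusted(y, d, x):
    """ATE estimate and HC0 standard error from a Lin (2013)-style fully
    interacted regression: y on a constant, treatment d, demeaned
    covariates, and their interactions with d."""
    xc = x - x.mean(axis=0)
    Z = np.column_stack([np.ones_like(d), d, xc, d[:, None] * xc])
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    resid = y - Z @ beta
    bread = np.linalg.pinv(Z.T @ Z)
    meat = Z.T @ (Z * resid[:, None] ** 2)   # HC0 "meat" matrix
    V = bread @ meat @ bread
    return beta[1], np.sqrt(V[1, 1])

def stepdown_maxT(Y, d, x, B=200, alpha=0.05, seed=0):
    """Stepdown multiple testing via the bootstrap distribution of the
    max-|t| statistic, one null hypothesis per outcome column of Y.
    Studentizing each statistic is what delivers approximate balance:
    every true null faces roughly the same marginal rejection rate."""
    rng = np.random.default_rng(seed)
    n, m = Y.shape
    est, se = np.empty(m), np.empty(m)
    for j in range(m):
        est[j], se[j] = lin_adjusted(Y[:, j], d, x)
    t_obs = est / se
    # Pairs bootstrap, recentered at the full-sample estimates
    t_boot = np.empty((B, m))
    for b in range(B):
        idx = rng.integers(0, n, n)
        for j in range(m):
            e, s = lin_adjusted(Y[idx, j], d[idx], x[idx])
            t_boot[b, j] = (e - est[j]) / s
    active, rejected = list(range(m)), []
    while active:
        # Critical value from the max over hypotheses still in play
        crit = np.quantile(np.abs(t_boot[:, active]).max(axis=1), 1 - alpha)
        newly = [j for j in active if abs(t_obs[j]) > crit]
        if not newly:
            break
        rejected += newly
        active = [j for j in active if j not in newly]
    return sorted(rejected)

# Illustrative simulation: outcome 0 has a true effect, 1 and 2 do not.
rng = np.random.default_rng(1)
n = 400
d = rng.integers(0, 2, n).astype(float)
x = rng.normal(size=(n, 2))
Y = np.column_stack([
    1.0 * d + x[:, 0] + rng.normal(size=n),
    x[:, 1] + rng.normal(size=n),
    rng.normal(size=n),
])
rejections = stepdown_maxT(Y, d, x)
```

The stepdown structure is what distinguishes this from a single-step max-t test: after each round of rejections, the critical value is recomputed over the surviving hypotheses only, which can only shrink it and so yields extra power at no cost in familywise error control.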

Suggested Citation

  • John List & Azeem Shaikh & Atom Vayalinkal, 2021. "Multiple Testing with Covariate Adjustment in Experimental Economics," Natural Field Experiments 00732, The Field Experiments Website.
  • Handle: RePEc:feb:natura:00732

    Download full text from publisher

    File URL: http://s3.amazonaws.com/fieldexperiments-papers2/papers/00732.pdf
    Download Restriction: no

    References listed on IDEAS

    1. John A. List & Azeem M. Shaikh & Yang Xu, 2019. "Multiple hypothesis testing in experimental economics," Experimental Economics, Springer;Economic Science Association, vol. 22(4), pages 773-793, December.
    2. Joseph P. Romano & Michael Wolf, 2005. "Stepwise Multiple Testing as Formalized Data Snooping," Econometrica, Econometric Society, vol. 73(4), pages 1237-1282, July.
    3. James Heckman & Seong Hyeok Moon & Rodrigo Pinto & Peter Savelyev & Adam Yavitz, 2010. "Analyzing social experiments as implemented: evidence from the HighScope Perry Preschool Program," CeMMAP working papers CWP22/10, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
    4. Luigi Butera & Philip Grossman & Daniel Houser & John List & Marie-Claire Villeval, 2020. "A New Mechanism to Alleviate the Crises of Confidence in Science - With an Application to the Public Goods Game," Artefactual Field Experiments 00684, The Field Experiments Website.
    5. Machado, Cecilia & Shaikh, Azeem M. & Vytlacil, Edward J., 2019. "Instrumental variables and the sign of the average treatment effect," Journal of Econometrics, Elsevier, vol. 212(2), pages 522-555.
    6. Rodrigo Pinto & Azeem Shaikh & Adam Yavitz & James Heckman, 2010. "Inference with Imperfect Randomization: The Case of the Perry Preschool Program," 2010 Meeting Papers 1336, Society for Economic Dynamics.
    7. Dean Karlan & John A. List, 2007. "Does Price Matter in Charitable Giving? Evidence from a Large-Scale Natural Field Experiment," American Economic Review, American Economic Association, vol. 97(5), pages 1774-1793, December.
    8. Soohyung Lee & Azeem M. Shaikh, 2014. "Multiple Testing And Heterogeneous Treatment Effects: Re‐Evaluating The Effect Of Progresa On School Enrollment," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 29(4), pages 612-626, June.
    9. Joseph Romano & Azeem Shaikh & Michael Wolf, 2008. "Control of the false discovery rate under dependence using the bootstrap and subsampling," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 17(3), pages 417-442, November.
    10. Zacharias Maniadis & Fabio Tufano & John A. List, 2014. "One Swallow Doesn't Make a Summer: Reply to Kataria," Econ Journal Watch, Econ Journal Watch, vol. 11(1), pages 11-16, January.
    11. Joseph P. Romano & Azeem M. Shaikh & Michael Wolf, 2010. "Hypothesis Testing in Econometrics," Annual Review of Economics, Annual Reviews, vol. 2(1), pages 75-104, September.
    12. James Heckman & Seong Hyeok Moon & Rodrigo Pinto & Peter Savelyev & Adam Yavitz, 2010. "Analyzing social experiments as implemented: A reexamination of the evidence from the HighScope Perry Preschool Program," Quantitative Economics, Econometric Society, vol. 1(1), pages 1-46, July.
    13. Romano, Joseph P. & Shaikh, Azeem M. & Wolf, Michael, 2008. "Formalized Data Snooping Based On Generalized Error Rates," Econometric Theory, Cambridge University Press, vol. 24(2), pages 404-447, April.
    14. Luigi Butera & Philip Grossman & Daniel Houser & John List & Marie Villeval, 2020. "A New Mechanism to Alleviate the Crises of Confidence in Science - With An Application to the Public Goods Game," Working Papers halshs-02512932, HAL.
    15. Joseph Romano & Azeem Shaikh & Michael Wolf, 2008. "Rejoinder on: Control of the false discovery rate under dependence using the bootstrap and subsampling," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 17(3), pages 461-471, November.
    16. Bhattacharya, Jay & Shaikh, Azeem M. & Vytlacil, Edward, 2012. "Treatment effect bounds: An application to Swan–Ganz catheterization," Journal of Econometrics, Elsevier, vol. 168(2), pages 223-243.
    17. Zacharias Maniadis & Fabio Tufano & John A. List, 2014. "One Swallow Doesn't Make a Summer: New Evidence on Anchoring Effects," American Economic Review, American Economic Association, vol. 104(1), pages 277-290, January.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. John A. List & Azeem M. Shaikh & Yang Xu, 2019. "Multiple hypothesis testing in experimental economics," Experimental Economics, Springer;Economic Science Association, vol. 22(4), pages 773-793, December.
    2. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    3. Jeffrey D. Michler & Anna Josephson, 2022. "Recent developments in inference: practicalities for applied economics," Chapters, in: A Modern Guide to Food Economics, chapter 11, pages 235-268, Edward Elgar Publishing.
    4. Arouna, Aminou & Michler, Jeffrey D. & Lokossou, Jourdain C., 2021. "Contract farming and rural transformation: Evidence from a field experiment in Benin," Journal of Development Economics, Elsevier, vol. 151(C).
    5. Alexandre Belloni & Victor Chernozhukov & Denis Chetverikov & Christian Hansen & Kengo Kato, 2018. "High-dimensional econometrics and regularized GMM," CeMMAP working papers CWP35/18, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
    6. Bajgrowicz, Pierre & Scaillet, Olivier, 2012. "Technical trading revisited: False discoveries, persistence tests, and transaction costs," Journal of Financial Economics, Elsevier, vol. 106(3), pages 473-491.
    7. Matteo M. Galizzi & Daniel Navarro-Martinez, 2019. "On the External Validity of Social Preference Games: A Systematic Lab-Field Study," Management Science, INFORMS, vol. 65(3), pages 976-1002, March.
    8. Brennan S Thompson & Matthew D Webb, 2019. "A simple, graphical approach to comparing multiple treatments," Econometrics Journal, Royal Economic Society, vol. 22(2), pages 188-205.
    9. Zeng-Hua Lu, 2019. "Extended MinP Tests of Multiple Hypotheses," Papers 1911.04696, arXiv.org.
    10. Romano, Joseph P. & Wolf, Michael, 2016. "Efficient computation of adjusted p-values for resampling-based stepdown multiple testing," Statistics & Probability Letters, Elsevier, vol. 113(C), pages 38-40.
    11. Sandner, Malte & Cornelissen, Thomas & Jungmann, Tanja & Herrmann, Peggy, 2018. "Evaluating the effects of a targeted home visiting program on maternal and child health outcomes," Journal of Health Economics, Elsevier, vol. 58(C), pages 269-283.
    12. Ganesh Karapakula & James J. Heckman, 2020. "Using a Satisficing Model of Experimenter Decision-Making to Guide Finite-Sample Inference for Compromised Experiments," Working Papers 2020-063, Human Capital and Economic Opportunity Working Group.
    13. Orazio Attanasio & Sarah Cattan & Emla Fitzsimons & Costas Meghir & Marta Rubio-Codina, 2020. "Estimating the Production Function for Human Capital: Results from a Randomized Controlled Trial in Colombia," American Economic Review, American Economic Association, vol. 110(1), pages 48-85, January.
    14. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
    15. Doyle, Orla & Harmon, Colm & Heckman, James J. & Logue, Caitriona & Moon, Seong Hyeok, 2017. "Early skill formation and the efficiency of parental investment: A randomized controlled trial of home visiting," Labour Economics, Elsevier, vol. 45(C), pages 40-58.
    16. Attanasio, Orazio & Cattan, Sarah & Fitzsimons, Emla & Meghir, Costas & Rubio-Codina, Marta, 2015. "Estimating the Production Function for Human Capital: Results from a Randomized Control Trial in Colombia," IZA Discussion Papers 8856, Institute of Labor Economics (IZA).
    17. Becker William & Paruolo Paolo & Saltelli Andrea, 2021. "Variable Selection in Regression Models Using Global Sensitivity Analysis," Journal of Time Series Econometrics, De Gruyter, vol. 13(2), pages 187-233, July.
    18. Doyle, O. & Harmon, C. & Heckman, J.J. & Logue, C. & Moon, S.H., 2013. "Measuring Investment in Human Capital Formation: An Experimental Analysis of Early Life Outcomes," Health, Econometrics and Data Group (HEDG) Working Papers 13/18, HEDG, c/o Department of Economics, University of York.
    19. Tova Levin & Steven Levitt & John List, 2015. "A Glimpse into the World of High Capacity Givers: Experimental Evidence from a University Capital Campaign," Natural Field Experiments 00409, The Field Experiments Website.
    20. Dan Wunderli, 2012. "Controlling the danger of false discoveries in estimating multiple treatment effects," ECON - Working Papers 060, Department of Economics - University of Zurich.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:feb:natura:00732. See general information about how to correct material in RePEc.



    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Joe Seidel. The email address of this maintainer does not seem to be valid anymore; please ask Joe Seidel to update the entry or send us the correct address. General contact details of provider: http://www.fieldexperiments.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis . RePEc uses bibliographic data supplied by the respective publishers.