
Multiple Testing with Covariate Adjustment in Experimental Economics

Author

Listed:
  • John List
  • Azeem Shaikh
  • Atom Vayalinkal

Abstract

List et al. (2019) provides a framework for testing multiple null hypotheses simultaneously using experimental data in which simple random sampling is used to assign treatment status to units. As in List et al. (2019), we rely on general results in Romano and Wolf (2010) to develop, under weak assumptions, a procedure that (i) asymptotically controls the familywise error rate (the probability of one or more false rejections) and (ii) is asymptotically balanced, in that the marginal probability of rejecting any true null hypothesis is approximately equal in large samples. Our analysis departs from List et al. (2019) in that it further exploits observed baseline covariates. The precise way in which these covariates are incorporated is based on results in Lin (2013), ensuring that inferences are typically more powerful in large samples.
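
As a concrete illustration of the two ingredients the abstract describes, the sketch below simulates a simple experiment and combines (i) Lin (2013)-style covariate adjustment, implemented as a regression of each outcome on treatment, the centered covariate, and their interaction, with (ii) a bootstrap stepdown procedure in the spirit of Romano and Wolf that controls the familywise error rate using studentized statistics. This is a minimal sketch under simplified assumptions (simulated data, a single covariate, an HC0 variance, a nonparametric bootstrap), not the authors' implementation or the exact procedure developed in the paper.

```python
# Illustrative sketch only: covariate-adjusted treatment effects (Lin 2013 style)
# plus a bootstrap stepdown multiple-testing procedure (Romano-Wolf spirit).
import numpy as np

rng = np.random.default_rng(0)
n, k = 500, 5

# Simulated experiment (assumption): binary treatment D assigned by simple random
# sampling, one baseline covariate X, k outcomes with no true treatment effect.
D = rng.integers(0, 2, size=n).astype(float)
X = rng.normal(size=n)
Y = 0.5 * X[:, None] + rng.normal(size=(n, k))

def lin_adjusted_tstat(y, d, x):
    """t-statistic for the treatment coefficient in the interacted regression
    y ~ 1 + d + (x - mean(x)) + d*(x - mean(x)), as in Lin (2013), using a
    simple heteroskedasticity-robust (HC0) variance."""
    xc = x - x.mean()
    Z = np.column_stack([np.ones_like(d), d, xc, d * xc])
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    resid = y - Z @ beta
    bread = np.linalg.inv(Z.T @ Z)
    meat = (Z * resid[:, None] ** 2).T @ Z
    V = bread @ meat @ bread
    return beta[1] / np.sqrt(V[1, 1])

t_obs = np.array([lin_adjusted_tstat(Y[:, j], D, X) for j in range(k)])

# Bootstrap the joint distribution of the studentized statistics and recenter
# at the observed values to approximate their distribution under the null.
B = 500
t_boot = np.empty((B, k))
for b in range(B):
    idx = rng.integers(0, n, size=n)
    t_boot[b] = [lin_adjusted_tstat(Y[idx, j], D[idx], X[idx]) for j in range(k)]
t_null = np.abs(t_boot - t_obs)

# Stepdown: compare the largest remaining |t| with a critical value from the
# bootstrap distribution of the maximum over the hypotheses not yet rejected;
# stop at the first non-rejection.
alpha = 0.05
active = list(range(k))
rejected = []
for j in np.argsort(-np.abs(t_obs)):
    crit = np.quantile(t_null[:, active].max(axis=1), 1 - alpha)
    if np.abs(t_obs[j]) > crit:
        rejected.append(int(j))
        active.remove(int(j))
    else:
        break

print("Rejected hypotheses (FWER-controlled):", rejected)
```

Studentizing each statistic before taking the maximum keeps rejection probabilities roughly comparable across hypotheses, which is loosely the "balance" property the abstract refers to; the paper's formal procedure and regularity conditions differ in detail.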

Suggested Citation

  • John List & Azeem Shaikh & Atom Vayalinkal, 2021. "Multiple Testing with Covariate Adjustment in Experimental Economics," Natural Field Experiments 00732, The Field Experiments Website.
  • Handle: RePEc:feb:natura:00732

    Download full text from publisher

    File URL: http://s3.amazonaws.com/fieldexperiments-papers2/papers/00732.pdf
    Download Restriction: no

    References listed on IDEAS

    1. John A. List & Azeem M. Shaikh & Yang Xu, 2019. "Multiple hypothesis testing in experimental economics," Experimental Economics, Springer;Economic Science Association, vol. 22(4), pages 773-793, December.
    2. Joseph P. Romano & Michael Wolf, 2005. "Stepwise Multiple Testing as Formalized Data Snooping," Econometrica, Econometric Society, vol. 73(4), pages 1237-1282, July.
    3. Cecilia Machado & Azeem M. Shaikh & Edward J. Vytlacil, 2019. "Instrumental variables and the sign of the average treatment effect," Journal of Econometrics, Elsevier, vol. 212(2), pages 522-555.
    4. Dean Karlan & John A. List, 2007. "Does Price Matter in Charitable Giving? Evidence from a Large-Scale Natural Field Experiment," American Economic Review, American Economic Association, vol. 97(5), pages 1774-1793, December.
    5. Soohyung Lee & Azeem M. Shaikh, 2014. "Multiple Testing And Heterogeneous Treatment Effects: Re‐Evaluating The Effect Of Progresa On School Enrollment," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 29(4), pages 612-626, June.
    6. Joseph Romano & Azeem Shaikh & Michael Wolf, 2008. "Control of the false discovery rate under dependence using the bootstrap and subsampling," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 17(3), pages 417-442, November.
    7. Zacharias Maniadis & Fabio Tufano & John A. List, 2014. "One Swallow Doesn't Make a Summer: Reply to Kataria," Econ Journal Watch, Econ Journal Watch, vol. 11(1), pages 11-16, January.
    8. Joseph P. Romano & Azeem M. Shaikh & Michael Wolf, 2010. "Hypothesis Testing in Econometrics," Annual Review of Economics, Annual Reviews, vol. 2(1), pages 75-104, September.
    9. James Heckman & Seong Hyeok Moon & Rodrigo Pinto & Peter Savelyev & Adam Yavitz, 2010. "Analyzing social experiments as implemented: A reexamination of the evidence from the HighScope Perry Preschool Program," Quantitative Economics, Econometric Society, vol. 1(1), pages 1-46, July.
    10. Joseph P. Romano & Azeem M. Shaikh & Michael Wolf, 2008. "Formalized Data Snooping Based On Generalized Error Rates," Econometric Theory, Cambridge University Press, vol. 24(2), pages 404-447, April.
    11. Joseph Romano & Azeem Shaikh & Michael Wolf, 2008. "Rejoinder on: Control of the false discovery rate under dependence using the bootstrap and subsampling," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 17(3), pages 461-471, November.
    12. Jay Bhattacharya & Azeem M. Shaikh & Edward Vytlacil, 2012. "Treatment effect bounds: An application to Swan–Ganz catheterization," Journal of Econometrics, Elsevier, vol. 168(2), pages 223-243.
    13. Zacharias Maniadis & Fabio Tufano & John A. List, 2014. "One Swallow Doesn't Make a Summer: New Evidence on Anchoring Effects," American Economic Review, American Economic Association, vol. 104(1), pages 277-290, January.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. John A. List & Azeem M. Shaikh & Yang Xu, 2019. "Multiple hypothesis testing in experimental economics," Experimental Economics, Springer;Economic Science Association, vol. 22(4), pages 773-793, December.
    2. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    3. Bajgrowicz, Pierre & Scaillet, Olivier, 2012. "Technical trading revisited: False discoveries, persistence tests, and transaction costs," Journal of Financial Economics, Elsevier, vol. 106(3), pages 473-491.
    4. Matteo M. Galizzi & Daniel Navarro Martinez, 2015. "On the external validity of social-preference games: A systematic lab-field study," Economics Working Papers 1462, Department of Economics and Business, Universitat Pompeu Fabra.
    5. Zeng-Hua Lu, 2019. "Extended MinP Tests of Multiple Hypotheses," Papers 1911.04696, arXiv.org.
    6. Alexandre Belloni & Victor Chernozhukov & Denis Chetverikov & Christian Hansen & Kengo Kato, 2018. "High-dimensional econometrics and regularized GMM," CeMMAP working papers CWP35/18, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
    7. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
    8. Tova Levin & Steven Levitt & John List, 2015. "A Glimpse into the World of High Capacity Givers: Experimental Evidence from a University Capital Campaign," Natural Field Experiments 00409, The Field Experiments Website.
    9. Davide Viviano & Kaspar Wuthrich & Paul Niehaus, 2021. "(When) should you adjust inferences for multiple hypothesis testing?," Papers 2104.13367, arXiv.org.
    10. Brennan S Thompson & Matthew D Webb, 2019. "A simple, graphical approach to comparing multiple treatments," Econometrics Journal, Royal Economic Society, vol. 22(2), pages 188-205.
    11. Christophe Hurlin & Sébastien Laurent & Rogier Quaedvlieg & Stephan Smeekes, 2017. "Risk Measure Inference," Journal of Business & Economic Statistics, Taylor & Francis Journals, vol. 35(4), pages 499-512, October.
    12. Rodrigo Pinto & Azeem Shaikh & Adam Yavitz & James Heckman, 2010. "Inference with Imperfect Randomization: The Case of the Perry Preschool Program," 2010 Meeting Papers 1336, Society for Economic Dynamics.
    13. Michel André Maréchal & Christian Thöni, 2019. "Hidden Persuaders: Do Small Gifts Lubricate Business Negotiations?," Management Science, INFORMS, vol. 65(8), pages 3877-3888, August.
    14. Smeekes, S., 2011. "Bootstrap sequential tests to determine the stationary units in a panel," Research Memorandum 003, Maastricht University, Maastricht Research School of Economics of Technology and Organization (METEOR).
    15. Matteo M. Galizzi & Daniel Navarro-Martinez, 2019. "On the External Validity of Social Preference Games: A Systematic Lab-Field Study," Management Science, INFORMS, vol. 65(3), pages 976-1002, March.
    16. Christopher J. Bennett, 2009. "p-Value Adjustments for Asymptotic Control of the Generalized Familywise Error Rate," Vanderbilt University Department of Economics Working Papers 0905, Vanderbilt University Department of Economics.
    17. Giuseppe Cavaliere & Dimitris N. Politis & Anders Rahbek & Stephan Smeekes, 2015. "Recent developments in bootstrap methods for dependent data," Journal of Time Series Analysis, Wiley Blackwell, vol. 36(3), pages 398-415, May.
    18. Romano, Joseph P. & Wolf, Michael, 2016. "Efficient computation of adjusted p-values for resampling-based stepdown multiple testing," Statistics & Probability Letters, Elsevier, vol. 113(C), pages 38-40.
    19. Sandner, Malte & Cornelissen, Thomas & Jungmann, Tanja & Herrmann, Peggy, 2018. "Evaluating the effects of a targeted home visiting program on maternal and child health outcomes," Journal of Health Economics, Elsevier, vol. 58(C), pages 269-283.
    20. Roland G. Fryer, Jr. & Steven D. Levitt & John A. List, 2015. "Parental Incentives and Early Childhood Achievement: A Field Experiment in Chicago Heights," NBER Working Papers 21477, National Bureau of Economic Research, Inc.

