
Multiple hypothesis testing in experimental economics

Author

Listed:
  • John A. List (University of Chicago)
  • Azeem M. Shaikh (University of Chicago)
  • Yang Xu (University of Chicago)

Abstract

The analysis of data from experiments in economics routinely involves testing multiple null hypotheses simultaneously. These different null hypotheses arise naturally in this setting for at least three different reasons: when there are multiple outcomes of interest and it is desired to determine on which of these outcomes a treatment has an effect; when the effect of a treatment may be heterogeneous in that it varies across subgroups defined by observed characteristics and it is desired to determine for which of these subgroups a treatment has an effect; and finally when there are multiple treatments of interest and it is desired to determine which treatments have an effect relative to either the control or relative to each of the other treatments. In this paper, we provide a bootstrap-based procedure for testing these null hypotheses simultaneously using experimental data in which simple random sampling is used to assign treatment status to units. Using the general results in Romano and Wolf (Ann Stat 38:598–633, 2010), we show under weak assumptions that our procedure (1) asymptotically controls the familywise error rate—the probability of one or more false rejections—and (2) is asymptotically balanced in that the marginal probability of rejecting any true null hypothesis is approximately equal in large samples. Importantly, by incorporating information about dependence ignored in classical multiple testing procedures, such as the Bonferroni and Holm corrections, our procedure has much greater ability to detect truly false null hypotheses. In the presence of multiple treatments, we additionally show how to exploit logical restrictions across null hypotheses to further improve power. We illustrate our methodology by revisiting the study by Karlan and List (Am Econ Rev 97(5):1774–1793, 2007) of why people give to charitable causes.
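The abstract contrasts the bootstrap-based procedure with classical corrections such as Bonferroni and Holm. The sketch below is a minimal illustration of that contrast, not the authors' implementation: for a single binary treatment and several outcomes under simple random assignment, it computes studentized difference-in-means statistics, applies the Holm stepdown to normal-approximation p-values, and then applies a Romano–Wolf-style stepdown whose critical values come from the bootstrap distribution of the maximum |t| statistic, so that dependence across outcomes is exploited rather than ignored. All names (t_stats, holm, bootstrap_stepdown, Y, d) are hypothetical, and the recentring scheme and plain pairs bootstrap are simplifying assumptions.

```python
# Illustrative sketch only; not the procedure proposed in the paper.
import numpy as np
from math import erfc, sqrt

def t_stats(Y, d):
    """Studentized difference-in-means statistics, one per outcome column of Y (n, k)."""
    y1, y0 = Y[d == 1], Y[d == 0]
    n1, n0 = len(y1), len(y0)
    se = np.sqrt(y1.var(ddof=1, axis=0) / n1 + y0.var(ddof=1, axis=0) / n0)
    return (y1.mean(axis=0) - y0.mean(axis=0)) / se

def normal_pvals(t):
    """Two-sided p-values from the standard normal approximation."""
    return np.array([erfc(abs(x) / sqrt(2)) for x in t])

def holm(pvals, alpha=0.05):
    """Classical Holm stepdown: compare sorted p-values to alpha/k, alpha/(k-1), ..."""
    order = np.argsort(pvals)
    k = len(pvals)
    reject = np.zeros(k, dtype=bool)
    for step, idx in enumerate(order):
        if pvals[idx] <= alpha / (k - step):
            reject[idx] = True
        else:
            break
    return reject

def bootstrap_stepdown(Y, d, alpha=0.05, B=2000, seed=0):
    """Romano-Wolf-style stepdown: critical values from the bootstrap
    distribution of the maximum |t| over hypotheses still under consideration."""
    rng = np.random.default_rng(seed)
    n, k = Y.shape
    t_obs = np.abs(t_stats(Y, d))
    # Recentre each outcome within each treatment arm so the bootstrap world
    # satisfies every null hypothesis while preserving the dependence across
    # outcomes (the information that Bonferroni/Holm ignore).
    Yc = Y.astype(float).copy()
    for arm in (0, 1):
        Yc[d == arm] -= Y[d == arm].mean(axis=0)
    t_boot = np.empty((B, k))
    for b in range(B):
        idx = rng.integers(0, n, size=n)   # simple pairs bootstrap of (Y, d);
        t_boot[b] = np.abs(t_stats(Yc[idx], d[idx]))  # a refinement would resample within arms
    reject = np.zeros(k, dtype=bool)
    active = np.ones(k, dtype=bool)        # hypotheses not yet rejected
    while active.any():
        crit = np.quantile(t_boot[:, active].max(axis=1), 1 - alpha)
        newly = active & (t_obs > crit)
        if not newly.any():
            break
        reject |= newly
        active &= ~newly
    return reject

# Hypothetical usage: 400 units, 5 outcomes, a true effect only on outcome 0.
rng = np.random.default_rng(1)
n, k = 400, 5
d = rng.integers(0, 2, size=n)
Y = rng.normal(size=(n, k)) + 0.3 * d[:, None] * (np.arange(k) == 0)
print("Holm:     ", holm(normal_pvals(t_stats(Y, d))))
print("Bootstrap:", bootstrap_stepdown(Y, d))
```

Because the bootstrap critical value is a quantile of the maximum statistic over only the hypotheses still under consideration, it is typically smaller than the corresponding Bonferroni/Holm cutoffs when outcomes are dependent, which is the intuition behind the power gain described in the abstract; the balance property discussed there is related to comparing statistics on a common, studentized scale as in t_stats.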

Suggested Citation

  • John A. List & Azeem M. Shaikh & Yang Xu, 2019. "Multiple hypothesis testing in experimental economics," Experimental Economics, Springer; Economic Science Association, vol. 22(4), pages 773-793, December.
  • Handle: RePEc:kap:expeco:v:22:y:2019:i:4:d:10.1007_s10683-018-09597-5
    DOI: 10.1007/s10683-018-09597-5

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10683-018-09597-5
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.


    References listed on IDEAS

    1. Joseph P. Romano & Michael Wolf, 2005. "Stepwise Multiple Testing as Formalized Data Snooping," Econometrica, Econometric Society, vol. 73(4), pages 1237-1282, July.
    2. James Heckman & Seong Hyeok Moon & Rodrigo Pinto & Peter Savelyev & Adam Yavitz, 2010. "Analyzing social experiments as implemented: evidence from the HighScope Perry Preschool Program," CeMMAP working papers CWP22/10, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
    3. Federico A. Bugni & Ivan A. Canay & Azeem M. Shaikh, 2015. "Inference under covariate-adaptive randomization," CeMMAP working papers CWP45/15, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
    4. Katherine Casey & Rachel Glennerster & Edward Miguel, 2012. "Reshaping Institutions: Evidence on Aid Impacts Using a Preanalysis Plan," The Quarterly Journal of Economics, Oxford University Press, vol. 127(4), pages 1755-1812.
    5. Dean Karlan & John A. List, 2007. "Does Price Matter in Charitable Giving? Evidence from a Large-Scale Natural Field Experiment," American Economic Review, American Economic Association, vol. 97(5), pages 1774-1793, December.
    6. Soohyung Lee & Azeem M. Shaikh, 2014. "Multiple Testing And Heterogeneous Treatment Effects: Re‐Evaluating The Effect Of Progresa On School Enrollment," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 29(4), pages 612-626, June.
    7. Rodrigo Pinto & Azeem Shaikh & Adam Yavitz & James Heckman, 2010. "Inference with Imperfect Randomization: The Case of the Perry Preschool Program," 2010 Meeting Papers 1336, Society for Economic Dynamics.
    8. Joseph P. Romano & Michael Wolf, 2005. "Exact and Approximate Stepdown Methods for Multiple Hypothesis Testing," Journal of the American Statistical Association, American Statistical Association, vol. 100, pages 94-108, March.
    9. Muriel Niederle & Lise Vesterlund, 2007. "Do Women Shy Away From Competition? Do Men Compete Too Much?," The Quarterly Journal of Economics, Oxford University Press, vol. 122(3), pages 1067-1101.
    10. Jeffrey R Kling & Jeffrey B Liebman & Lawrence F Katz, 2007. "Experimental Analysis of Neighborhood Effects," Econometrica, Econometric Society, vol. 75(1), pages 83-119, January.
    11. Anderson, Michael L., 2008. "Multiple Inference and Gender Differences in the Effects of Early Intervention: A Reevaluation of the Abecedarian, Perry Preschool, and Early Training Projects," Journal of the American Statistical Association, American Statistical Association, vol. 103(484), pages 1481-1495.
    12. James Heckman & Seong Hyeok Moon & Rodrigo Pinto & Peter Savelyev & Adam Yavitz, 2010. "Analyzing social experiments as implemented: A reexamination of the evidence from the HighScope Perry Preschool Program," Quantitative Economics, Econometric Society, vol. 1(1), pages 1-46, July.
    13. Günther Fink & Margaret McConnell & Sebastian Vollmer, 2014. "Testing for heterogeneous treatment effects in experimental data: false discovery risks and correction procedures," Journal of Development Effectiveness, Taylor & Francis Journals, vol. 6(1), pages 44-57, January.
    14. Jeffrey A. Flory & Andreas Leibbrandt & John A. List, 2015. "Do Competitive Workplaces Deter Female Workers? A Large-Scale Natural Field Experiment on Job Entry Decisions," Review of Economic Studies, Oxford University Press, vol. 82(1), pages 122-155.
    15. Uri Gneezy & Muriel Niederle & Aldo Rustichini, 2003. "Performance in Competitive Environments: Gender Differences," The Quarterly Journal of Economics, Oxford University Press, vol. 118(3), pages 1049-1074.
    16. Bhattacharya, Jay & Shaikh, Azeem M. & Vytlacil, Edward, 2012. "Treatment effect bounds: An application to Swan–Ganz catheterization," Journal of Econometrics, Elsevier, vol. 168(2), pages 223-243.
    17. Zacharias Maniadis & Fabio Tufano & John A. List, 2014. "One Swallow Doesn't Make a Summer: New Evidence on Anchoring Effects," American Economic Review, American Economic Association, vol. 104(1), pages 277-290, January.

    More about this item

    Keywords

    Experiments; Multiple hypothesis testing; Multiple treatments; Multiple outcomes; Multiple subgroups; Randomized controlled trial; Bootstrap; Balance;

    JEL classification:

    • C12 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Hypothesis Testing: General
    • C14 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Semiparametric and Nonparametric Methods: General

