
Multiple Hypothesis Testing in Experimental Economics

Authors

  • John List
  • Azeem Shaikh
  • Yang Xu

Abstract

Empiricism in the sciences allows us to test theories, formulate optimal policies, and learn how the world works. It is therefore critical that our empirical work provides accurate conclusions about underlying data patterns. False positives represent an especially important problem, as vast public and private resources can be misdirected if decisions are based on false discoveries. This study explores one especially pernicious source of false positives: multiple hypothesis testing (MHT). While MHT potentially affects all types of empirical work, we consider three common scenarios where MHT influences inference within experimental economics: jointly identifying treatment effects for a set of outcomes, estimating heterogeneous treatment effects through subgroup analysis, and testing hypotheses across multiple treatment conditions. Building upon the work of Romano and Wolf (2010), we present a correction procedure that covers all three scenarios and illustrate the improvement in power by comparing our results with those obtained from the classical procedures of Bonferroni (1935) and Holm (1979). Importantly, under weak assumptions, our testing procedure asymptotically controls the familywise error rate (the probability of at least one false rejection) and is asymptotically balanced. We showcase our approach by revisiting the data reported in Karlan and List (2007) to deepen our understanding of why people give to charitable causes.
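The abstract contrasts the paper's bootstrap-based stepdown procedure with the classical corrections of Bonferroni (1935) and Holm (1979). As a point of reference only, the sketch below (in Python, with illustrative function names that are not taken from the paper) implements those two classical corrections, which control the familywise error rate using nothing but the marginal p-values. The procedure developed by List, Shaikh, and Xu additionally exploits the joint dependence of the test statistics via resampling, which is the source of its power gains and is not attempted here.

    import numpy as np

    def bonferroni_reject(pvals, alpha=0.05):
        # Bonferroni: reject H_j whenever p_j <= alpha / m,
        # where m is the number of hypotheses tested.
        pvals = np.asarray(pvals, dtype=float)
        return pvals <= alpha / len(pvals)

    def holm_reject(pvals, alpha=0.05):
        # Holm (1979) step-down: sort the p-values, compare the k-th smallest
        # (k = 0, 1, ..., m-1) to alpha / (m - k), and stop at the first failure.
        pvals = np.asarray(pvals, dtype=float)
        m = len(pvals)
        order = np.argsort(pvals)
        reject = np.zeros(m, dtype=bool)
        for k, j in enumerate(order):
            if pvals[j] <= alpha / (m - k):
                reject[j] = True
            else:
                break  # retain this hypothesis and all those with larger p-values
        return reject

    # Toy example: five hypotheses, nominal familywise error rate of 5%.
    p = [0.001, 0.012, 0.018, 0.040, 0.200]
    print(bonferroni_reject(p))  # [ True False False False False]
    print(holm_reject(p))        # [ True  True False False False]

As the toy example illustrates, Holm's step-down rule rejects everything Bonferroni rejects and possibly more, and both control the familywise error rate regardless of the dependence among tests; the bootstrap procedure studied in the paper improves power further by estimating that dependence rather than ignoring it.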

Suggested Citation

  • John List & Azeem Shaikh & Yang Xu, 2016. "Multiple Hypothesis Testing in Experimental Economics," Artefactual Field Experiments 00402, The Field Experiments Website.
  • Handle: RePEc:feb:artefa:00402

    Download full text from publisher

    File URL: http://s3.amazonaws.com/fieldexperiments-papers2/papers/00402.pdf
    Download Restriction: no


    References listed on IDEAS

    1. James Heckman & Seong Hyeok Moon & Rodrigo Pinto & Peter Savelyev & Adam Yavitz, 2010. "Analyzing social experiments as implemented: A reexamination of the evidence from the HighScope Perry Preschool Program," Quantitative Economics, Econometric Society, vol. 1(1), pages 1-46, July.
    2. Joseph P. Romano & Michael Wolf, 2005. "Stepwise Multiple Testing as Formalized Data Snooping," Econometrica, Econometric Society, vol. 73(4), pages 1237-1282, July.
    3. Jeffrey A. Flory & Andreas Leibbrandt & John A. List, 2015. "Do Competitive Workplaces Deter Female Workers? A Large-Scale Natural Field Experiment on Job Entry Decisions," Review of Economic Studies, Oxford University Press, vol. 82(1), pages 122-155.
    4. James Heckman & Seong Hyeok Moon & Rodrigo Pinto & Peter Savelyev & Adam Yavitz, 2010. "Analyzing social experiments as implemented: evidence from the HighScope Perry Preschool Program," CeMMAP working papers CWP22/10, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
    5. Federico A. Bugni & Ivan A. Canay & Azeem M. Shaikh, 2015. "Inference under covariate-adaptive randomization," CeMMAP working papers CWP45/15, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
    6. Uri Gneezy & Muriel Niederle & Aldo Rustichini, 2003. "Performance in Competitive Environments: Gender Differences," The Quarterly Journal of Economics, Oxford University Press, vol. 118(3), pages 1049-1074.
    7. Katherine Casey & Rachel Glennerster & Edward Miguel, 2012. "Reshaping Institutions: Evidence on Aid Impacts Using a Preanalysis Plan," The Quarterly Journal of Economics, Oxford University Press, vol. 127(4), pages 1755-1812.
    8. Dean Karlan & John A. List, 2007. "Does Price Matter in Charitable Giving? Evidence from a Large-Scale Natural Field Experiment," American Economic Review, American Economic Association, vol. 97(5), pages 1774-1793, December.
    9. Soohyung Lee & Azeem M. Shaikh, 2014. "Multiple Testing And Heterogeneous Treatment Effects: Re‐Evaluating The Effect Of Progresa On School Enrollment," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 29(4), pages 612-626, June.
    10. Rodrigo Pinto & Azeem Shaikh & Adam Yavitz & James Heckman, 2010. "Inference with Imperfect Randomization: The Case of the Perry Preschool Program," 2010 Meeting Papers 1336, Society for Economic Dynamics.
    11. Günther Fink & Margaret McConnell & Sebastian Vollmer, 2014. "Testing for heterogeneous treatment effects in experimental data: false discovery risks and correction procedures," Journal of Development Effectiveness, Taylor & Francis Journals, vol. 6(1), pages 44-57, January.
    12. Joseph P. Romano & Michael Wolf, 2005. "Exact and Approximate Stepdown Methods for Multiple Hypothesis Testing," Journal of the American Statistical Association, American Statistical Association, vol. 100, pages 94-108, March.
    13. Muriel Niederle & Lise Vesterlund, 2007. "Do Women Shy Away From Competition? Do Men Compete Too Much?," The Quarterly Journal of Economics, Oxford University Press, vol. 122(3), pages 1067-1101.
    14. Bhattacharya, Jay & Shaikh, Azeem M. & Vytlacil, Edward, 2012. "Treatment effect bounds: An application to Swan–Ganz catheterization," Journal of Econometrics, Elsevier, vol. 168(2), pages 223-243.
    15. Jeffrey R Kling & Jeffrey B Liebman & Lawrence F Katz, 2007. "Experimental Analysis of Neighborhood Effects," Econometrica, Econometric Society, vol. 75(1), pages 83-119, January.
    16. Zacharias Maniadis & Fabio Tufano & John A. List, 2014. "One Swallow Doesn't Make a Summer: New Evidence on Anchoring Effects," American Economic Review, American Economic Association, vol. 104(1), pages 277-290, January.
    17. Anderson, Michael L., 2008. "Multiple Inference and Gender Differences in the Effects of Early Intervention: A Reevaluation of the Abecedarian, Perry Preschool, and Early Training Projects," Journal of the American Statistical Association, American Statistical Association, vol. 103(484), pages 1481-1495.

    More about this item

    JEL classification:

    • C1 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General
    • C9 - Mathematical and Quantitative Methods - - Design of Experiments
    • C91 - Mathematical and Quantitative Methods - - Design of Experiments - - - Laboratory, Individual Behavior
    • C92 - Mathematical and Quantitative Methods - - Design of Experiments - - - Laboratory, Group Behavior
    • C93 - Mathematical and Quantitative Methods - - Design of Experiments - - - Field Experiments

