Printed from https://ideas.repec.org/p/rye/wpaper/wp063.html

A Simple, Graphical Approach to Comparing Multiple Treatments

Authors
  • Brennan S. Thompson

    (Department of Economics, Ryerson University)

  • Matthew D. Webb

    (Department of Economics, Carleton University)

Abstract

We propose a graphical approach to comparing multiple treatments that allows users to easily infer differences between any treatment effect and zero, and between any pair of treatment effects. Our approach makes use of a flexible, resampling-based procedure that asymptotically controls the familywise error rate (the probability of making one or more spurious inferences). We demonstrate the usefulness of our approach with three empirical examples.
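To illustrate the general idea behind resampling-based familywise error rate (FWER) control, the sketch below constructs simultaneous confidence intervals for all pairwise group contrasts by bootstrapping the maximum absolute studentized contrast. This is a generic max-t bootstrap under assumed independent random sampling within groups, not the authors' exact procedure; the function name and interface are hypothetical.

```python
import numpy as np

def simultaneous_cis(outcomes, groups, n_boot=999, alpha=0.05, seed=0):
    """Max-t bootstrap for simultaneous CIs on all pairwise contrasts.

    Illustrative sketch only: resamples within each group, tracks the
    largest studentized deviation across all contrasts, and uses its
    (1 - alpha) quantile as a single simultaneous critical value.
    """
    rng = np.random.default_rng(seed)
    outcomes = np.asarray(outcomes, dtype=float)
    groups = np.asarray(groups)
    labels = sorted(set(groups.tolist()))

    # Group means and standard errors of the mean
    means = {g: outcomes[groups == g].mean() for g in labels}
    ses = {g: outcomes[groups == g].std(ddof=1) / np.sqrt((groups == g).sum())
           for g in labels}

    # All pairwise contrasts (covers treatment-vs-control and
    # treatment-vs-treatment comparisons)
    pairs = [(a, b) for i, a in enumerate(labels) for b in labels[i + 1:]]
    est = {(a, b): means[b] - means[a] for a, b in pairs}
    se = {(a, b): np.sqrt(ses[a] ** 2 + ses[b] ** 2) for a, b in pairs}

    # Bootstrap distribution of the max absolute studentized contrast
    max_t = np.empty(n_boot)
    for r in range(n_boot):
        bm, bs = {}, {}
        for g in labels:
            x = outcomes[groups == g]
            xb = rng.choice(x, size=x.size, replace=True)
            bm[g] = xb.mean()
            bs[g] = xb.std(ddof=1) / np.sqrt(x.size)
        ts = [abs((bm[b] - bm[a]) - est[(a, b)]) /
              np.sqrt(bs[a] ** 2 + bs[b] ** 2) for a, b in pairs]
        max_t[r] = max(ts)

    # One critical value for every interval => simultaneous coverage
    crit = np.quantile(max_t, 1 - alpha)
    return {p: (est[p] - crit * se[p], est[p] + crit * se[p]) for p in pairs}
```

Because every interval uses the same critical value taken from the distribution of the maximum, the probability that any interval misses its true contrast is controlled at roughly alpha, which is what asymptotic FWER control means here. An interval excluding zero can then be read directly off a plot as a significant difference.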

Suggested Citation

  • Brennan S. Thompson & Matthew D. Webb, 2015. "A Simple, Graphical Approach to Comparing Multiple Treatments," Working Papers 063, Toronto Metropolitan University, Department of Economics, revised Mar 2017.
  • Handle: RePEc:rye:wpaper:wp063

    Download full text from publisher

    File URL: https://www.arts.ryerson.ca/economics/repec/pdfs/wp063.pdf
    Download Restriction: no


    References listed on IDEAS

    1. Anderson, Michael L., 2008. "Multiple Inference and Gender Differences in the Effects of Early Intervention: A Reevaluation of the Abecedarian, Perry Preschool, and Early Training Projects," Journal of the American Statistical Association, American Statistical Association, vol. 103(484), pages 1481-1495.
    2. John A. List & Azeem M. Shaikh & Yang Xu, 2019. "Multiple hypothesis testing in experimental economics," Experimental Economics, Springer;Economic Science Association, vol. 22(4), pages 773-793, December.
    3. Christopher J. Bennett & Brennan S. Thompson, 2016. "Graphical Procedures for Multiple Comparisons Under General Dependence," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 111(515), pages 1278-1288, July.
    4. William C. Horrace & Peter Schmidt, 2000. "Multiple comparisons with the best, with economic applications," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 15(1), pages 1-26.
    5. Soohyung Lee & Azeem M. Shaikh, 2014. "Multiple Testing And Heterogeneous Treatment Effects: Re‐Evaluating The Effect Of Progresa On School Enrollment," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 29(4), pages 612-626, June.
    6. Dean Karlan & John A. List, 2007. "Does Price Matter in Charitable Giving? Evidence from a Large-Scale Natural Field Experiment," American Economic Review, American Economic Association, vol. 97(5), pages 1774-1793, December.
    7. Philip Oreopoulos & Daniel Lang & Joshua Angrist, 2009. "Incentives and Services for College Achievement: Evidence from a Randomized Trial," American Economic Journal: Applied Economics, American Economic Association, vol. 1(1), pages 136-163, January.
    8. Günther Fink & Margaret McConnell & Sebastian Vollmer, 2014. "Testing for heterogeneous treatment effects in experimental data: false discovery risks and correction procedures," Journal of Development Effectiveness, Taylor & Francis Journals, vol. 6(1), pages 44-57, January.
    9. Steven F. Lehrer & R. Vincent Pohl & Kyungchul Song, 2022. "Multiple Testing and the Distributional Effects of Accountability Incentives in Education," Journal of Business & Economic Statistics, Taylor & Francis Journals, vol. 40(4), pages 1552-1568, October.
    10. White, Halbert, 1980. "A Heteroskedasticity-Consistent Covariance Matrix Estimator and a Direct Test for Heteroskedasticity," Econometrica, Econometric Society, vol. 48(4), pages 817-838, May.
    11. Karthik Muralidharan & Venkatesh Sundararaman, 2011. "Teacher Performance Pay: Experimental Evidence from India," Journal of Political Economy, University of Chicago Press, vol. 119(1), pages 39-77.
    12. MacKinnon, James G. & White, Halbert, 1985. "Some heteroskedasticity-consistent covariance matrix estimators with improved finite sample properties," Journal of Econometrics, Elsevier, vol. 29(3), pages 305-325, September.
    13. Davidson, Russell & MacKinnon, James G., 2010. "Wild Bootstrap Tests for IV Regression," Journal of Business & Economic Statistics, American Statistical Association, vol. 28(1), pages 128-144.
    14. Alwyn Young, 2019. "Channeling Fisher: Randomization Tests and the Statistical Insignificance of Seemingly Significant Experimental Results," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 134(2), pages 557-598.
    15. Joseph P. Romano & Michael Wolf, 2005. "Exact and Approximate Stepdown Methods for Multiple Hypothesis Testing," Journal of the American Statistical Association, American Statistical Association, vol. 100, pages 94-108, March.
    16. Jiaying Gu & Shu Shen, 2018. "Oracle and adaptive false discovery rate controlling methods for one‐sided testing: theory and application in treatment effect evaluation," Econometrics Journal, Royal Economic Society, vol. 21(1), pages 11-35, February.
    17. Joseph P. Romano & Michael Wolf, 2005. "Stepwise Multiple Testing as Formalized Data Snooping," Econometrica, Econometric Society, vol. 73(4), pages 1237-1282, July.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. John A. List & Azeem M. Shaikh & Yang Xu, 2019. "Multiple hypothesis testing in experimental economics," Experimental Economics, Springer;Economic Science Association, vol. 22(4), pages 773-793, December.
    2. Young, Alwyn, 2019. "Channeling Fisher: randomization tests and the statistical insignificance of seemingly significant experimental results," LSE Research Online Documents on Economics 101401, London School of Economics and Political Science, LSE Library.
    3. Jeffrey D. Michler & Anna Josephson, 2022. "Recent developments in inference: practicalities for applied economics," Chapters, in: A Modern Guide to Food Economics, chapter 11, pages 235-268, Edward Elgar Publishing.
    4. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    5. Steven F. Lehrer & R. Vincent Pohl & Kyungchul Song, 2016. "Targeting Policies: Multiple Testing and Distributional Treatment Effects," NBER Working Papers 22950, National Bureau of Economic Research, Inc.
    6. Davide Viviano & Kaspar Wuthrich & Paul Niehaus, 2021. "A model of multiple hypothesis testing," Papers 2104.13367, arXiv.org, revised Jan 2025.
    7. Azevedo E Castro De Cardim,Joana & Amaro Da Costa Luz Carneiro,Pedro Manuel & Carvalho,Leandro S. & De Walque,Damien B. C. M., 2022. "Early Education, Preferences, and Decision-Making Abilities," Policy Research Working Paper Series 10187, The World Bank.
    8. John A. List & Azeem M. Shaikh & Atom Vayalinkal, 2023. "Multiple testing with covariate adjustment in experimental economics," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 38(6), pages 920-939, September.
    9. Francesco Agostinelli & Ciro Avitabile & Matteo Bobba, 2025. "Enhancing Human Capital in Children: A Case Study on Scaling," Journal of Political Economy, University of Chicago Press, vol. 133(2), pages 455-491.
    10. Steven F. Lehrer & R. Vincent Pohl & Kyungchul Song, 2022. "Multiple Testing and the Distributional Effects of Accountability Incentives in Education," Journal of Business & Economic Statistics, Taylor & Francis Journals, vol. 40(4), pages 1552-1568, October.
    11. Belot, Michèle & James, Jonathan & Spiteri, Jonathan, 2020. "Facilitating healthy dietary habits: An experiment with a low income population," European Economic Review, Elsevier, vol. 129(C).
    12. Islam, Asad & Kwon, Sungoh & Masood, Eema & Prakash, Nishith & Sabarwal, Shwetlena & Saraswat, Deepak, 2020. "When Goal-Setting Forges Ahead but Stops Short," GLO Discussion Paper Series 526, Global Labor Organization (GLO).
    13. Haoge Chang & Joel Middleton & P. M. Aronow, 2021. "Exact Bias Correction for Linear Adjustment of Randomized Controlled Trials," Papers 2110.08425, arXiv.org, revised Oct 2021.
    14. Lubega, Patrick & Nakakawa, Frances & Narciso, Gaia & Newman, Carol & Kaaya, Archileo N. & Kityo, Cissy & Tumuhimbise, Gaston A., 2021. "Body and mind: Experimental evidence from women living with HIV," Journal of Development Economics, Elsevier, vol. 150(C).
    15. Zachary Breig & Matthew Gibson & Jeffrey Shrader, 2019. "Why Do We Procrastinate? Present Bias and Optimism," Department of Economics Working Papers 2019-15, Department of Economics, Williams College.
    16. Chowdhury, Shyamal & Hasan, Syed & Sharma, Uttam, 2024. "The Role of Trainee Selection in the Effectiveness of Vocational Training: Evidence from a Randomized Controlled Trial in Nepal," IZA Discussion Papers 16705, Institute of Labor Economics (IZA).
    17. Hermes, Henning & Krauß, Marina & Lergetporer, Philipp & Peter, Frauke & Wiederhold, Simon, 2022. "Early Child Care and Labor Supply of Lower-SES Mothers: A Randomized Controlled Trial," IZA Discussion Papers 15814, Institute of Labor Economics (IZA).
    18. Cygan-Rehm, Kamila & Karbownik, Krzysztof, 2022. "The effects of incentivizing early prenatal care on infant health," Journal of Health Economics, Elsevier, vol. 83(C).
    19. Hermes, Henning & Lergetporer, Philipp & Peter, Frauke & Wiederhold, Simon, 2021. "Behavioral Barriers and the Socioeconomic Gap in Child Care Enrollment," IZA Discussion Papers 14698, Institute of Labor Economics (IZA).
    20. Chung, EunYi & Olivares, Mauricio, 2021. "Permutation test for heterogeneous treatment effects with a nuisance parameter," Journal of Econometrics, Elsevier, vol. 225(2), pages 148-174.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions; when requesting a correction, please mention this item's handle: RePEc:rye:wpaper:wp063. For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Doosoo Kim. General contact details of provider: https://edirc.repec.org/data/deryeca.html . Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.