Printed from https://ideas.repec.org/p/nbr/nberwo/27609.html

At What Level Should One Cluster Standard Errors in Paired Experiments, and in Stratified Experiments with Small Strata?

Author

Listed:
  • Clément de Chaisemartin
  • Jaime Ramirez-Cuellar

Abstract

In paired experiments, units are matched into pairs, and one unit of each pair is randomly assigned to treatment. To estimate the treatment effect, researchers often regress their outcome on a treatment indicator and pair fixed effects, clustering standard errors at the unit-of-randomization level. We show that the variance estimator in this regression may be severely downward biased: under a constant treatment effect, its expectation equals 1/2 of the true variance. Instead, we show that researchers should cluster their standard errors at the pair level. Using simulations, we show that these results extend to stratified experiments with few units per stratum.
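The abstract's factor-of-1/2 result can be illustrated with a short Monte Carlo sketch. The parameters and the closed-form variance expressions below are our own simplification (homoskedastic normal noise, constant treatment effect): with pair fixed effects, within-pair demeaning makes the residualized treatment ±1/2, so the unit-level clustered (robust) variance reduces to s/(2P²) and the pair-level clustered variance to s/P², where s is the sum of squared deviations of the within-pair outcome differences and P the number of pairs.

```python
# Sketch (hypothetical parameters) of the paper's claim: in a paired
# experiment with pair fixed effects, clustering at the unit-of-randomization
# level biases the variance estimator downward by a factor of ~1/2, while
# pair-level clustering is approximately unbiased.
import numpy as np

rng = np.random.default_rng(0)
P, tau, reps = 200, 1.0, 2000       # pairs, constant treatment effect, replications

betas, v_unit, v_pair = [], [], []
for _ in range(reps):
    mu = rng.normal(size=P)         # pair effects (absorbed by the pair FE)
    eps = rng.normal(size=(P, 2))   # unit-level noise
    y_c = mu + eps[:, 0]            # control unit in each pair
    y_t = mu + tau + eps[:, 1]      # treated unit in each pair
    d = y_t - y_c                   # within-pair differences
    b = d.mean()                    # OLS coefficient with pair fixed effects
    betas.append(b)
    s = np.sum((d - b) ** 2)
    v_unit.append(s / (2 * P**2))   # unit-level clustered variance estimator
    v_pair.append(s / P**2)         # pair-level clustered variance estimator

true_var = np.var(betas)            # simulated true variance of the estimator
print("E[V_unit] / true variance:", np.mean(v_unit) / true_var)  # close to 0.5
print("E[V_pair] / true variance:", np.mean(v_pair) / true_var)  # close to 1.0
```

Note that in this stylized setup the two estimators differ by exactly a factor of 2, because the two within-pair residuals are perfectly negatively correlated and unit-level clustering ignores that dependence.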

Suggested Citation

  • Clément de Chaisemartin & Jaime Ramirez-Cuellar, 2020. "At What Level Should One Cluster Standard Errors in Paired Experiments, and in Stratified Experiments with Small Strata?," NBER Working Papers 27609, National Bureau of Economic Research, Inc.
  • Handle: RePEc:nbr:nberwo:27609
    Note: DEV ED HE LS TWP

    Download full text from publisher

    File URL: http://www.nber.org/papers/w27609.pdf
    Download Restriction: Access to the full text is generally limited to series subscribers; however, if the top-level domain of the client browser is in a developing country or transition economy, free access is provided. More information about subscriptions and free access is available at http://www.nber.org/wwphelp.html. Free access is also available for older working papers.

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Alberto Abadie & Susan Athey & Guido Imbens & Jeffrey Wooldridge, 2017. "When Should You Adjust Standard Errors for Clustering?," Papers 1710.02926, arXiv.org, revised Oct 2017.
    2. Glewwe, Paul & Park, Albert & Zhao, Meng, 2016. "A better vision for development: Eyeglasses and academic performance in rural primary schools in China," Journal of Development Economics, Elsevier, vol. 122(C), pages 170-182.
    3. Manuela Angelucci & Dean Karlan & Jonathan Zinman, 2015. "Microcredit Impacts: Evidence from a Randomized Microcredit Program Placement Experiment by Compartamos Banco," American Economic Journal: Applied Economics, American Economic Association, vol. 7(1), pages 151-182, January.
    4. Roland G. Fryer Jr., 2017. "Management and Student Achievement: Evidence from a Randomized Field Experiment," NBER Working Papers 23437, National Bureau of Economic Research, Inc.
    5. Kate Ambler & Diego Aycinena & Dean Yang, 2015. "Channeling Remittances to Education: A Field Experiment among Migrants from El Salvador," American Economic Journal: Applied Economics, American Economic Association, vol. 7(2), pages 207-232, April.
    6. A. Colin Cameron & Douglas L. Miller, 2015. "A Practitioner’s Guide to Cluster-Robust Inference," Journal of Human Resources, University of Wisconsin Press, vol. 50(2), pages 317-372.
    7. Vincent Somville & Lore Vandewalle, 2018. "Saving by Default: Evidence from a Field Experiment in Rural India," American Economic Journal: Applied Economics, American Economic Association, vol. 10(3), pages 39-66, July.
    8. Miriam Bruhn & David McKenzie, 2009. "In Pursuit of Balance: Randomization in Practice in Development Field Experiments," American Economic Journal: Applied Economics, American Economic Association, vol. 1(4), pages 200-232, October.
    9. Rukmini Banerji & James Berry & Marc Shotland, 2017. "The Impact of Maternal Literacy and Participation Programs: Evidence from a Randomized Evaluation in India," American Economic Journal: Applied Economics, American Economic Association, vol. 9(4), pages 303-337, October.
    10. Imbens,Guido W. & Rubin,Donald B., 2015. "Causal Inference for Statistics, Social, and Biomedical Sciences," Cambridge Books, Cambridge University Press, number 9780521885881, December.
    11. Azeem M. Shaikh, 2019. "Inference in Experiments with Matched Pairs," CeMMAP working papers CWP19/19, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
    12. Jeanne Lafortune & Julio Riutort & José Tessada, 2018. "Role Models or Individual Consulting: The Impact of Personalizing Micro-entrepreneurship Training," American Economic Journal: Applied Economics, American Economic Association, vol. 10(4), pages 222-245, October.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. James G. MacKinnon & Morten Ørregaard Nielsen & Matthew D. Webb, 2020. "Testing for the appropriate level of clustering in linear regression models," Working Paper 1428, Economics Department, Queen's University.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Pedro Carneiro & Sokbae Lee & Daniel Wilhelm, 2020. "Optimal data collection for randomized control trials," Econometrics Journal, Royal Economic Society, vol. 23(1), pages 1-31.
    2. Hirschauer, Norbert & Grüner, Sven & Mußhoff, Oliver & Becker, Claudia & Jantsch, Antje, 2020. "Can p-values be meaningfully interpreted without random sampling?," EconStor Open Access Articles and Book Chapters, ZBW - Leibniz Information Centre for Economics, pages 71-91.
    3. Derksen, Laura & Leclerc, Catherine Michaud & Souza, Pedro CL, 2019. "Searching for Answers : The Impact of Student Access to Wikipedia," The Warwick Economics Research Paper Series (TWERPS) 1236, University of Warwick, Department of Economics.
    4. Morgan, Stephen N. & Mason, Nicole M. & Maredia, Mywish K., 2020. "Lead-farmer extension and smallholder valuation of new agricultural technologies in Tanzania," Food Policy, Elsevier, vol. 97(C).
    5. Liang Jiang & Xiaobin Liu & Peter C. B. Phillips & Yichong Zhang, 2020. "Bootstrap Inference for Quantile Treatment Effects in Randomized Experiments with Matched Pairs," Papers 2005.11967, arXiv.org, revised May 2021.
    6. Simon Heß, 2017. "Randomization inference with Stata: A guide and software," Stata Journal, StataCorp LP, vol. 17(3), pages 630-651, September.
    7. Derksen, Laura & Leclerc, Catherine Michaud & Souza, Pedro CL, 2019. "Searching for Answers: The Impact of Student Access to Wikipedia," CAGE Online Working Paper Series 450, Competitive Advantage in the Global Economy (CAGE).
    8. Michael Pollmann, 2020. "Causal Inference for Spatial Treatments," Papers 2011.00373, arXiv.org.
    9. Alberto Abadie & Susan Athey & Guido W. Imbens & Jeffrey M. Wooldridge, 2020. "Sampling‐Based versus Design‐Based Uncertainty in Regression Analysis," Econometrica, Econometric Society, vol. 88(1), pages 265-296, January.
    10. Andrés Elberg & Pedro M. Gardete & Rosario Macera & Carlos Noton, 2019. "Dynamic effects of price promotions: field evidence, consumer search, and supply-side implications," Quantitative Marketing and Economics (QME), Springer, vol. 17(1), pages 1-58, March.
    11. Suresh de Mel & David McKenzie & Christopher Woodruff, 2019. "Labor Drops: Experimental Evidence on the Return to Additional Labor in Microenterprises," American Economic Journal: Applied Economics, American Economic Association, vol. 11(1), pages 202-235, January.
    12. Beaman, Lori & Karlan, Dean S. & Thuysbaert, Bram, 2014. "Saving for a (not so) Rainy Day: A Randomized Evaluation of Savings Groups in Mali," Center Discussion Papers 187189, Yale University, Economic Growth Center.
    13. Aufenanger, Tobias, 2017. "Machine learning to improve experimental design," FAU Discussion Papers in Economics 16/2017, Friedrich-Alexander University Erlangen-Nuremberg, Institute for Economics.
    14. Patrizia Lattarulo & Marco Mariani & Laura Razzolini, 2017. "Nudging museums attendance: a field experiment with high school teens," Journal of Cultural Economics, Springer;The Association for Cultural Economics International, vol. 41(3), pages 259-277, August.
    15. Benjamin L. Collier & Andrew F. Haughwout & Howard C. Kunreuther & Erwann O. Michel‐Kerjan, 2020. "Firms’ Management of Infrequent Shocks," Journal of Money, Credit and Banking, Blackwell Publishing, vol. 52(6), pages 1329-1359, September.
    16. Cipullo, Davide & Reslow, André, 2019. "Biased Forecasts to Affect Voting Decisions? The Brexit Case," Working Paper Series 2019:4, Uppsala University, Department of Economics.
    17. Sebastian Calonico & Matias D. Cattaneo & Max H. Farrell & Rocío Titiunik, 2019. "Regression Discontinuity Designs Using Covariates," The Review of Economics and Statistics, MIT Press, vol. 101(3), pages 442-451, July.
    18. Samuel Nocito & Marcello Sartarelli & Francesco Sobbrio, 2021. "A Beam of Light: Media, Tourism & Economic Development," CESifo Working Paper Series 9055, CESifo.
    19. Sven Resnjanskij & Jens Ruhose & Simon Wiederhold & Ludger Woessmann, 2021. "Can Mentoring Alleviate Family Disadvantage in Adolescence? A Field Experiment to Improve Labor-Market Prospects," CESifo Working Paper Series 8870, CESifo.
    20. Martinez, Isabel Z., 2016. "Beggar-Thy-Neighbour Tax Cuts: Mobility after a Local Income and Wealth Tax Reform in Switzerland," VfS Annual Conference 2016 (Augsburg): Demographic Change 145643, Verein für Socialpolitik / German Economic Association.

    More about this item

    JEL classification:

    • C01 - Mathematical and Quantitative Methods - - General - - - Econometrics
    • C21 - Mathematical and Quantitative Methods - - Single Equation Models; Single Variables - - - Cross-Sectional Models; Spatial Models; Treatment Effect Models

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:nbr:nberwo:27609. See general information about how to correct material in RePEc.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: (email available below). General contact details of provider: https://edirc.repec.org/data/nberrus.html .

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item, and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form .

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis . RePEc uses bibliographic data supplied by the respective publishers.