
Rerandomization to Balance Tiers of Covariates


  • Kari Lock Morgan
  • Donald B. Rubin


When a randomized experiment yields treatment groups that differ meaningfully with respect to relevant covariates, the allocation should be rerandomized. The process involves specifying an explicit criterion for whether an allocation is acceptable, based on a measure of covariate balance, and rerandomizing units until an acceptable allocation is obtained. We illustrate how rerandomization could have improved the design of an already conducted randomized experiment on vocabulary and mathematics training programs, then provide a rerandomization procedure for covariates that vary in importance, and finally offer other extensions of rerandomization, including methods addressing computational efficiency. When covariates vary in a priori importance, better balance should be required for the more important covariates. Rerandomization based on Mahalanobis distance preserves the joint distribution of covariates but balances all covariates equally, so we propose rerandomizing based on Mahalanobis distance within tiers of covariate importance. Because balancing covariates in one tier will in general also partially balance covariates in other tiers, for each subsequent tier we explicitly balance only the components orthogonal to covariates in more important tiers.
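The basic procedure the abstract describes — redraw the treatment assignment until a Mahalanobis-distance balance criterion is satisfied — can be sketched as follows. This is a minimal illustration, not the authors' code: the covariate matrix `X`, the acceptability threshold `a`, and the helper names `mahalanobis_balance` and `rerandomize` are all assumptions introduced here for exposition.

```python
import numpy as np

def mahalanobis_balance(X, w):
    """Mahalanobis distance between treated and control covariate means."""
    xt = X[w == 1].mean(axis=0)
    xc = X[w == 0].mean(axis=0)
    diff = xt - xc
    cov_inv = np.linalg.pinv(np.cov(X, rowvar=False))
    n1, n0 = (w == 1).sum(), (w == 0).sum()
    scale = n1 * n0 / (n1 + n0)  # accounts for the sizes of the two groups
    return scale * diff @ cov_inv @ diff

def rerandomize(X, n_treated, a, rng, max_draws=100_000):
    """Redraw complete randomizations until the criterion M <= a is met."""
    n = X.shape[0]
    w = np.zeros(n, dtype=int)
    for _ in range(max_draws):
        w[:] = 0
        w[rng.choice(n, size=n_treated, replace=False)] = 1
        if mahalanobis_balance(X, w) <= a:
            return w
    raise RuntimeError("no acceptable allocation found; consider raising `a`")

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))  # 100 units, 4 covariates
w = rerandomize(X, n_treated=50, a=1.0, rng=rng)
```

The tiered extension proposed in the paper would apply this same acceptability check tier by tier, requiring a stricter threshold for more important covariates and, for each subsequent tier, checking only the components orthogonal to the covariates in more important tiers.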

Suggested Citation

  • Kari Lock Morgan & Donald B. Rubin, 2015. "Rerandomization to Balance Tiers of Covariates," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 110(512), pages 1412-1421, December.
  • Handle: RePEc:taf:jnlasa:v:110:y:2015:i:512:p:1412-1421
    DOI: 10.1080/01621459.2015.1079528

    Download full text from publisher

    Download Restriction: Access to full text is restricted to subscribers. As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Shadish, William R. & Clark, M. H. & Steiner, Peter M., 2008. "Can Nonrandomized Experiments Yield Accurate Answers? A Randomized Experiment Comparing Random and Nonrandom Assignments," Journal of the American Statistical Association, American Statistical Association, vol. 103(484), pages 1334-1344.
    2. Rubin, Donald B., 2008. "Comment: The Design and Analysis of Gold Standard Randomized Experiments," Journal of the American Statistical Association, American Statistical Association, vol. 103(484), pages 1350-1353.
    3. D. R. Cox, 2009. "Randomization in the Design of Experiments," International Statistical Review, International Statistical Institute, vol. 77(3), pages 415-429, December.


    Citations are extracted by the CitEc Project.

    Cited by:

    1. Hengtao Zhang & Guosheng Yin, 2021. "Response‐adaptive rerandomization," Journal of the Royal Statistical Society Series C, Royal Statistical Society, vol. 70(5), pages 1281-1298, November.
    2. Pedro Carneiro & Sokbae Lee & Daniel Wilhelm, 2020. "Optimal data collection for randomized control trials [Microcredit impacts: Evidence from a randomized microcredit program placement experiment by Compartamos Banco]," The Econometrics Journal, Royal Economic Society, vol. 23(1), pages 1-31.
    3. James J. Heckman & Ganesh Karapakula, 2019. "The Perry Preschoolers at Late Midlife: A Study in Design-Specific Inference," Working Papers 2019-034, Human Capital and Economic Opportunity Working Group.
    4. Nicole E. Pashley & Luke W. Miratrix, 2022. "Block What You Can, Except When You Shouldn’t," Journal of Educational and Behavioral Statistics, , vol. 47(1), pages 69-100, February.
    5. Baosheng Liang & Peng Wu & Xingwei Tong & Yanping Qiu, 2020. "Regression and subgroup detection for heterogeneous samples," Computational Statistics, Springer, vol. 35(4), pages 1853-1878, December.
    6. James J Heckman & Ganesh Karapakula, 2021. "Using a satisficing model of experimenter decision-making to guide finite-sample inference for compromised experiments [Sampling-based versus design-based uncertainty in regression analysis]," The Econometrics Journal, Royal Economic Society, vol. 24(2), pages 1-39.
    7. Sylvain Chassang & Rong Feng, 2020. "The Cost of Imbalance in Clinical Trials," Working Papers 2020-12, Princeton University. Economics Department..
    8. Quan Zhou & Philip A Ernst & Kari Lock Morgan & Donald B Rubin & Anru Zhang, 2018. "Sequential rerandomization," Biometrika, Biometrika Trust, vol. 105(3), pages 745-752.
    9. Yuehao Bai, 2022. "Optimality of Matched-Pair Designs in Randomized Controlled Trials," Papers 2206.07845,
    10. Yves Tillé, 2022. "Some Solutions Inspired by Survey Sampling Theory to Build Effective Clinical Trials," International Statistical Review, International Statistical Institute, vol. 90(3), pages 481-498, December.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Zhenzhen Xu & John D. Kalbfleisch, 2013. "Repeated Randomization and Matching in Multi-Arm Trials," Biometrics, The International Biometric Society, vol. 69(4), pages 949-959, December.
    2. Yasemin Kisbu-Sakarya & Thomas D. Cook & Yang Tang & M. H. Clark, 2018. "Comparative Regression Discontinuity: A Stress Test With Small Samples," Evaluation Review, , vol. 42(1), pages 111-143, February.
    3. Goldberg, Matthew H., 2019. "How often does random assignment fail? Estimates and recommendations," OSF Preprints s2j4r, Center for Open Science.
    4. Heissel, Jennifer, 2016. "The relative benefits of live versus online delivery: Evidence from virtual algebra I in North Carolina," Economics of Education Review, Elsevier, vol. 53(C), pages 99-115.
    5. Katherine Baicker & Theodore Svoronos, 2019. "Testing the Validity of the Single Interrupted Time Series Design," NBER Working Papers 26080, National Bureau of Economic Research, Inc.
    6. Deborah Peikes & Grace Anglin & Erin Fries Taylor & Stacy Dale & Ann O'Malley & Arkadipta Ghosh & Kaylyn Swankoski & Lara Converse & Rosalind Keith & Mariel Finucane & Jesse Crosson & Anne Mutti & Tho, "undated". "Evaluation of the Comprehensive Primary Care Initiative: Third Annual Report," Mathematica Policy Research Reports 70714de1cb3d4620a5957f68d, Mathematica Policy Research.
    7. Nikolova, Milena & Graham, Carol, 2015. "In transit: The well-being of migrants from transition and post-transition countries," Journal of Economic Behavior & Organization, Elsevier, vol. 112(C), pages 164-186.
    8. Colin Cannonier, 2009. "State Abstinence Education Programs and Teen Fertility in the U.S.," Departmental Working Papers 2009-14, Department of Economics, Louisiana State University.
    9. Terri J. Sabol & Robert C. Pianta, 2014. "Do Standard Measures of Preschool Quality Used in Statewide Policy Predict School Readiness?," Education Finance and Policy, MIT Press, vol. 9(2), pages 116-164, March.
    10. Ethan J. Raker, 2020. "Natural Hazards, Disasters, and Demographic Change: The Case of Severe Tornadoes in the United States, 1980–2010," Demography, Springer;Population Association of America (PAA), vol. 57(2), pages 653-674, April.
    11. Xiaokang Luo & Tirthankar Dasgupta & Minge Xie & Regina Y. Liu, 2021. "Leveraging the Fisher randomization test using confidence distributions: Inference, combination and fusion learning," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 83(4), pages 777-797, September.
    12. Kenneth Fortson & Natalya Verbitsky-Savitz & Emma Kopa & Philip Gleason, 2012. "Using an Experimental Evaluation of Charter Schools to Test Whether Nonexperimental Comparison Group Methods Can Replicate Experimental Impact Estimates," Mathematica Policy Research Reports 27f871b5b7b94f3a80278a593, Mathematica Policy Research.
    13. Rezaei, Ehsan Eyshi & Gaiser, Thomas, 2017. "Change in crop management strategies could double the maize yield in Africa," Discussion Papers 260154, University of Bonn, Center for Development Research (ZEF).
    14. Jean Stockard, 2013. "Merging the accountability and scientific research requirements of the No Child Left Behind Act: using cohort control groups," Quality & Quantity: International Journal of Methodology, Springer, vol. 47(4), pages 2225-2257, June.
    15. Paul Burkander & Nan Maxwell & Menbere Shiferaw & Matt Jacobus & Alma Vigil & Charles Tilley & Alicia Harrington & Erin Dillon & Hande Inanc & Peter Schochet, "undated". "Building College and Career Pathways for High School Students: Youth CareerConnect, Technical Report for the Impact Study," Mathematica Policy Research Reports c0b91a97d65342eb82cf8eb7f, Mathematica Policy Research.
    16. Mathur, Maya B & VanderWeele, Tyler, 2021. "Methods to address confounding and other biases in meta-analyses: Review and recommendations," OSF Preprints v7dtq, Center for Open Science.
    17. Ferraro, Paul J. & Miranda, Juan José, 2014. "The performance of non-experimental designs in the evaluation of environmental programs: A design-replication study using a large-scale randomized experiment as a benchmark," Journal of Economic Behavior & Organization, Elsevier, vol. 107(PA), pages 344-365.
    18. Aloyce R. Kaliba & Anne G. Gongwe & Kizito Mazvimavi & Ashagre Yigletu, 2021. "Impact of Adopting Improved Seeds on Access to Broader Food Groups Among Small-Scale Sorghum Producers in Tanzania," SAGE Open, , vol. 11(1), pages 21582440209, January.
    19. Hoang, Anh Tuan & Sandro Nižetić & Olcer, Aykut I. & Ong, Hwai Chyuan & Chen, Wei-Hsin & Chong, Cheng Tung & Thomas, Sabu & Bandh, Suhaib A. & Nguyen, Xuan Phuong, 2021. "Impacts of COVID-19 pandemic on the global energy system and the shift progress to renewable energy: Opportunities, challenges, and policy implications," Energy Policy, Elsevier, vol. 154(C).
    20. Samantha Marie Schenck, 2021. "Assessing the Employment Effects of California’s Paid Family Leave Program," Eastern Economic Journal, Palgrave Macmillan;Eastern Economic Association, vol. 47(3), pages 406-429, June.

    IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis. RePEc uses bibliographic data supplied by the respective publishers.