
Statistical Power for the Comparative Regression Discontinuity Design With a Pretest No-Treatment Control Function: Theory and Evidence From the National Head Start Impact Study

Authors
  • Yang Tang
  • Thomas D. Cook

Abstract

The basic regression discontinuity design (RDD) has less statistical power than a randomized controlled trial (RCT) with the same sample size. Adding a no-treatment comparison function to the basic RDD creates a comparative RDD (CRD); when this function comes from the pretest value of the study outcome, the result is a CRD-Pre design. We use a within-study comparison (WSC) to examine the power of CRD-Pre relative to both the basic RDD and the RCT. We first build the theoretical foundation for power in CRD-Pre, then derive the relevant variance formulae, and finally compare them to the theoretical RCT variance. From this theoretical analysis we conclude that (1) CRD-Pre’s power gain depends on the partial correlation between the pretest and posttest measures after conditioning on the assignment variable, (2) CRD-Pre is less sensitive than the basic RDD to how the assignment variable is distributed and where the cutoff is located, and (3) under a variety of conditions, the efficiency of CRD-Pre is very close to that of the RCT. Data from the National Head Start Impact Study are then used to construct RCT, RDD, and CRD-Pre designs and to compare their power. The empirical results indicate (1) a high level of correspondence between the predicted and obtained power results for RDD and CRD-Pre relative to the RCT, and (2) power levels in CRD-Pre and RCT that are very close. The study is unique among WSCs for its focus on the correspondence between RCT and observational study standard errors rather than means.
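The mechanism behind the abstract's first conclusion can be illustrated with a small Monte Carlo sketch. This is not the authors' code, and the data-generating process below (a linear outcome model, a cutoff at zero, and coefficient values chosen for illustration) is an assumption; it simply shows why conditioning on a pretest that is strongly correlated with the posttest, given the assignment variable, shrinks the standard error of the basic RDD estimate toward that of a covariate-adjusted RCT. The paper's exact efficiency comparisons come from its derived variance formulae, not from this simulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def ols_coef(covariates, y):
    """OLS with an intercept; returns the coefficient on the first
    covariate, which is always the treatment indicator below."""
    X = np.column_stack([np.ones(len(y))] + list(covariates))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

def simulate(n=2000, effect=0.5, reps=400):
    """Empirical mean and standard error of the treatment-effect
    estimate under three designs: covariate-adjusted RCT, basic
    sharp RDD, and CRD-Pre (RDD plus the pretest as a control)."""
    est = {"RCT": [], "RDD": [], "CRD-Pre": []}
    for _ in range(reps):
        z = rng.normal(size=n)               # assignment variable
        pre = 0.5 * z + rng.normal(size=n)   # pretest, correlated with z
        eps = 0.3 * rng.normal(size=n)
        # Sharp RDD: treatment determined by a cutoff at z = 0
        t_rdd = (z >= 0).astype(float)
        y_rdd = effect * t_rdd + 0.3 * z + pre + eps
        # RCT: same outcome model, but treatment assigned at random
        t_rct = rng.integers(0, 2, size=n).astype(float)
        y_rct = effect * t_rct + 0.3 * z + pre + eps
        est["RCT"].append(ols_coef([t_rct, z, pre], y_rct))
        est["RDD"].append(ols_coef([t_rdd, z], y_rdd))
        est["CRD-Pre"].append(ols_coef([t_rdd, z, pre], y_rdd))
    return {k: (np.mean(v), np.std(v)) for k, v in est.items()}
```

Under this setup the basic RDD has the largest standard error (the treatment indicator is highly collinear with the assignment variable), and adding the pretest removes most of the residual outcome variance, moving CRD-Pre's precision substantially closer to the RCT's.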

Suggested Citation

  • Yang Tang & Thomas D. Cook, 2018. "Statistical Power for the Comparative Regression Discontinuity Design With a Pretest No-Treatment Control Function: Theory and Evidence From the National Head Start Impact Study," Evaluation Review, vol. 42(1), pages 71-110, February.
  • Handle: RePEc:sae:evarev:v:42:y:2018:i:1:p:71-110
    DOI: 10.1177/0193841X18776117

    Download full text from publisher

    File URL: https://journals.sagepub.com/doi/10.1177/0193841X18776117
    Download Restriction: no



    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Kettlewell, Nathan & Siminski, Peter, 2020. "Optimal Model Selection in RDD and Related Settings Using Placebo Zones," IZA Discussion Papers 13639, Institute of Labor Economics (IZA).
    2. Philip Gleason & Alexandra Resch & Jillian Berk, 2018. "RD or Not RD: Using Experimental Studies to Assess the Performance of the Regression Discontinuity Approach," Evaluation Review, vol. 42(1), pages 3-33, February.
    3. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    4. Maria Knoth Humlum & Rune Majlund Vejlin, 2013. "The Responses Of Youth To A Cash Transfer Conditional On Schooling: A Quasi‐Experimental Study," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 28(4), pages 628-649, June.
    5. Porter, Jack & Yu, Ping, 2015. "Regression discontinuity designs with unknown discontinuity points: Testing and estimation," Journal of Econometrics, Elsevier, vol. 189(1), pages 132-147.
    6. Daniel Muller & Lionel Page, 2016. "Born leaders: political selection and the relative age effect in the US Congress," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 179(3), pages 809-829, June.
    7. Marinho Bertanha & Guido W. Imbens, 2020. "External Validity in Fuzzy Regression Discontinuity Designs," Journal of Business & Economic Statistics, Taylor & Francis Journals, vol. 38(3), pages 593-612, July.
    8. Barrera-Osorio, Felipe & Raju, Dhushyanth, 2011. "Evaluating public per-student subsidies to low-cost private schools: regression-discontinuity evidence from Pakistan," Policy Research Working Paper Series 5638, The World Bank.
    9. de Lazzer, Jakob, 2016. "Non-monotonic Selection Issues in Electoral Regression Discontinuity Designs," VfS Annual Conference 2016 (Augsburg): Demographic Change 145845, Verein für Socialpolitik / German Economic Association.
    10. Vivian C. Wong & Peter M. Steiner & Kylie L. Anglin, 2018. "What Can Be Learned From Empirical Evaluations of Nonexperimental Methods?," Evaluation Review, vol. 42(2), pages 147-175, April.
    11. Chiang, Hanley, 2009. "How accountability pressure on failing schools affects student achievement," Journal of Public Economics, Elsevier, vol. 93(9-10), pages 1045-1057, October.
    12. Yoichi Arai & Hidehiko Ichimura, 2018. "Simultaneous selection of optimal bandwidths for the sharp regression discontinuity estimator," Quantitative Economics, Econometric Society, vol. 9(1), pages 441-482, March.
    13. Yoichi Arai & Hidehiko Ichimura, 2013. "Optimal Bandwidth Selection for Differences of Nonparametric Estimators with an Application to the Sharp Regression Discontinuity Design," CIRJE F-Series CIRJE-F-889, CIRJE, Faculty of Economics, University of Tokyo.
    14. David S. Lee & Thomas Lemieux, 2010. "Regression Discontinuity Designs in Economics," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 281-355, June.
    15. Ari Hyytinen & Jaakko Meriläinen & Tuukka Saarimaa & Otto Toivanen & Janne Tukiainen, 2018. "When does regression discontinuity design work? Evidence from random election outcomes," Quantitative Economics, Econometric Society, vol. 9(2), pages 1019-1051, July.
    16. Matias D. Cattaneo & Luke Keele & Rocío Titiunik & Gonzalo Vazquez-Bare, 2021. "Extrapolating Treatment Effects in Multi-Cutoff Regression Discontinuity Designs," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 116(536), pages 1941-1952, October.
    17. Miron Tequame & Nyasha Tirivayi, 2015. "Higher education and fertility: Evidence from a natural experiment in Ethiopia," CINCH Working Paper Series 1509, Universitaet Duisburg-Essen, Competent in Competition and Health, revised Aug 2015.
    18. Hong Kai, 2017. "School Bond Referendum, Capital Expenditure, and Student Achievement," The B.E. Journal of Economic Analysis & Policy, De Gruyter, vol. 17(4), pages 1-26, October.
    19. Imbens, Guido W. & Lemieux, Thomas, 2008. "Regression discontinuity designs: A guide to practice," Journal of Econometrics, Elsevier, vol. 142(2), pages 615-635, February.
    20. Benjamin M. Marx & Lesley J. Turner, 2015. "Borrowing Trouble? Student Loans, the Cost of Borrowing, and Implications for the Effectiveness of Need-Based Grant Aid," NBER Working Papers 20850, National Bureau of Economic Research, Inc.
