
Can Propensity Score Analysis Replicate Estimates Based on Random Assignment in Evaluations of School Choice? A Within-Study Comparison

Author

Robert Bifulco (Center for Policy Research, Maxwell School, Syracuse University)

Abstract

The ability of propensity score analysis (PSA) to match impact estimates derived from random assignment (RA) is examined using data from the evaluation of two interdistrict magnet schools. As in previous within-study comparisons, the estimates provided by PSA and RA differ substantially when PSA is implemented using comparison groups that are not similar to the treatment group and without pretreatment measures of academic performance. Adding pretreatment measures of academic performance to the PSA, however, substantially improves the match between PSA and RA estimates. Although the results should not be generalized too readily, they suggest that nonexperimental estimators can, in some circumstances, provide valid estimates of the causal impact of school choice programs.
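To make the abstract's mechanism concrete, here is a minimal simulation sketch in Python. It is not the paper's data, specification, or code: the data-generating process, variable names, and coefficients are invented assumptions. A known treatment effect stands in for the random-assignment benchmark, and propensity score matching is run twice, once on demographics alone and once adding a pretest score, to illustrate why the pretreatment achievement measure matters.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 4000
true_effect = 0.25  # stands in for the random-assignment (RA) benchmark

# Invented student data: a demographic index and a pretest score.
demog = rng.normal(size=n)
pretest = 0.6 * demog + rng.normal(size=n)

# Selection into the program depends on prior achievement.
p_treat = 1.0 / (1.0 + np.exp(-(0.3 * demog + 0.8 * pretest)))
treated = rng.binomial(1, p_treat)

# Posttest outcome.
posttest = 0.4 * demog + 0.7 * pretest + true_effect * treated + rng.normal(size=n)

def psa_estimate(X):
    # 1-nearest-neighbor matching of treated units to controls on the
    # estimated propensity score; returns the mean matched difference.
    ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]
    t = np.flatnonzero(treated == 1)
    c = np.flatnonzero(treated == 0)
    nearest = c[np.abs(ps[t][:, None] - ps[c][None, :]).argmin(axis=1)]
    return float(np.mean(posttest[t] - posttest[nearest]))

print("RA benchmark:               ", true_effect)
print("PSA, demographics only:     ", round(psa_estimate(demog[:, None]), 3))
print("PSA, demographics + pretest:", round(psa_estimate(np.column_stack([demog, pretest])), 3))

Because selection into treatment depends on prior achievement, the demographics-only match overstates the effect; conditioning the propensity score on the pretest brings the matching estimate much closer to the benchmark, mirroring the pattern the abstract describes.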

Suggested Citation

  • Robert Bifulco, 2010. "Can Propensity Score Analysis Replicate Estimates Based on Random Assignment in Evaluations of School Choice? A Within-Study Comparison," Center for Policy Research Working Papers 124, Center for Policy Research, Maxwell School, Syracuse University.
  • Handle: RePEc:max:cprwps:124

    Download full text from publisher

    File URL: https://surface.syr.edu/cpr/167/
    Download Restriction: no
    ---><---

    References listed on IDEAS

    1. Elizabeth Ty Wilde & Robinson Hollister, 2007. "How close is close enough? Evaluating propensity score matching using data from a class size reduction experiment," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 26(3), pages 455-477.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Kenneth Fortson & Natalya Verbitsky-Savitz & Emma Kopa & Philip Gleason, 2012. "Using an Experimental Evaluation of Charter Schools to Test Whether Nonexperimental Comparison Group Methods Can Replicate Experimental Impact Estimates," Mathematica Policy Research Reports 27f871b5b7b94f3a80278a593, Mathematica Policy Research.
    2. Ron Zimmer & Brian Gill & Jonathon Attridge & Kaitlin Obenauf, 2014. "Charter School Authorizers and Student Achievement," Education Finance and Policy, MIT Press, vol. 9(1), pages 59-85, January.
    3. Ron Zimmer & Brian Gill & Jonathon Attridge & Kaitlin Obenauf, "undated". "Charter School Authorizers and Student Achievement (Journal Article)," Mathematica Policy Research Reports 6e4664294f7341868c9a78142, Mathematica Policy Research.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Gonzalo Nunez-Chaim & Henry G. Overman & Capucine Riom, 2024. "Does subsidising business advice improve firm performance? Evidence from a large RCT," CEP Discussion Papers dp1977, Centre for Economic Performance, LSE.
    2. Kenneth Fortson & Natalya Verbitsky-Savitz & Emma Kopa & Philip Gleason, 2012. "Using an Experimental Evaluation of Charter Schools to Test Whether Nonexperimental Comparison Group Methods Can Replicate Experimental Impact Estimates," Mathematica Policy Research Reports 27f871b5b7b94f3a80278a593, Mathematica Policy Research.
    3. Luke Byrne Willard, 2012. "Does inflation targeting matter? A reassessment," Applied Economics, Taylor & Francis Journals, vol. 44(17), pages 2231-2244, June.
    4. Ferraro, Paul J. & Miranda, Juan José, 2014. "The performance of non-experimental designs in the evaluation of environmental programs: A design-replication study using a large-scale randomized experiment as a benchmark," Journal of Economic Behavior & Organization, Elsevier, vol. 107(PA), pages 344-365.
    5. Nianbo Dong & Mark W. Lipsey, 2018. "Can Propensity Score Analysis Approximate Randomized Experiments Using Pretest and Demographic Information in Pre-K Intervention Research?," Evaluation Review, , vol. 42(1), pages 34-70, February.
    6. David A. Freedman & Richard A. Berk, 2008. "Weighting Regressions by Propensity Scores," Evaluation Review, , vol. 32(4), pages 392-409, August.
    7. David A. Freedman, 2009. "Limits of Econometrics," International Econometric Review (IER), Econometric Research Association, vol. 1(1), pages 5-17, April.
    8. Nianbo Dong & Elizabeth A. Stuart & David Lenis & Trang Quynh Nguyen, 2020. "Using Propensity Score Analysis of Survey Data to Estimate Population Average Treatment Effects: A Case Study Comparing Different Methods," Evaluation Review, , vol. 44(1), pages 84-108, February.
    9. Kenneth Fortson & Philip Gleason & Emma Kopa & Natalya Verbitsky-Savitz, "undated". "Horseshoes, Hand Grenades, and Treatment Effects? Reassessing Bias in Nonexperimental Estimators," Mathematica Policy Research Reports 1c24988cd5454dd3be51fbc2c, Mathematica Policy Research.
    10. David J. Harding & Lisa Sanbonmatsu & Greg J. Duncan & Lisa A. Gennetian & Lawrence F. Katz & Ronald C. Kessler & Jeffrey R. Kling & Matthew Sciandra & Jens Ludwig, 2023. "Evaluating Contradictory Experimental and Nonexperimental Estimates of Neighborhood Effects on Economic Outcomes for Adults," Housing Policy Debate, Taylor & Francis Journals, vol. 33(2), pages 453-486, March.
    11. Katz, Lawrence & Duncan, Greg J. & Kling, Jeffrey R. & Kessler, Ronald C. & Ludwig, Jens & Sanbonmatsu, Lisa & Liebman, Jeffrey B., 2008. "What Can We Learn about Neighborhood Effects from the Moving to Opportunity Experiment?," Scholarly Articles 2766959, Harvard University Department of Economics.
    12. Jared Coopersmith & Thomas D. Cook & Jelena Zurovac & Duncan Chaplin & Lauren V. Forrow, 2022. "Internal And External Validity Of The Comparative Interrupted Time‐Series Design: A Meta‐Analysis," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 41(1), pages 252-277, January.
    13. Fatih Unlu & Douglas Lee Lauen & Sarah Crittenden Fuller & Tiffany Berglund & Elc Estrera, 2021. "Can Quasi‐Experimental Evaluations That Rely On State Longitudinal Data Systems Replicate Experimental Results?," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 40(2), pages 572-613, March.
    14. Fortson, Kenneth & Gleason, Philip & Kopa, Emma & Verbitsky-Savitz, Natalya, 2015. "Horseshoes, hand grenades, and treatment effects? Reassessing whether nonexperimental estimators are biased," Economics of Education Review, Elsevier, vol. 44(C), pages 100-113.
    15. Barnow, Burt S., 2010. "Setting up social experiments: the good, the bad, and the ugly," Zeitschrift für ArbeitsmarktForschung - Journal for Labour Market Research, Institut für Arbeitsmarkt- und Berufsforschung (IAB), Nürnberg [Institute for Employment Research, Nuremberg, Germany], vol. 43(2), pages 91-105.
    16. Katherine Baicker & Theodore Svoronos, 2019. "Testing the Validity of the Single Interrupted Time Series Design," CID Working Papers 364, Center for International Development at Harvard University.
    17. Andrew P. Jaciw, 2016. "Applications of a Within-Study Comparison Approach for Evaluating Bias in Generalized Causal Inferences From Comparison Groups Studies," Evaluation Review, , vol. 40(3), pages 241-276, June.
    18. Katherine Baicker & Theodore Svoronos, 2019. "Testing the Validity of the Single Interrupted Time Series Design," NBER Working Papers 26080, National Bureau of Economic Research, Inc.
    19. Ben Weidmann & Luke Miratrix, 2021. "Lurking Inferential Monsters? Quantifying Selection Bias In Evaluations Of School Programs," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 40(3), pages 964-986, June.

    More about this item

    Keywords

    Nonexperimental; quasi-experimental; propensity score analysis; design replication; school choice.

    JEL classification:

    • C21 - Mathematical and Quantitative Methods - - Single Equation Models; Single Variables - - - Cross-Sectional Models; Spatial Models; Treatment Effect Models
    • I21 - Health, Education, and Welfare - - Education - - - Analysis of Education
    • C90 - Mathematical and Quantitative Methods - - Design of Experiments - - - General
    • I28 - Health, Education, and Welfare - - Education - - - Government Policy
