
Horseshoes, hand grenades, and treatment effects? Reassessing whether nonexperimental estimators are biased

Author

Listed:
  • Fortson, Kenneth
  • Gleason, Philip
  • Kopa, Emma
  • Verbitsky-Savitz, Natalya

Abstract

Randomized controlled trials (RCTs) are considered the gold standard in estimating treatment effects. When an RCT is infeasible, regression modeling or statistical matching are often used instead. Nonexperimental methods such as these could produce unbiased estimates if the underlying assumptions hold, but those assumptions are usually not testable. Most prior studies testing nonexperimental designs find that they fail to produce unbiased estimates, but these studies have examined weaker evaluation designs. The present study addresses these limitations using student-level data based on a large-scale RCT of charter schools for which standardized achievement tests are the key outcome measure. The use of baseline data that are strongly predictive of the key outcome measures considerably reduces but might not completely eliminate bias.
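
The within-study comparison the abstract describes has a simple computational core: estimate the benchmark treatment effect from the randomized lottery sample, re-estimate it with nonexperimental methods (regression adjustment on baseline scores, propensity-score matching) applied to treated students and a non-randomized comparison group, and read the difference as the estimator's bias. The sketch below is a minimal simulated illustration of that logic, not the authors' code or data; the data-generating process, the variable names, and the scikit-learn matching step are all assumptions made for illustration.

```python
# Hypothetical within-study comparison on simulated data (illustrative only).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
n = 2000

# --- Experimental benchmark: lottery-based RCT ----------------------------
baseline = rng.normal(0.0, 1.0, n)            # baseline test score (z-score)
treat = rng.integers(0, 2, n)                 # random lottery assignment
true_effect = 0.15
outcome = 0.7 * baseline + true_effect * treat + rng.normal(0.0, 0.5, n)
experimental = outcome[treat == 1].mean() - outcome[treat == 0].mean()

# --- Nonexperimental setup: treated students pooled with a non-randomized -
# --- comparison group whose baseline achievement differs ------------------
m = 3000
base_c = rng.normal(-0.3, 1.0, m)             # comparison students, lower baseline
out_c = 0.7 * base_c + rng.normal(0.0, 0.5, m)

X = np.concatenate([baseline[treat == 1], base_c]).reshape(-1, 1)
D = np.concatenate([np.ones((treat == 1).sum()), np.zeros(m)])
Y = np.concatenate([outcome[treat == 1], out_c])

# (1) Regression adjustment: OLS of outcome on treatment and baseline score
design = np.column_stack([np.ones_like(D), D, X.ravel()])
coef, *_ = np.linalg.lstsq(design, Y, rcond=None)
ols_estimate = coef[1]

# (2) Propensity-score matching: each treated student matched to the nearest
#     comparison student on the estimated propensity score
pscore = LogisticRegression().fit(X, D).predict_proba(X)[:, 1]
treated_idx = np.where(D == 1)[0]
control_idx = np.where(D == 0)[0]
nn = NearestNeighbors(n_neighbors=1).fit(pscore[control_idx].reshape(-1, 1))
_, match = nn.kneighbors(pscore[treated_idx].reshape(-1, 1))
psm_estimate = (Y[treated_idx] - Y[control_idx[match.ravel()]]).mean()

# Bias = nonexperimental estimate minus the experimental benchmark
print(f"experimental benchmark: {experimental:.3f}")
print(f"OLS bias:               {ols_estimate - experimental:+.3f}")
print(f"matching bias:          {psm_estimate - experimental:+.3f}")
```

Because selection into the simulated comparison group operates only through the observed baseline score, both nonexperimental estimates land close to the benchmark here; the paper's empirical point is analogous, namely that strongly predictive baseline data considerably reduce, but need not eliminate, bias.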

Suggested Citation

  • Fortson, Kenneth & Gleason, Philip & Kopa, Emma & Verbitsky-Savitz, Natalya, 2015. "Horseshoes, hand grenades, and treatment effects? Reassessing whether nonexperimental estimators are biased," Economics of Education Review, Elsevier, vol. 44(C), pages 100-113.
  • Handle: RePEc:eee:ecoedu:v:44:y:2015:i:c:p:100-113
    DOI: 10.1016/j.econedurev.2014.11.001

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0272775714001022
    Download Restriction: Full text for ScienceDirect subscribers only

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. LaLonde, Robert J, 1986. "Evaluating the Econometric Evaluations of Training Programs with Experimental Data," American Economic Review, American Economic Association, vol. 76(4), pages 604-620, September.
    2. James J. Heckman & Petra E. Todd, 2009. "A note on adapting propensity score matching and selection models to choice based samples," Econometrics Journal, Royal Economic Society, vol. 12(s1), pages 230-234, January.
    3. Smith, Jeffrey A. & Todd, Petra E., 2005. "Does matching overcome LaLonde's critique of nonexperimental estimators?," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 305-353.
    4. Robert Bifulco & Helen F. Ladd, 2006. "The Impacts of Charter Schools on Student Achievement: Evidence from North Carolina," Education Finance and Policy, MIT Press, vol. 1(1), pages 50-90, January.
    5. repec:mpr:mprres:3694 is not listed on IDEAS
    6. Steven Glazerman & Dan M. Levy & David Myers, "undated". "Nonexperimental Versus Experimental Estimates of Earnings Impacts," Mathematica Policy Research Reports 7c8bd68ac8db47caa57c70ee1, Mathematica Policy Research.
    7. Christina Clark Tuttle & Bing-ru Teh & Ira Nichols-Barrer & Brian P. Gill & Philip Gleason, 2010. "Student Characteristics and Achievement in 22 KIPP Middle Schools," Mathematica Policy Research Reports 69064a347d534ffa8947d7b6e, Mathematica Policy Research.
    8. Michael Lechner, 2000. "An Evaluation of Public-Sector-Sponsored Continuous Vocational Training Programs in East Germany," Journal of Human Resources, University of Wisconsin Press, vol. 35(2), pages 347-375.
    9. Friedlander, Daniel & Robins, Philip K, 1995. "Evaluating Program Evaluations: New Evidence on Commonly Used Nonexperimental Methods," American Economic Review, American Economic Association, vol. 85(4), pages 923-937, September.
    10. Philip Gleason & Melissa Clark & Christina Clark Tuttle & Emily Dwoyer, 2010. "The Evaluation of Charter School Impacts," Mathematica Policy Research Reports 3066da11915a4b04a77b38848, Mathematica Policy Research.
    11. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    12. repec:mpr:mprres:7293 is not listed on IDEAS
    13. Alberto Abadie & Guido W. Imbens, 2008. "On the Failure of the Bootstrap for Matching Estimators," Econometrica, Econometric Society, vol. 76(6), pages 1537-1557, November.
    14. Peikes, Deborah N. & Moreno, Lorenzo & Orzol, Sean Michael, 2008. "Propensity Score Matching: A Note of Caution for Evaluators of Social Programs," The American Statistician, American Statistical Association, vol. 62, pages 222-231, August.
    15. Atila Abdulkadiroğlu & Joshua D. Angrist & Susan M. Dynarski & Thomas J. Kane & Parag A. Pathak, 2011. "Accountability and Flexibility in Public Schools: Evidence from Boston's Charters And Pilots," The Quarterly Journal of Economics, Oxford University Press, vol. 126(2), pages 699-748.
    16. Shadish, William R. & Clark, M. H. & Steiner, Peter M., 2008. "Can Nonrandomized Experiments Yield Accurate Answers? A Randomized Experiment Comparing Random and Nonrandom Assignments," Journal of the American Statistical Association, American Statistical Association, vol. 103(484), pages 1334-1344.
    17. repec:mpr:mprres:6720 is not listed on IDEAS
    18. repec:mpr:mprres:6673 is not listed on IDEAS
    19. Kenneth A. Couch & Robert Bifulco, 2012. "Can Nonexperimental Estimates Replicate Estimates Based on Random Assignment in Evaluations of School Choice? A Within‐Study Comparison," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 31(3), pages 729-751, June.
    20. Justine S. Hastings & Christopher A. Neilson & Seth D. Zimmerman, 2012. "The Effect of School Choice on Intrinsic Motivation and Academic Outcomes," NBER Working Papers 18324, National Bureau of Economic Research, Inc.
    21. Elizabeth Ty Wilde & Robinson Hollister, 2007. "How close is close enough? Evaluating propensity score matching using data from a class size reduction experiment," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 26(3), pages 455-477.
    22. Philip Gleason & Melissa Clark & Christina Clark Tuttle & Emily Dwoyer, 2010. "The Evaluation of Charter School Impacts (Presentation)," Mathematica Policy Research Reports 770e250b2ef343a3b1ec8c932, Mathematica Policy Research.
    23. Thomas D. Cook & William R. Shadish & Vivian C. Wong, 2008. "Three conditions under which experiments and observational studies produce comparable causal estimates: New findings from within-study comparisons," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 27(4), pages 724-750.
    24. Deborah N. Peikes & Lorenzo Moreno & Sean Michael Orzol, "undated". "Propensity Score Matching: A Note of Caution for Evaluators of Social Programs," Mathematica Policy Research Reports dd0866e4646a4e0ea77079d5b, Mathematica Policy Research.
    25. Roberto Agodini & Mark Dynarski, 2004. "Are Experiments the Only Option? A Look at Dropout Prevention Programs," The Review of Economics and Statistics, MIT Press, vol. 86(1), pages 180-194, February.
    26. Will Dobbie & Roland G. Fryer, 2011. "Are High-Quality Schools Enough to Increase Achievement among the Poor? Evidence from the Harlem Children's Zone," American Economic Journal: Applied Economics, American Economic Association, vol. 3(3), pages 158-187, July.
    27. repec:mpr:mprres:6676 is not listed on IDEAS
    28. Thomas Fraker & Rebecca Maynard, 1987. "The Adequacy of Comparison Group Designs for Evaluations of Employment-Related Programs," Journal of Human Resources, University of Wisconsin Press, vol. 22(2), pages 194-227.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Persson, Emma & Persson, Sofie & Gerdtham, Ulf-G. & Steen Carlsson, Katarina, 2016. "Effect of Type 1 Diabetes on School Performance in a Dynamic World: New Analysis Exploring Swedish Register Data," Working Papers 2016:28, Lund University, Department of Economics.
    2. Zhang, Chunqin & Juan, Zhicai & Xiao, Guangnian, 2015. "Do contractual practices affect technical efficiency? Evidence from public transport operators in China," Transportation Research Part E: Logistics and Transportation Review, Elsevier, vol. 80(C), pages 39-55.

    More about this item

    Keywords

    Treatment effects; Randomized controlled trials; Nonexperimental methods; Within-study comparison

    JEL classification:

    • I21 - Health, Education, and Welfare - - Education - - - Analysis of Education
    • C10 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - General
    • C18 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Methodological Issues: General
    • C21 - Mathematical and Quantitative Methods - - Single Equation Models; Single Variables - - - Cross-Sectional Models; Spatial Models; Treatment Effect Models

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:ecoedu:v:44:y:2015:i:c:p:100-113. See general information about how to correct material in RePEc.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact Dana Niculescu. General contact details of provider: http://www.elsevier.com/locate/econedurev.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows your profile to be linked to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis. RePEc uses bibliographic data supplied by the respective publishers.