Printed from https://ideas.repec.org/a/bla/jorssa/v183y2020i4p1333-1362.html

Using hidden information and performance level boundaries to study student–teacher assignments: implications for estimating teacher causal effects

Author

Listed:
  • J. R. Lockwood
  • D. McCaffrey

Abstract

A common problem in educational evaluation is estimating causal effects of interventions from non-experimental data on students. Scores from standardized achievement tests are often used to adjust for differences in the background characteristics of students in different non-experimental groups. An open question is whether, and how, these adjustments should account for the errors in test scores as measures of latent achievement. The answer depends on what information was used to assign students to non-experimental groups. Using a case study of estimating teacher effects on student achievement, we develop two novel empirical tests of what information is used to assign students to teachers. We demonstrate that assignments are influenced both by information that is unobserved by the researcher and by error-prone test scores. We develop a model that is appropriate for this complex selection mechanism and compare its results with those of common simpler estimators. We discuss implications for the broader problem of causal modelling with error-prone confounders.
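
To make the selection issue described in the abstract concrete, the sketch below simulates, under purely illustrative assumptions (not the authors' model, data, or estimator), students assigned to two teachers on the basis of latent prior achievement while the analyst observes only an error-prone test score. All variable names and parameter values are hypothetical.

```python
# Minimal illustrative sketch: when assignment to teachers uses latent prior
# achievement ("hidden information"), adjusting for an error-prone test score
# leaves a spurious teacher effect, while adjusting for the latent achievement
# itself removes it. Parameter values are arbitrary illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
n = 20000

theta = rng.normal(0.0, 1.0, n)            # latent prior achievement
score = theta + rng.normal(0.0, 0.6, n)    # error-prone pretest score

# Assignment to two "teachers" driven by latent achievement, not the score
teacher = (theta + rng.normal(0.0, 0.5, n) > 0).astype(float)

true_effect = 0.0                          # both teachers equally effective
outcome = 0.8 * theta + true_effect * teacher + rng.normal(0.0, 0.5, n)

# OLS of outcome on (1, teacher, score): adjusts only for the noisy score
X = np.column_stack([np.ones(n), teacher, score])
beta = np.linalg.lstsq(X, outcome, rcond=None)[0]
print("teacher effect adjusting for noisy score:", round(beta[1], 3))   # biased away from 0

# Oracle adjustment for the latent theta recovers approximately zero
X_oracle = np.column_stack([np.ones(n), teacher, theta])
beta_oracle = np.linalg.lstsq(X_oracle, outcome, rcond=None)[0]
print("teacher effect adjusting for latent theta:", round(beta_oracle[1], 3))
```

If assignment had instead been based on the observed score itself, adjusting for the score would have sufficed, which is why the paper's empirical tests of what information actually drives student-teacher assignments matter for choosing an appropriate estimator.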

Suggested Citation

  • J. R. Lockwood & D. McCaffrey, 2020. "Using hidden information and performance level boundaries to study student–teacher assignments: implications for estimating teacher causal effects," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 183(4), pages 1333-1362, October.
  • Handle: RePEc:bla:jorssa:v:183:y:2020:i:4:p:1333-1362
    DOI: 10.1111/rssa.12533

    Download full text from publisher

    File URL: https://doi.org/10.1111/rssa.12533
    Download Restriction: no

    File URL: https://libkey.io/10.1111/rssa.12533?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Jesse Rothstein, 2010. "Teacher Quality in Educational Production: Tracking, Decay, and Student Achievement," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 125(1), pages 175-214.
    2. Tahir Andrabi & Jishnu Das & Asim Ijaz Khwaja & Tristan Zajonc, 2011. "Do Value-Added Estimates Add Value? Accounting for Learning Dynamics," American Economic Journal: Applied Economics, American Economic Association, vol. 3(3), pages 29-54, July.
    3. Jesse Rothstein, 2009. "Student Sorting and Bias in Value-Added Estimation: Selection on Observables and Unobservables," Education Finance and Policy, MIT Press, vol. 4(4), pages 537-571, October.
    4. R. Bock & Murray Aitkin, 1981. "Marginal maximum likelihood estimation of item parameters: Application of an EM algorithm," Psychometrika, Springer;The Psychometric Society, vol. 46(4), pages 443-459, December.
    5. Hong, Guanglei & Raudenbush, Stephen W., 2006. "Evaluating Kindergarten Retention Policy: A Case Study of Causal Inference for Multilevel Observational Data," Journal of the American Statistical Association, American Statistical Association, vol. 101, pages 901-910, September.
    6. Raj Chetty & John N. Friedman & Jonah E. Rockoff, 2014. "Measuring the Impacts of Teachers I: Evaluating Bias in Teacher Value-Added Estimates," American Economic Review, American Economic Association, vol. 104(9), pages 2593-2632, September.
    7. Peter M. Steiner & Thomas D. Cook & William R. Shadish, 2011. "On the Importance of Reliable Covariate Measurement in Selection Bias Adjustments Using Propensity Scores," Journal of Educational and Behavioral Statistics, vol. 36(2), pages 213-236, April.
    8. Antony Fielding & Min Yang, 2005. "Generalized linear mixed models for ordered responses in complex multilevel structures: effects beneath the school or college in education," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 168(1), pages 159-183, January.
    9. Hudgens, Michael G. & Halloran, M. Elizabeth, 2008. "Toward Causal Inference With Interference," Journal of the American Statistical Association, American Statistical Association, vol. 103, pages 832-842, June.
    10. Brian Junker & Lynne Schofield & Lowell Taylor, 2012. "The use of cognitive ability measures as explanatory variables in regression analysis," IZA Journal of Labor Economics, Springer;Forschungsinstitut zur Zukunft der Arbeit GmbH (IZA), vol. 1(1), pages 1-19, December.
    11. Ding, Weili & Lehrer, Steven F., 2014. "Understanding the role of time-varying unobserved ability heterogeneity in education production," Economics of Education Review, Elsevier, vol. 40(C), pages 55-75.
    12. Guido W. Imbens, 2004. "Nonparametric Estimation of Average Treatment Effects Under Exogeneity: A Review," The Review of Economics and Statistics, MIT Press, vol. 86(1), pages 4-29, February.
    13. Steven Dieterle & Cassandra M. Guarino & Mark D. Reckase & Jeffrey M. Wooldridge, 2015. "How do Principals Assign Students to Teachers? Finding Evidence in Administrative Data and the Implications for Value Added," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 34(1), pages 32-58, January.
    14. Cheti Nicoletti & Birgitta Rabe, 2018. "The effect of school spending on student achievement: addressing biases in value‐added models," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 181(2), pages 487-515, February.
    15. Jean-Paul Fox & Cees Glas, 2003. "Bayesian modeling of measurement error in predictor variables using item response theory," Psychometrika, Springer;The Psychometric Society, vol. 68(2), pages 169-191, June.
    16. Silvia Bianconcini & Silvia Cagnone, 2012. "A General Multivariate Latent Growth Model With Applications to Student Achievement," Journal of Educational and Behavioral Statistics, vol. 37(2), pages 339-364, April.
    17. Chalmers, R. Philip, 2012. "mirt: A Multidimensional Item Response Theory Package for the R Environment," Journal of Statistical Software, Foundation for Open Access Statistics, vol. 48(i06).
    18. Daniel F. McCaffrey & J. R. Lockwood & Claude M. Setodji, 2013. "Inverse probability weighting with error-prone covariates," Biometrika, Biometrika Trust, vol. 100(3), pages 671-680.
    19. J. R. Lockwood & Daniel F. McCaffrey, 2016. "Matching and Weighting With Functions of Error-Prone Covariates for Causal Inference," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 111(516), pages 1831-1839, October.
    20. Charles T. Clotfelter & Helen F. Ladd & Jacob L. Vigdor, 2006. "Teacher-Student Matching and the Assessment of Teacher Effectiveness," Journal of Human Resources, University of Wisconsin Press, vol. 41(4).
    21. Manabu Kuroki & Judea Pearl, 2014. "Measurement bias and effect restoration in causal inference," Biometrika, Biometrika Trust, vol. 101(2), pages 423-437.
    22. Grace Y. Yi & Yanyuan Ma & Raymond J. Carroll, 2012. "A functional generalized method of moments approach for longitudinal studies with missing responses and covariate measurement error," Biometrika, Biometrika Trust, vol. 99(1), pages 151-165.
    23. Matthew T. Johnson & Stephen Lipscomb & Brian Gill, 2015. "Sensitivity of Teacher Value-Added Estimates to Student and Peer Control Variables (Journal Article)," Mathematica Policy Research Reports 4a9776a57ae9477e80df47e7d, Mathematica Policy Research.
    24. J. R. Lockwood & Daniel F. McCaffrey, 2019. "Impact Evaluation Using Analysis of Covariance With Error-Prone Covariates That Violate Surrogacy," Evaluation Review, vol. 43(6), pages 335-369, December.
    25. Dan Goldhaber & Michael Hansen, 2010. "Using Performance on the Job to Inform Teacher Tenure Decisions," American Economic Review, American Economic Association, vol. 100(2), pages 250-255, May.
    26. John P. Papay & Richard J. Murnane & John B. Willett, 2016. "The Impact of Test Score Labels on Human-Capital Investment Decisions," Journal of Human Resources, University of Wisconsin Press, vol. 51(2), pages 357-388.
    27. Thomas Warm, 1989. "Weighted likelihood estimation of ability in item response theory," Psychometrika, Springer;The Psychometric Society, vol. 54(3), pages 427-450, September.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. J. R. Lockwood & Daniel F. McCaffrey, 2019. "Impact Evaluation Using Analysis of Covariance With Error-Prone Covariates That Violate Surrogacy," Evaluation Review, vol. 43(6), pages 335-369, December.
    2. Trang Quynh Nguyen & Elizabeth A. Stuart, 2020. "Propensity Score Analysis With Latent Covariates: Measurement Error Bias Correction Using the Covariate’s Posterior Mean, aka the Inclusive Factor Score," Journal of Educational and Behavioral Statistics, vol. 45(5), pages 598-636, October.
    3. Koedel Cory & Leatherman Rebecca & Parsons Eric, 2012. "Test Measurement Error and Inference from Value-Added Models," The B.E. Journal of Economic Analysis & Policy, De Gruyter, vol. 12(1), pages 1-37, November.
    4. Canales, Andrea & Maldonado, Luis, 2018. "Teacher quality and student achievement in Chile: Linking teachers' contribution and observable characteristics," International Journal of Educational Development, Elsevier, vol. 60(C), pages 33-50.
    5. Koedel, Cory & Mihaly, Kata & Rockoff, Jonah E., 2015. "Value-added modeling: A review," Economics of Education Review, Elsevier, vol. 47(C), pages 180-195.
    6. J. R. Lockwood & Daniel F. McCaffrey, 2017. "Simulation-Extrapolation with Latent Heteroskedastic Error Variance," Psychometrika, Springer;The Psychometric Society, vol. 82(3), pages 717-736, September.
    7. Eric Parsons & Cory Koedel & Li Tan, 2019. "Accounting for Student Disadvantage in Value-Added Models," Journal of Educational and Behavioral Statistics, vol. 44(2), pages 144-179, April.
    8. Hermann, Zoltán & Horváth, Hedvig, 2022. "Tanári eredményesség és tanár-diák összepárosítás az általános iskolákban. Empirikus mintázatok három magyarországi tankerület adatai alapján [Teacher effectiveness and teacher-student matching in primary schools: empirical patterns based on data from three Hungarian school districts]," Közgazdasági Szemle (Economic Review - monthly of the Hungarian Academy of Sciences), Közgazdasági Szemle Alapítvány (Economic Review Foundation), vol. 0(11), pages 1377-1406.
    9. J. R. Lockwood & Daniel F. McCaffrey, 2014. "Correcting for Test Score Measurement Error in ANCOVA Models for Estimating Treatment Effects," Journal of Educational and Behavioral Statistics, vol. 39(1), pages 22-52, February.
    10. Marie-Ann Sengewald & Steffi Pohl, 2019. "Compensation and Amplification of Attenuation Bias in Causal Effect Estimates," Psychometrika, Springer;The Psychometric Society, vol. 84(2), pages 589-610, June.
    11. Roberto V. Penaloza & Mark Berends, 2022. "The Mechanics of Treatment-effect Estimate Bias for Nonexperimental Data," Sociological Methods & Research, vol. 51(1), pages 165-202, February.
    12. Pope, Nolan G., 2019. "The effect of teacher ratings on teacher performance," Journal of Public Economics, Elsevier, vol. 172(C), pages 84-110.
    13. Azam, Mehtabul & Kingdon, Geeta Gandhi, 2015. "Assessing teacher quality in India," Journal of Development Economics, Elsevier, vol. 117(C), pages 74-83.
    14. Hanushek, Eric A. & Rivkin, Steven G. & Schiman, Jeffrey C., 2016. "Dynamic effects of teacher turnover on the quality of instruction," Economics of Education Review, Elsevier, vol. 55(C), pages 132-148.
    15. Figlio, D. & Karbownik, K. & Salvanes, K.G., 2016. "Education Research and Administrative Data," Handbook of the Economics of Education,, Elsevier.
    16. Cory Koedel & Eric Parsons & Michael Podgursky & Mark Ehlert, 2015. "Teacher Preparation Programs and Teacher Quality: Are There Real Differences Across Programs?," Education Finance and Policy, MIT Press, vol. 10(4), pages 508-534, October.
    17. Horoi, Irina & Ost, Ben, 2015. "Disruptive peers and the estimation of teacher value added," Economics of Education Review, Elsevier, vol. 49(C), pages 180-192.
    18. Hinrichs, Peter, 2021. "What kind of teachers are schools looking for? Evidence from a randomized field experiment," Journal of Economic Behavior & Organization, Elsevier, vol. 186(C), pages 395-411.
    19. Godstime Osekhebhen Eigbiremolen, 2020. "Estimating Private School Premium for Primary School Children in Ethiopia: Evidence from Individual-level Panel Data," Progress in Development Studies, vol. 20(1), pages 26-44, January.
    20. Nirav Mehta, 2014. "Targeting the Wrong Teachers: Estimating Teacher Quality for Use in Accountability Regimes," University of Western Ontario, Centre for Human Capital and Productivity (CHCP) Working Papers 20143, University of Western Ontario, Centre for Human Capital and Productivity (CHCP).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:bla:jorssa:v:183:y:2020:i:4:p:1333-1362. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows your profile to be linked to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Wiley Content Delivery (email available below). General contact details of provider: https://edirc.repec.org/data/rssssea.html.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.