Printed from https://ideas.repec.org/p/bgu/wpaper/1316.html

A Comparison of Two Methods for Estimating School Effects and Tracking Student Progress from Standardized Test Scores

Author

Listed:
  • Moshe Justman

    (BGU)

  • Brendan Houng

    (Melbourne Institute of Applied Economic and Social Research, University of Melbourne)

Abstract

This paper compares two leading approaches to analyzing standardized test data: least-squares value-added analysis, used mainly to support accountability by identifying teacher and school effects; and Betebenner’s (2009) student growth percentiles method, which focuses on normative tracking of individual student progress. Applying both methods to two-year progress in numeracy and reading in elementary and middle school, as reflected in Australian standardized (NAPLAN) test scores, we find that they produce similar quantitative indicators of both individual student progress and estimated school effects. This suggests that, with minor modifications, either methodology could be used for both purposes.
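
To fix ideas, the sketch below is a minimal, hypothetical illustration in Python, not the authors' code: it uses synthetic data and simple linear specifications in place of the paper's NAPLAN analysis, estimates a least-squares value-added indicator as the school-mean regression residual, and computes simplified student growth percentiles from linear quantile regressions (Betebenner's method uses B-spline quantile regression). All variable names and parameters are illustrative assumptions.

```python
# Illustrative sketch only: synthetic data and simplified specifications,
# not the paper's NAPLAN analysis or Betebenner's B-spline implementation.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_schools, n_students = 40, 2000
school = rng.integers(0, n_schools, n_students)
prior = rng.normal(500, 70, n_students)          # hypothetical prior-year scores
true_effect = rng.normal(0, 10, n_schools)       # latent school effects
score = 100 + 0.8 * prior + true_effect[school] + rng.normal(0, 40, n_students)
df = pd.DataFrame({"school": school, "prior": prior, "score": score})

# (i) Least-squares value-added: regress current on prior scores and take
# the mean residual within each school as its estimated effect.
ols = smf.ols("score ~ prior", data=df).fit()
df["residual"] = ols.resid
value_added = df.groupby("school")["residual"].mean().rename("value_added")

# (ii) Simplified student growth percentiles: fit quantile regressions of
# current on prior scores at percentiles 1-99; a student's growth percentile
# is the number of fitted conditional percentiles lying below their score.
taus = np.arange(0.01, 1.00, 0.01)
fitted = np.column_stack(
    [smf.quantreg("score ~ prior", df).fit(q=t).predict(df) for t in taus]
)
df["sgp"] = (fitted < df["score"].to_numpy()[:, None]).sum(axis=1)

# Compare the two school-level indicators: median SGP vs. value-added.
summary = df.groupby("school")["sgp"].median().rename("median_sgp").to_frame()
summary = summary.join(value_added)
print(summary.corr())
```

Under these assumptions the printed correlation between the two school-level indicators is typically high, mirroring the paper's qualitative finding; its exact magnitude here reflects only the simulated parameters.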

Suggested Citation

  • Moshe Justman & Brendan Houng, 2013. "A Comparison Of Two Methods For Estimating School Effects And Tracking Student Progress From Standardized Test Scores," Working Papers 1316, Ben-Gurion University of the Negev, Department of Economics.
  • Handle: RePEc:bgu:wpaper:1316

    Download full text from publisher

    File URL: http://in.bgu.ac.il/en/humsos/Econ/Workingpapers/1316.pdf
    Download Restriction: no

    References listed on IDEAS

    1. Lorraine Dearden & John Micklewright & Anna Vignoles, 2011. "The Effectiveness of English Secondary Schools for Pupils of Different Ability Levels," Fiscal Studies, Institute for Fiscal Studies, vol. 32(2), pages 225-244, June.
    2. Cory Koedel & Mark Ehlert & Eric Parsons & Michael Podgursky, 2012. "Selecting Growth Measures for School and Teacher Evaluations," Working Papers 1210, Department of Economics, University of Missouri.
    3. Timothy N. Bond & Kevin Lang, 2013. "The Evolution of the Black-White Test Score Gap in Grades K–3: The Fragility of Results," The Review of Economics and Statistics, MIT Press, vol. 95(5), pages 1468-1479, December.
    4. Rebecca Allen & Simon Burgess, 2011. "Can School League Tables Help Parents Choose Schools?," Fiscal Studies, Institute for Fiscal Studies, vol. 32(2), pages 245-261, June.
    5. Daniel F. McCaffrey & J. R. Lockwood & Daniel Koretz & Thomas A. Louis & Laura Hamilton, 2004. "Models for Value-Added Modeling of Teacher Effects," Journal of Educational and Behavioral Statistics, vol. 29(1), pages 67-101, March.
    6. Louis T. Mariano & Daniel F. McCaffrey & J. R. Lockwood, 2010. "A Model for Teacher Effects From Longitudinal Data Without Assuming Vertical Scaling," Journal of Educational and Behavioral Statistics, vol. 35(3), pages 253-279, June.
    7. Sean F. Reardon & Stephen W. Raudenbush, 2009. "Assumptions of Value-Added Models for Estimating School Effects," Education Finance and Policy, MIT Press, vol. 4(4), pages 492-519, October.
    8. Andrew Ray & Tanya McCormack & Helen Evans, 2009. "Value Added in English Schools," Education Finance and Policy, MIT Press, vol. 4(4), pages 415-438, October.
    9. Dale Ballou & William Sanders & Paul Wright, 2004. "Controlling for Student Background in Value-Added Assessment of Teachers," Journal of Educational and Behavioral Statistics, vol. 29(1), pages 37-65, March.
    10. Dale Ballou, 2009. "Test Scaling and Value-Added Measurement," Education Finance and Policy, MIT Press, vol. 4(4), pages 351-383, October.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Brendan Houng & Moshe Justman, 2013. "Comparing Least-Squares Value-Added Analysis and Student Growth Percentile Analysis for Evaluating Student Progress and Estimating School Effects," Melbourne Institute Working Paper Series wp2013n07, Melbourne Institute of Applied Economic and Social Research, The University of Melbourne.
    2. Vosters, Kelly N. & Guarino, Cassandra M. & Wooldridge, Jeffrey M., 2018. "Understanding and evaluating the SAS® EVAAS® Univariate Response Model (URM) for measuring teacher effectiveness," Economics of Education Review, Elsevier, vol. 66(C), pages 191-205.
    3. Garritt L. Page & Ernesto San Martín & Javiera Orellana & Jorge González, 2017. "Exploring complete school effectiveness via quantile value added," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 180(1), pages 315-340, January.
    4. Stacy, Brian & Guarino, Cassandra & Wooldridge, Jeffrey, 2018. "Does the precision and stability of value-added estimates of teacher performance depend on the types of students they serve?," Economics of Education Review, Elsevier, vol. 64(C), pages 50-74.
    5. Derek C. Briggs & Ben Domingue, 2013. "The Gains From Vertical Scaling," Journal of Educational and Behavioral Statistics, vol. 38(6), pages 551-576, December.
    6. Katherine E. Castellano & Andrew D. Ho, 2015. "Practical Differences Among Aggregate-Level Conditional Status Metrics," Journal of Educational and Behavioral Statistics, vol. 40(1), pages 35-68, February.
    7. Lucy Prior & John Jerrim & Dave Thomson & George Leckie, 2021. "A review and evaluation of secondary school accountability in England: Statistical strengths, weaknesses, and challenges for 'Progress 8' raised by COVID-19," CEPEO Working Paper Series 21-04, UCL Centre for Education Policy and Equalising Opportunities, revised Apr 2021.
    8. Dan Goldhaber & Michael Hansen, 2013. "Is it Just a Bad Class? Assessing the Long-term Stability of Estimated Teacher Performance," Economica, London School of Economics and Political Science, vol. 80(319), pages 589-612, July.
    9. Cassandra M. Guarino & Mark D. Reckase & Jeffrey M. Wooldridge, 2014. "Can Value-Added Measures of Teacher Performance Be Trusted?," Education Finance and Policy, MIT Press, vol. 10(1), pages 117-156, November.
    10. Susanna Loeb & Michael S. Christian & Heather Hough & Robert H. Meyer & Andrew B. Rice & Martin R. West, 2019. "School Differences in Social–Emotional Learning Gains: Findings From the First Large-Scale Panel Survey of Students," Journal of Educational and Behavioral Statistics, vol. 44(5), pages 507-542, October.
    11. Lucy Prior & John Jerrim & Dave Thomson & George Leckie, 2021. "A review and evaluation of secondary school accountability in England: Statistical strengths, weaknesses, and challenges for ‘Progress 8’ raised by COVID-19," DoQSS Working Papers 21-12, Quantitative Social Science - UCL Social Research Institute, University College London.
    12. David M. Quinn & Andrew D. Ho, 2021. "Ordinal Approaches to Decomposing Between-Group Test Score Disparities," Journal of Educational and Behavioral Statistics, vol. 46(4), pages 466-500, August.
    13. Cory Koedel & Mark Ehlert & Eric Parsons & Michael Podgursky, 2012. "Selecting Growth Measures for School and Teacher Evaluations," Working Papers 1210, Department of Economics, University of Missouri.
    14. Aedin Doris & Donal O'Neill & Olive Sweetman, 2019. "Good Schools or Good Students? The Importance of Selectivity for School Rankings," Economics Department Working Paper Series n293-19.pdf, Department of Economics, National University of Ireland - Maynooth.
    15. Andrew McEachin & Allison Atteberry, 2017. "The Impact of Summer Learning Loss on Measures of School Performance," Education Finance and Policy, MIT Press, vol. 12(4), pages 468-491, Fall.
    16. Elizabeth U. Cascio & Douglas O. Staiger, 2012. "Knowledge, Tests, and Fadeout in Educational Interventions," NBER Working Papers 18038, National Bureau of Economic Research, Inc.
    17. Jorge Manzi & Ernesto San Martín & Sébastien Van Bellegem, 2014. "School System Evaluation by Value Added Analysis Under Endogeneity," Psychometrika, Springer;The Psychometric Society, vol. 79(1), pages 130-153, January.
    18. Josh Kinsler, 2016. "Teacher Complementarities in Test Score Production: Evidence from Primary School," Journal of Labor Economics, University of Chicago Press, vol. 34(1), pages 29-61.
    19. Dan Goldhaber & Roddy Theobald, 2013. "Managing the Teacher Workforce in Austere Times: The Determinants and Implications of Teacher Layoffs," Education Finance and Policy, MIT Press, vol. 8(4), pages 494-527, October.
    20. Gordey A. Yastrebov & Alexey R. Bessudnov & Marina A. Pinskaya & Sergey G. Kosaretsky, 2014. "Contextualizing Academic Performance In Russian Schools: School Characteristics, The Composition Of Student Body And Local Deprivation," HSE Working papers WP BRP 55/SOC/2014, National Research University Higher School of Economics.

    More about this item

    Keywords

    value-added analysis; student growth percentiles; NAPLAN;

    JEL classification:

    • I21 - Health, Education, and Welfare - - Education - - - Analysis of Education
    • I28 - Health, Education, and Welfare - - Education - - - Government Policy



    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:bgu:wpaper:1316. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Aamer Abu-Qarn. General contact details of provider: https://edirc.repec.org/data/edbguil.html.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.