IDEAS: https://ideas.repec.org/p/iae/iaewps/wp2013n07.html

Comparing Least-Squares Value-Added Analysis and Student Growth Percentile Analysis for Evaluating Student Progress and Estimating School Effects

Authors

  • Brendan Houng

    (Melbourne Institute of Applied Economic and Social Research, The University of Melbourne)

  • Moshe Justman

    (Melbourne Institute of Applied Economic and Social Research, The University of Melbourne; and Department of Economics, Ben Gurion University, Israel)

Abstract

This paper compares two functionally different approaches to analyzing standardized test data: least-squares-based value-added analysis, geared principally to supporting teacher and school accountability; and Betebenner's (2009) student growth percentiles, which focus primarily on tracking individual student progress in a normative context and projecting probable trajectories of future performance. Applying the two methods to Australian standardized numeracy and reading test scores (NAPLAN) in grades 3 to 5 and 7 to 9, we find that although they are used differently, the two methods share key structural elements and produce similar quantitative indicators of both individual student progress and estimated school effects.
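The two approaches the abstract names can be illustrated on synthetic data. The sketch below is not the paper's actual models: it uses invented score distributions, computes student-level least-squares value-added as the residual from an OLS regression of current on prior scores, and approximates student growth percentiles with within-bin percentile ranks (a crude stand-in for Betebenner's quantile-regression formulation). It then checks the paper's qualitative finding that the two student-level indicators are highly rank-correlated.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Synthetic prior and current scale scores (all parameters invented).
prior = rng.normal(500.0, 70.0, n)                       # e.g. grade-3 scores
current = 100.0 + 0.85 * prior + rng.normal(0.0, 40.0, n)  # e.g. grade-5 scores

# Least-squares value-added at the student level:
# the residual from an OLS regression of current on prior scores.
b, a = np.polyfit(prior, current, 1)  # slope, intercept
residual = current - (a + b * prior)

# Crude student growth percentile: the percentile rank of a student's
# current score among students with similar prior scores (same prior-score
# decile). Betebenner's SGPs use quantile regression; binning is a rough proxy.
cuts = np.quantile(prior, np.linspace(0.1, 0.9, 9))
decile = np.searchsorted(cuts, prior)  # bin index 0..9
sgp = np.empty(n)
for d in range(10):
    mask = decile == d
    within_rank = current[mask].argsort().argsort()      # 0..(count-1)
    sgp[mask] = 100.0 * (within_rank + 0.5) / mask.sum()  # mid-rank percentile

# Spearman rank correlation between the two student-level indicators.
def rank(x):
    return x.argsort().argsort().astype(float)

r = np.corrcoef(rank(residual), rank(sgp))[0, 1]
print(f"Spearman rank correlation: {r:.2f}")
```

On this synthetic data the correlation comes out high, echoing the paper's finding that, despite their different uses, the two methods yield similar indicators of individual student progress.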

Suggested Citation

  • Brendan Houng & Moshe Justman, 2013. "Comparing Least-Squares Value-Added Analysis and Student Growth Percentile Analysis for Evaluating Student Progress and Estimating School Effects," Melbourne Institute Working Paper Series wp2013n07, Melbourne Institute of Applied Economic and Social Research, The University of Melbourne.
  • Handle: RePEc:iae:iaewps:wp2013n07

    Download full text from publisher

    File URL: http://melbourneinstitute.unimelb.edu.au/downloads/working_paper_series/wp2013n07.pdf
    Download Restriction: no
    ---><---

    References listed on IDEAS

    1. Lorraine Dearden & John Micklewright & Anna Vignoles, 2011. "The Effectiveness of English Secondary Schools for Pupils of Different Ability Levels," Fiscal Studies, Institute for Fiscal Studies, vol. 32(2), pages 225-244, June.
    2. Timothy N. Bond & Kevin Lang, 2013. "The Evolution of the Black-White Test Score Gap in Grades K–3: The Fragility of Results," The Review of Economics and Statistics, MIT Press, vol. 95(5), pages 1468-1479, December.
    3. Louis T. Mariano & Daniel F. McCaffrey & J. R. Lockwood, 2010. "A Model for Teacher Effects From Longitudinal Data Without Assuming Vertical Scaling," Journal of Educational and Behavioral Statistics, , vol. 35(3), pages 253-279, June.
    4. Dale Ballou & William Sanders & Paul Wright, 2004. "Controlling for Student Background in Value-Added Assessment of Teachers," Journal of Educational and Behavioral Statistics, , vol. 29(1), pages 37-65, March.
    5. Dale Ballou, 2009. "Test Scaling and Value-Added Measurement," Education Finance and Policy, MIT Press, vol. 4(4), pages 351-383, October.
    6. Rebecca Allen & Simon Burgess, 2011. "Can School League Tables Help Parents Choose Schools?," Fiscal Studies, Institute for Fiscal Studies, vol. 32(2), pages 245-261, June.
    7. Andrew Ray & Tanya McCormack & Helen Evans, 2009. "Value Added in English Schools," Education Finance and Policy, MIT Press, vol. 4(4), pages 415-438, October.
    8. Cory Koedel & Mark Ehlert & Eric Parsons & Michael Podgursky, 2012. "Selecting Growth Measures for School and Teacher Evaluations," Working Papers 1210, Department of Economics, University of Missouri.
    9. Daniel F. McCaffrey & J. R. Lockwood & Daniel Koretz & Thomas A. Louis & Laura Hamilton, 2004. "Models for Value-Added Modeling of Teacher Effects," Journal of Educational and Behavioral Statistics, , vol. 29(1), pages 67-101, March.
    10. Harvey Goldstein & David J. Spiegelhalter, 1996. "League Tables and Their Limitations: Statistical Issues in Comparisons of Institutional Performance," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 159(3), pages 385-409, May.
    11. Sean F. Reardon & Stephen W. Raudenbush, 2009. "Assumptions of Value-Added Models for Estimating School Effects," Education Finance and Policy, MIT Press, vol. 4(4), pages 492-519, October.

    Citations

    Cited by:

    1. Katherine E. Castellano & Andrew D. Ho, 2015. "Practical Differences Among Aggregate-Level Conditional Status Metrics," Journal of Educational and Behavioral Statistics, , vol. 40(1), pages 35-68, February.
    2. Brendan Houng & Moshe Justman, 2015. "Out-Of-Sample Predictions Of Access To Higher Education And School Value-Added," Working Papers 1511, Ben-Gurion University of the Negev, Department of Economics.
    3. Kevin Pugh & Gigi Foster, 2014. "Australia's National School Data and the ‘Big Data’ Revolution in Education Economics," Australian Economic Review, The University of Melbourne, Melbourne Institute of Applied Economic and Social Research, vol. 47(2), pages 258-268, June.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Moshe Justman & Brendan Houng, 2013. "A Comparison Of Two Methods For Estimating School Effects And Tracking Student Progress From Standardized Test Scores," Working Papers 1316, Ben-Gurion University of the Negev, Department of Economics.
    2. Vosters, Kelly N. & Guarino, Cassandra M. & Wooldridge, Jeffrey M., 2018. "Understanding and evaluating the SAS® EVAAS® Univariate Response Model (URM) for measuring teacher effectiveness," Economics of Education Review, Elsevier, vol. 66(C), pages 191-205.
    3. Garritt L. Page & Ernesto San Martín & Javiera Orellana & Jorge González, 2017. "Exploring complete school effectiveness via quantile value added," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 180(1), pages 315-340, January.
    4. Stacy, Brian & Guarino, Cassandra & Wooldridge, Jeffrey, 2018. "Does the precision and stability of value-added estimates of teacher performance depend on the types of students they serve?," Economics of Education Review, Elsevier, vol. 64(C), pages 50-74.
    5. Derek C. Briggs & Ben Domingue, 2013. "The Gains From Vertical Scaling," Journal of Educational and Behavioral Statistics, , vol. 38(6), pages 551-576, December.
    6. Katherine E. Castellano & Andrew D. Ho, 2015. "Practical Differences Among Aggregate-Level Conditional Status Metrics," Journal of Educational and Behavioral Statistics, , vol. 40(1), pages 35-68, February.
    7. Lucy Prior & John Jerrim & Dave Thomson & George Leckie, 2021. "A review and evaluation of secondary school accountability in England: Statistical strengths, weaknesses, and challenges for 'Progress 8' raised by COVID-19," CEPEO Working Paper Series 21-04, UCL Centre for Education Policy and Equalising Opportunities, revised Apr 2021.
    8. Dan Goldhaber & Michael Hansen, 2013. "Is it Just a Bad Class? Assessing the Long-term Stability of Estimated Teacher Performance," Economica, London School of Economics and Political Science, vol. 80(319), pages 589-612, July.
9. Cassandra M. Guarino & Mark D. Reckase & Jeffrey M. Wooldridge, 2014. "Can Value-Added Measures of Teacher Performance Be Trusted?," Education Finance and Policy, MIT Press, vol. 10(1), pages 117-156, November.
    10. Susanna Loeb & Michael S. Christian & Heather Hough & Robert H. Meyer & Andrew B. Rice & Martin R. West, 2019. "School Differences in Social–Emotional Learning Gains: Findings From the First Large-Scale Panel Survey of Students," Journal of Educational and Behavioral Statistics, , vol. 44(5), pages 507-542, October.
    11. Lucy Prior & John Jerrim & Dave Thomson & George Leckie, 2021. "A review and evaluation of secondary school accountability in England: Statistical strengths, weaknesses, and challenges for ‘Progress 8’ raised by COVID-19," DoQSS Working Papers 21-12, Quantitative Social Science - UCL Social Research Institute, University College London.
    12. David M. Quinn & Andrew D. Ho, 2021. "Ordinal Approaches to Decomposing Between-Group Test Score Disparities," Journal of Educational and Behavioral Statistics, , vol. 46(4), pages 466-500, August.
    13. Cory Koedel & Mark Ehlert & Eric Parsons & Michael Podgursky, 2012. "Selecting Growth Measures for School and Teacher Evaluations," Working Papers 1210, Department of Economics, University of Missouri.
    14. Aedin Doris & Donal O'Neill & Olive Sweetman, 2019. "Good Schools or Good Students? The Importance of Selectivity for School Rankings," Economics Department Working Paper Series n293-19.pdf, Department of Economics, National University of Ireland - Maynooth.
    15. Andrew McEachin & Allison Atteberry, 2017. "The Impact of Summer Learning Loss on Measures of School Performance," Education Finance and Policy, MIT Press, vol. 12(4), pages 468-491, Fall.
    16. Elizabeth U. Cascio & Douglas O. Staiger, 2012. "Knowledge, Tests, and Fadeout in Educational Interventions," NBER Working Papers 18038, National Bureau of Economic Research, Inc.
    17. Jorge Manzi & Ernesto San Martín & Sébastien Van Bellegem, 2014. "School System Evaluation by Value Added Analysis Under Endogeneity," Psychometrika, Springer;The Psychometric Society, vol. 79(1), pages 130-153, January.
    18. Heikki Pursiainen & Mika Kortelainen & Jenni Pääkkönen, 2014. "Impact of School Quality on Educational Attainment - Evidence from Finnish High Schools," ERSA conference papers ersa14p711, European Regional Science Association.
    19. Josh Kinsler, 2016. "Teacher Complementarities in Test Score Production: Evidence from Primary School," Journal of Labor Economics, University of Chicago Press, vol. 34(1), pages 29-61.
    20. Paul Hewson & Keming Yu, 2008. "Quantile regression for binary performance indicators," Applied Stochastic Models in Business and Industry, John Wiley & Sons, vol. 24(5), pages 401-418, September.

    More about this item

    Keywords

Value-added analysis; student growth percentiles; NAPLAN.

    JEL classification:

    • I21 - Health, Education, and Welfare - - Education - - - Analysis of Education
    • I28 - Health, Education, and Welfare - - Education - - - Government Policy




    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.