
Measurement Matters: Perspectives on Education Policy from an Economist and School Board Member

Author

Listed:
  • Kevin Lang

Abstract

One of the potential strengths of the No Child Left Behind (NCLB) Act enacted in 2002 is that the law requires the production of an enormous amount of data, particularly from tests, which, if used properly, might help us improve education. As an economist and as someone who served 13 years on the School Committee in Brookline, Massachusetts, until May 2009, I have been appalled by the limited ability of districts to analyze these data; I have been equally appalled by the cavalier manner in which economists use test scores and related measures in their analyses. The summary data currently provided are very hard to interpret, and policymakers, who typically lack statistical sophistication, cannot easily use them to assess progress. In some domains, most notably the use of average test scores to evaluate teachers or schools, the education community is aware of the biases and has sought better measures. The economics and statistics communities have both responded to and created this demand by developing value-added measures that carry a scientific aura. However, economists have largely failed to recognize many of the problems with such measures. These problems are sufficiently important that they should preclude any automatic link between these measures and rewards or sanctions. They do, however, contain information and can be used as a catalyst for more careful evaluation of teachers and schools, and as a lever to induce principals and other administrators to act on their knowledge.
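Illustrative aside (not part of the article): the value-added and "shrinkage" estimators discussed in the abstract and in the references below (e.g., Kane and Staiger; McCaffrey et al.) can be sketched in a few lines of Python. All numbers below are hypothetical assumptions chosen only for illustration; the sketch shows why a single raw classroom average is a noisy measure of a teacher's effect and why empirical-Bayes shrinkage pulls imprecise estimates toward the overall mean, one reason the abstract argues against any automatic link between such measures and rewards or sanctions.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters (assumptions, not figures from the article).
n_teachers = 200      # number of teachers
class_size = 25       # students per classroom
sd_teacher = 0.15     # s.d. of true teacher effects, in student test-score s.d. units
sd_student = 1.00     # s.d. of idiosyncratic student-level noise

# True (unobservable) teacher effects and the observed classroom mean gains.
true_effect = rng.normal(0.0, sd_teacher, n_teachers)
noise = rng.normal(0.0, sd_student / np.sqrt(class_size), n_teachers)
raw_mean = true_effect + noise

# Empirical-Bayes shrinkage: weight each classroom mean by its reliability,
# i.e., the share of its variance that reflects true teacher differences.
sampling_var = sd_student**2 / class_size
reliability = sd_teacher**2 / (sd_teacher**2 + sampling_var)
shrunk = reliability * raw_mean

def rmse(estimate):
    """Root-mean-squared error against the true teacher effects."""
    return np.sqrt(np.mean((estimate - true_effect) ** 2))

print(f"reliability of one classroom mean: {reliability:.2f}")
print(f"RMSE, raw classroom means: {rmse(raw_mean):.3f}")
print(f"RMSE, shrunken estimates:  {rmse(shrunk):.3f}")

With the assumed values, a single classroom mean has a reliability of roughly one third, so the shrunken estimates should track the true effects noticeably better than the raw means; in practice the reliability itself must be estimated, which is part of what gives these measures their "scientific aura" without removing the underlying noise.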

Suggested Citation

  • Kevin Lang, 2010. "Measurement Matters: Perspectives on Education Policy from an Economist and School Board Member," Journal of Economic Perspectives, American Economic Association, vol. 24(3), pages 167-182, Summer.
  • Handle: RePEc:aea:jecper:v:24:y:2010:i:3:p:167-82
    Note: DOI: 10.1257/jep.24.3.167

    Download full text from publisher

    File URL: http://www.aeaweb.org/articles.php?doi=10.1257/jep.24.3.167
    Download Restriction: no


    References listed on IDEAS

    1. Kevin Lang, 2007. "Introduction to Poverty and Discrimination," Introductory Chapters, in: Poverty and Discrimination, Princeton University Press.
    2. Thomas J. Kane & Douglas O. Staiger, 2008. "Estimating Teacher Impacts on Student Achievement: An Experimental Evaluation," NBER Working Papers 14607, National Bureau of Economic Research, Inc.
    3. Thomas J. Kane & Douglas O. Staiger, 2002. "The Promise and Pitfalls of Using Imprecise School Accountability Measures," Journal of Economic Perspectives, American Economic Association, vol. 16(4), pages 91-114, Fall.
4. J. R. Lockwood & Daniel F. McCaffrey & Louis T. Mariano & Claude Setodji, 2007. "Bayesian Methods for Scalable Multivariate Value-Added Assessment," Journal of Educational and Behavioral Statistics, vol. 32(2), pages 125-150, June.
    5. Brian A. Jacob & Lars Lefgren, 2004. "Remedial Education and Student Achievement: A Regression-Discontinuity Analysis," The Review of Economics and Statistics, MIT Press, vol. 86(1), pages 226-244, February.
    6. Brian A. Jacob & Lars Lefgren, 2008. "Can Principals Identify Effective Teachers? Evidence on Subjective Performance Evaluation in Education," Journal of Labor Economics, University of Chicago Press, vol. 26(1), pages 101-136.
7. Daniel F. McCaffrey & J. R. Lockwood & Daniel Koretz & Thomas A. Louis & Laura Hamilton, 2004. "Models for Value-Added Modeling of Teacher Effects," Journal of Educational and Behavioral Statistics, vol. 29(1), pages 67-101, March.
    Full references (including those not matched with items on IDEAS)

    Citations

Citations are extracted by the CitEc Project.

    Cited by:

    1. Elizabeth U. Cascio & Douglas O. Staiger, 2012. "Knowledge, Tests, and Fadeout in Educational Interventions," NBER Working Papers 18038, National Bureau of Economic Research, Inc.
    2. Schwerdt, Guido & West, Martin R. & Winters, Marcus A., 2017. "The effects of test-based retention on student outcomes over time: Regression discontinuity evidence from Florida," Journal of Public Economics, Elsevier, vol. 152(C), pages 154-169.
    3. Timothy N. Bond & Kevin Lang, 2013. "The Evolution of the Black-White Test Score Gap in Grades K–3: The Fragility of Results," The Review of Economics and Statistics, MIT Press, vol. 95(5), pages 1468-1479, December.
    4. Jeffrey R. Bloem & Andrew J. Oswald, 2022. "The Analysis of Human Feelings: A Practical Suggestion for a Robustness Test," Review of Income and Wealth, International Association for Research in Income and Wealth, vol. 68(3), pages 689-710, September.
    5. Diego Azqueta Oyarzun & Guillermina Gavaldon, 2014. "The economic assessment of education: Social Efficiency or Social Reconstruction?," Investigaciones de Economía de la Educación volume 9, in: Adela García Aracil & Isabel Neira Gómez (ed.), Investigaciones de Economía de la Educación 9, edition 1, volume 9, chapter 51, pages 969-978, Asociación de Economía de la Educación.
    6. Nirav Mehta, 2019. "Measuring quality for use in incentive schemes: The case of “shrinkage” estimators," Quantitative Economics, Econometric Society, vol. 10(4), pages 1537-1577, November.
7. Silke Forbes & Nora Gordon, 2012. "When Educators Are the Learners: Private Contracting by Public Schools," The B.E. Journal of Economic Analysis & Policy, De Gruyter, vol. 12(1), pages 1-29, July.
    8. Mariesa A. Herrmann & Jonah E. Rockoff, 2012. "Worker Absence and Productivity: Evidence from Teaching," Journal of Labor Economics, University of Chicago Press, vol. 30(4), pages 749-782.
9. Daniel M. Bolt & Xiangyi Liao, 2022. "Item Complexity: A Neglected Psychometric Feature of Test Items?," Psychometrika, Springer; The Psychometric Society, vol. 87(4), pages 1195-1213, December.
    10. Sonia Bhalotra & Martin Karlsson & Therese Nilsson & Nina Schwarz, 2022. "Infant Health, Cognitive Performance, and Earnings: Evidence from Inception of the Welfare State in Sweden," The Review of Economics and Statistics, MIT Press, vol. 104(6), pages 1138-1156, November.
    11. Wan, Sirui & Bond, Timothy N. & Lang, Kevin & Clements, Douglas H. & Sarama, Julie & Bailey, Drew H., 2021. "Is intervention fadeout a scaling artefact?," Economics of Education Review, Elsevier, vol. 82(C).
    12. Joshua B. Gilbert & Zachary Himmelsbach & James Soland & Mridul Joshi & Benjamin W. Domingue, 2024. "Estimating Heterogeneous Treatment Effects with Item-Level Outcome Data: Insights from Item Response Theory," Papers 2405.00161, arXiv.org, revised Aug 2024.
    13. Carsten Schroeder & Shlomo Yitzhaki, 2020. "Exploring the robustness of country rankings by educational attainment," Journal of Economics, Springer, vol. 129(3), pages 271-296, April.
    14. Eric R. Nielsen, 2015. "Achievement Gap Estimates and Deviations from Cardinal Comparability," Finance and Economics Discussion Series 2015-40, Board of Governors of the Federal Reserve System (U.S.).
    15. Eric R. Nielsen, 2015. "The Income-Achievement Gap and Adult Outcome Inequality," Finance and Economics Discussion Series 2015-41, Board of Governors of the Federal Reserve System (U.S.).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Brian A. Jacob & Lars Lefgren & David Sims, 2008. "The Persistence of Teacher-Induced Learning Gains," NBER Working Papers 14065, National Bureau of Economic Research, Inc.
    2. Stacy, Brian & Guarino, Cassandra & Wooldridge, Jeffrey, 2018. "Does the precision and stability of value-added estimates of teacher performance depend on the types of students they serve?," Economics of Education Review, Elsevier, vol. 64(C), pages 50-74.
    3. Douglas O. Staiger & Jonah E. Rockoff, 2010. "Searching for Effective Teachers with Imperfect Information," Journal of Economic Perspectives, American Economic Association, vol. 24(3), pages 97-118, Summer.
    4. Allison Atteberry & Susanna Loeb & James Wyckoff, 2013. "Do First Impressions Matter? Improvement in Early Career Teacher Effectiveness," NBER Working Papers 19096, National Bureau of Economic Research, Inc.
    5. repec:mpr:mprres:8135 is not listed on IDEAS
    6. Koedel, Cory & Mihaly, Kata & Rockoff, Jonah E., 2015. "Value-added modeling: A review," Economics of Education Review, Elsevier, vol. 47(C), pages 180-195.
    7. Azam, Mehtabul & Kingdon, Geeta Gandhi, 2015. "Assessing teacher quality in India," Journal of Development Economics, Elsevier, vol. 117(C), pages 74-83.
    8. Dan Goldhaber & Michael Hansen, 2013. "Is it Just a Bad Class? Assessing the Long-term Stability of Estimated Teacher Performance," Economica, London School of Economics and Political Science, vol. 80(319), pages 589-612, July.
    9. Guarino, Cassandra M. & Reckase, Mark D. & Stacy, Brian & Wooldridge, Jeffrey M., 2014. "A Comparison of Growth Percentile and Value-Added Models of Teacher Performance," IZA Discussion Papers 7973, Institute of Labor Economics (IZA).
10. Eric Parsons & Cory Koedel & Li Tan, 2019. "Accounting for Student Disadvantage in Value-Added Models," Journal of Educational and Behavioral Statistics, vol. 44(2), pages 144-179, April.
    11. Eric Isenberg & Heinrich Hock, 2010. "Measuring School and Teacher Value Added for IMPACT and TEAM in DC Public Schools," Mathematica Policy Research Reports 42dd1ff2d7eb46948f98d8e9c, Mathematica Policy Research.
12. Peter Z. Schochet & Hanley S. Chiang, 2013. "What Are Error Rates for Classifying Teacher and School Performance Using Value-Added Models?," Journal of Educational and Behavioral Statistics, vol. 38(2), pages 142-171, April.
    13. Liz Potamites & Kevin Booker & Duncan Chaplin & Eric Isenberg, "undated". "Measuring School and Teacher Effectiveness in the EPIC Charter School Consortium—Year 2," Mathematica Policy Research Reports 15116b94f80d4636a5d9c3d75, Mathematica Policy Research.
    14. Matthew A. Kraft, 2015. "Teacher Layoffs, Teacher Quality, and Student Achievement: Evidence from a Discretionary Layoff Policy," Education Finance and Policy, MIT Press, vol. 10(4), pages 467-507, October.
    15. Evan Riehl & Meredith Welch, 2023. "Accountability, Test Prep Incentives, and the Design of Math and English Exams," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 42(1), pages 60-96, January.
    16. Liz Potamites & Duncan Chaplin & Eric Isenberg & Kevin Booker, "undated". "Measuring School Effectiveness in Memphis—Year 2," Mathematica Policy Research Reports 9bdcadc715c24780b09c11e9e, Mathematica Policy Research.
    17. repec:mpr:mprres:6418 is not listed on IDEAS
    18. Eric Isenberg & Heinrich Hock, "undated". "Measuring School and Teacher Value Added in DC, 2011-2012 School Year," Mathematica Policy Research Reports 429d075abf05480eb0fbb4831, Mathematica Policy Research.
    19. repec:mpr:mprres:6417 is not listed on IDEAS
    20. Heikki Pursiainen & Mika Kortelainen & Jenni Pääkkönen, 2014. "Impact of School Quality on Educational Attainment - Evidence from Finnish High Schools," ERSA conference papers ersa14p711, European Regional Science Association.
    21. repec:mpr:mprres:7817 is not listed on IDEAS
    22. Joan Martinez, 2022. "The Long-Term Effects of Teachers' Gender Stereotypes," Papers 2212.08220, arXiv.org, revised Jul 2023.
    23. repec:mpr:mprres:7019 is not listed on IDEAS
    24. Vosters, Kelly N. & Guarino, Cassandra M. & Wooldridge, Jeffrey M., 2018. "Understanding and evaluating the SAS® EVAAS® Univariate Response Model (URM) for measuring teacher effectiveness," Economics of Education Review, Elsevier, vol. 66(C), pages 191-205.
    25. Elias Walsh & Stephen Lipscomb, "undated". "Classroom Observations from Phase 2 of the Pennsylvania Teacher Evaluation Pilot: Assessing Internal Consistency, Score Variation, and Relationships with Value Added," Mathematica Policy Research Reports a6b29a4a217f42a09d5206cfe, Mathematica Policy Research.

    More about this item

    JEL classification:

    • H52 - Public Economics - - National Government Expenditures and Related Policies - - - Government Expenditures and Education
    • H75 - Public Economics - - State and Local Government; Intergovernmental Relations - - - State and Local Government: Health, Education, and Welfare
    • I21 - Health, Education, and Welfare - - Education - - - Analysis of Education
    • I28 - Health, Education, and Welfare - - Education - - - Government Policy

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:aea:jecper:v:24:y:2010:i:3:p:167-82. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Michael P. Albert (email available below). General contact details of provider: https://edirc.repec.org/data/aeaaaea.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.