Printed from https://ideas.repec.org/a/taf/edecon/v14y2006i2p235-250.html

Student Assessments, Non-test-takers, and School Accountability

Author

Listed:
  • Robert Lemke
  • Claus Hoerandner
  • Robert McMahon

Abstract

Much attention has focused recently on using student test scores to evaluate public schools. The No Child Left Behind Act of 2002 requires states to test students and evaluate each school's progress toward having all students meet or exceed state standards. Under the law, however, schools need to test only 95% of their students. When some students do not take the test, variability arises in a school's evaluation because the score of each student who did not take the test remains unknown. Using a statewide assessment administered to 11th graders in Illinois, we investigate this source of variation. In our data, 8% of students do not take the test. By applying a bounding technique to the unknown scores of the non-test-takers, we show that classifying schools as failing or passing against a fixed threshold can frequently be misleading. We also provide evidence that some schools may be strategically selecting some students to not take the test and, by doing so, increasing the school's test scores.
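
The paper itself details the exact bounding procedure; as a rough illustration of the idea only, the sketch below (in Python, with made-up enrollment figures and a hypothetical 50% pass-rate threshold, none of which come from the paper) computes worst-case and best-case bounds on a school's pass rate when the non-test-takers' scores are unknown.

    # Illustrative sketch only: the school size, pass counts, and threshold
    # below are hypothetical, not figures from the paper.
    def pass_rate_bounds(n_passed, n_tested, n_enrolled):
        """Bounds on the share of enrolled students meeting the standard.

        n_passed   -- tested students who met or exceeded the standard
        n_tested   -- students who actually took the test
        n_enrolled -- all students who were supposed to be tested
        """
        n_missing = n_enrolled - n_tested
        lower = n_passed / n_enrolled                # every non-test-taker fails
        upper = (n_passed + n_missing) / n_enrolled  # every non-test-taker passes
        return lower, upper

    # A school where 8% of students skip the test, as in the Illinois data.
    lower, upper = pass_rate_bounds(n_passed=460, n_tested=920, n_enrolled=1000)
    threshold = 0.50  # hypothetical fixed threshold for classifying the school
    print(f"true pass rate lies between {lower:.1%} and {upper:.1%}")
    if lower < threshold <= upper:
        print("pass/fail classification against the threshold is ambiguous")

When the threshold falls inside the bounds, as here (46% to 54% around 50%), the non-test-takers alone determine whether the school is classified as passing or failing, which is the ambiguity the abstract highlights.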

Suggested Citation

  • Robert Lemke & Claus Hoerandner & Robert McMahon, 2006. "Student Assessments, Non-test-takers, and School Accountability," Education Economics, Taylor & Francis Journals, vol. 14(2), pages 235-250.
  • Handle: RePEc:taf:edecon:v:14:y:2006:i:2:p:235-250
    DOI: 10.1080/09645290600622970

    Download full text from publisher

    File URL: http://www.tandfonline.com/10.1080/09645290600622970
    Download Restriction: Access to full text is restricted to subscribers.

    File URL: https://libkey.io/10.1080/09645290600622970?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a version you can access through your library subscription

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. John V. Pepper, 1999. "What Do Welfare-to-Work Demonstrations Reveal to Welfare Reformers?," JCPR Working Papers 105, Northwestern University/University of Chicago Joint Center for Poverty Research.
    2. Manski, Charles F., 1992. "Identification Problems In The Social Sciences," SSRI Workshop Series 292716, University of Wisconsin-Madison, Social Systems Research Institute.
    3. Thomas J. Kane & Douglas O. Staiger, 2002. "The Promise and Pitfalls of Using Imprecise School Accountability Measures," Journal of Economic Perspectives, American Economic Association, vol. 16(4), pages 91-114, Fall.
    4. John V. Pepper, 2003. "Using Experiments to Evaluate Performance Standards: What Do Welfare-to-Work Demonstrations Reveal to Welfare Reformers?," Journal of Human Resources, University of Wisconsin Press, vol. 38(4).
    5. Caroline M. Hoxby, 2000. "The Effects of Class Size on Student Achievement: New Evidence from Population Variation," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 115(4), pages 1239-1285.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Cuesta, José Ignacio & González, Felipe & Larroulet Philippi, Cristian, 2020. "Distorted quality signals in school markets," Journal of Development Economics, Elsevier, vol. 147(C).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Claus M. Hoerandner & Robert J. Lemke, 2006. "Can No Child Left Behind Close The Gaps In Pass Rates On Standardized Tests?," Contemporary Economic Policy, Western Economic Association International, vol. 24(1), pages 1-17, January.
    2. Oscar Mitnik, 2008. "How do Training Programs Assign Participants to Training? Characterizing the Assignment Rules of Government Agencies for Welfare-to-Work Programs in California," Working Papers 0907, University of Miami, Department of Economics.
    3. Charles F. Manski & John Newman & John V. Pepper, "undated". "Using Performance Standards to Evaluate Social Programs with Incomplete Outcome Data: General Issues and Application to a Higher Education Block Grant Program," IPR working papers 00-1, Institute for Policy Research at Northwestern University.
    4. Carrie Conaway & Dan Goldhaber, 2020. "Appropriate Standards of Evidence for Education Policy Decision Making," Education Finance and Policy, MIT Press, vol. 15(2), pages 383-396, Spring.
    5. Evan Riehl & Meredith Welch, 2023. "Accountability, Test Prep Incentives, and the Design of Math and English Exams," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 42(1), pages 60-96, January.
    6. Guido W. Imbens, 2003. "Sensitivity to Exogeneity Assumptions in Program Evaluation," American Economic Review, American Economic Association, vol. 93(2), pages 126-132, May.
    7. Ma, Lingjie & Koenker, Roger, 2006. "Quantile regression methods for recursive structural equation models," Journal of Econometrics, Elsevier, vol. 134(2), pages 471-506, October.
    8. Tahir Andrabi & Jishnu Das & Asim Ijaz Khwaja, 2017. "Report Cards: The Impact of Providing School and Child Test Scores on Educational Markets," American Economic Review, American Economic Association, vol. 107(6), pages 1535-1563, June.
    9. Imbens, Guido W. & Pizer, William A., 2000. "The Analysis of Randomized Experiments with Missing Data," Discussion Papers 10596, Resources for the Future.
    10. Giacomo De Giorgi & Michele Pellizzari & William Gui Woolston, 2012. "Class Size And Class Heterogeneity," Journal of the European Economic Association, European Economic Association, vol. 10(4), pages 795-830, August.
    11. Battaglia, Marianna & Lebedinski, Lara, 2015. "Equal Access to Education: An Evaluation of the Roma Teaching Assistant Program in Serbia," World Development, Elsevier, vol. 76(C), pages 62-81.
    12. Stephan Litschig, 2008. "Financing local development: Quasi-experimental evidence from municipalities in Brazil, 1980-1991," Economics Working Papers 1142, Department of Economics and Business, Universitat Pompeu Fabra, revised Jun 2012.
    13. Berthélemy Michel & Bonev Petyo & Dussaux Damien & Söderberg Magnus, 2019. "Methods for strengthening a weak instrument in the case of a persistent treatment," Studies in Nonlinear Dynamics & Econometrics, De Gruyter, vol. 23(1), pages 1-30, February.
    14. Martin Schlotter & Guido Schwerdt & Ludger Woessmann, 2011. "Econometric methods for causal evaluation of education policies and practices: a non-technical guide," Education Economics, Taylor & Francis Journals, vol. 19(2), pages 109-137.
    15. Jacob M. Markman & Eric A. Hanushek & John F. Kain & Steven G. Rivkin, 2003. "Does peer ability affect student achievement?," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 18(5), pages 527-544.
    16. Foreman-Peck, James & Foreman-Peck, Lorraine, 2006. "Should schools be smaller? The size-performance relationship for Welsh schools," Economics of Education Review, Elsevier, vol. 25(2), pages 157-171, April.
    17. Koedel Cory & Leatherman Rebecca & Parsons Eric, 2012. "Test Measurement Error and Inference from Value-Added Models," The B.E. Journal of Economic Analysis & Policy, De Gruyter, vol. 12(1), pages 1-37, November.
    18. Sascha O. Becker & Marco Caliendo, 2007. "Sensitivity analysis for average treatment effects," Stata Journal, StataCorp LP, vol. 7(1), pages 71-83, February.
    19. Thushyanthan Baskaran & Zohal Hessami, 2017. "Political alignment and intergovernmental transfers in parliamentary systems: evidence from Germany," Public Choice, Springer, vol. 171(1), pages 75-98, April.
    20. Michael Bates & Michael Dinerstein & Andrew C. Johnston & Isaac Sorkin, 2022. "Teacher Labor Market Equilibrium and Student Achievement," CESifo Working Paper Series 9551, CESifo.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:taf:edecon:v:14:y:2006:i:2:p:235-250. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Chris Longhurst (email available below). General contact details of provider: http://www.tandfonline.com/CEDE20.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.