IDEAS home — Printed from https://ideas.repec.org/a/eee/ecolet/v132y2015icp24-27.html

An axiomatization of multiple-choice test scoring

Author

Listed:
  • Zapechelnyuk, Andriy

Abstract

This note provides an axiomatic justification for a simple scoring rule for multiple-choice tests. The rule permits the examinee to choose any number, k, of the available options and grants 1/k of the maximum score if the correct option is among those chosen, and zero otherwise. The rule satisfies several desirable properties: simplicity of implementation, non-negative scores, discouragement of random guessing, and reward for partial knowledge. It is a novel rule that has not previously been discussed or empirically tested in the literature.
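The 1/k rule described in the abstract can be sketched as follows. This is a minimal illustration of the rule as stated there, not the paper's own notation; the function name and option labels are hypothetical.

```python
def score(chosen, correct, max_score=1.0):
    """Score one multiple-choice item under the 1/k rule:
    selecting k options earns max_score / k if the correct option
    is among those selected, and zero otherwise."""
    k = len(chosen)
    if k == 0:
        return 0.0  # no option selected, no credit
    return max_score / k if correct in chosen else 0.0

# Examples with four options A-D, correct answer "B":
print(score({"B"}, "B"))                 # certain and correct -> 1.0
print(score({"A", "B"}, "B"))            # narrowed to two options -> 0.5
print(score({"A", "C", "D"}, "B"))       # correct option excluded -> 0.0
print(score({"A", "B", "C", "D"}, "B"))  # selecting everything -> 0.25
```

Note how the rule rewards partial knowledge without rewarding guessing: with n options, selecting all of them guarantees 1/n of the maximum score, which equals the expected score from blindly picking a single option, so an uninformed examinee gains nothing in expectation from guessing.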

Suggested Citation

  • Zapechelnyuk, Andriy, 2015. "An axiomatization of multiple-choice test scoring," Economics Letters, Elsevier, vol. 132(C), pages 24-27.
  • Handle: RePEc:eee:ecolet:v:132:y:2015:i:c:p:24-27
    DOI: 10.1016/j.econlet.2015.03.042

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0165176515001640
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.econlet.2015.03.042?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy of this item accessible through your library subscription

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Walstad, William B & Becker, William E, 1994. "Achievement Differences on Multiple-Choice and Essay Tests in Economics," American Economic Review, American Economic Association, vol. 84(2), pages 193-196, May.
    2. María Paz Espinosa & Javier Gardeazabal, 2013. "Do Students Behave Rationally in Multiple Choice Tests? Evidence from a Field Experiment," Journal of Economics and Management, College of Business, Feng Chia University, Taiwan, vol. 9(2), pages 107-135, July.
    3. Maya Bar-Hillel & David Budescu & Yigal Attali, 2005. "Scoring and keying multiple choice tests: A case study in irrationality," Mind & Society: Cognitive Studies in Economics and Social Sciences, Springer;Fondazione Rosselli, vol. 4(1), pages 3-12, June.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Rasmus A. X. Persson, 2023. "Theoretical evaluation of partial credit scoring of the multiple-choice test item," METRON, Springer;Sapienza Università di Roma, vol. 81(2), pages 143-161, August.
    2. Sylvain Béal & Sylvain Ferrière, 2019. "Examination design: an axiomatic approach," Working Papers 2019-05, CRESE.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Espinosa Maria Paz & Gardeazabal Javier, 2020. "The Gender-bias Effect of Test Scoring and Framing: A Concern for Personnel Selection and College Admission," The B.E. Journal of Economic Analysis & Policy, De Gruyter, vol. 20(3), pages 1-23, July.
    2. David Budescu & Yuanchao Bo, 2015. "Analyzing Test-Taking Behavior: Decision Theory Meets Psychometric Theory," Psychometrika, Springer;The Psychometric Society, vol. 80(4), pages 1105-1122, December.
    3. Nagore Iriberri & Pedro Rey-Biel, 2019. "Competitive Pressure Widens the Gender Gap in Performance: Evidence from a Two-stage Competition in Mathematics," The Economic Journal, Royal Economic Society, vol. 129(620), pages 1863-1893.
    4. María Paz Espinosa & Javier Gardeazabal, 2013. "Do Students Behave Rationally in Multiple Choice Tests? Evidence from a Field Experiment," Journal of Economics and Management, College of Business, Feng Chia University, Taiwan, vol. 9(2), pages 107-135, July.
    5. Jef Vanderoost & Rianne Janssen & Jan Eggermont & Riet Callens & Tinne De Laet, 2018. "Elimination testing with adapted scoring reduces guessing and anxiety in multiple-choice assessments, but does not increase grade average in comparison with negative marking," PLOS ONE, Public Library of Science, vol. 13(10), pages 1-27, October.
    6. Qian Wu & Monique Vanerum & Anouk Agten & Andrés Christiansen & Frank Vandenabeele & Jean-Michel Rigo & Rianne Janssen, 2021. "Certainty-Based Marking on Multiple-Choice Items: Psychometrics Meets Decision Theory," Psychometrika, Springer;The Psychometric Society, vol. 86(2), pages 518-543, June.
    7. Iriberri, Nagore & Rey-Biel, Pedro, 2021. "Brave boys and play-it-safe girls: Gender differences in willingness to guess in a large scale natural field experiment," European Economic Review, Elsevier, vol. 131(C).
    8. P. Everaert & N. Arthur, 2012. "Constructed-response versus multiple choice: the impact on performance in combination with gender," Working Papers of Faculty of Economics and Business Administration, Ghent University, Belgium 12/777, Ghent University, Faculty of Economics and Business Administration.
    9. Douglas McKee & Steven Zhu & George Orlov, 2023. "Econ-assessments.org: Automated Assessment of Economics Skills," Eastern Economic Journal, Palgrave Macmillan;Eastern Economic Association, vol. 49(1), pages 4-14, January.
    10. W. Robert Reed & Stephen Hickson, 2011. "More Evidence on the Use of Constructed-Response Questions in Principles of Economics Classes," International Review of Economic Education, Economics Network, University of Bristol, vol. 10(2), pages 28-49.
    11. Ellen Sewell, 2017. "Should I guess?," Applied Economics Letters, Taylor & Francis Journals, vol. 24(17), pages 1214-1217, October.
    12. Butler, Matthew J. & Cardon, James H. & Showalter, Mark H., 2017. "To choose or not to choose: An experiment in hedging strategies and risk preferences," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 67(C), pages 14-19.
    13. Pelin Akyol & James Key & Kala Krishna, 2022. "Hit or Miss? Test Taking Behavior in Multiple Choice Exams," Annals of Economics and Statistics, GENES, issue 147, pages 3-50.
    14. ,, 2008. "Risk taking and gender in hierarchies," Theoretical Economics, Econometric Society, vol. 3(4), December.
    16. W. Doyle Smith, 2002. "Applying Angelo's Teacher's Dozen to Undergraduate Introductory Economics Classes: A Call for Greater Interactive Learning," Eastern Economic Journal, Eastern Economic Association, vol. 28(4), pages 539-549, Fall.
    17. William E. Becker & Carol Johnston, 1999. "The Relationship between Multiple Choice and Essay Response Questions in Assessing Economics Understanding," The Economic Record, The Economic Society of Australia, vol. 75(4), pages 348-357, December.
    18. Montolio, Daniel & Taberner, Pere A., 2021. "Gender differences under test pressure and their impact on academic performance: A quasi-experimental design," Journal of Economic Behavior & Organization, Elsevier, vol. 191(C), pages 1065-1090.
    19. J. Ignacio Conde-Ruiz & Juan José Ganuza & Manuel García, 2020. "Gender Gap and Multiple Choice Exams in Public Selection Processes," Hacienda Pública Española / Review of Public Economics, IEF, vol. 235(4), pages 11-28, December.
    20. Bagues, Manuel & Perez-Villadoniga, Maria J., 2012. "Do recruiters prefer applicants with similar skills? Evidence from a randomized natural experiment," Journal of Economic Behavior & Organization, Elsevier, vol. 82(1), pages 12-20.

    More about this item

    Keywords

    Multiple-choice test; Scoring rules; Axiomatic approach

    JEL classification:

    • C44 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods: Special Topics - - - Operations Research; Statistical Decision Theory
    • A2 - General Economics and Teaching - - Economic Education and Teaching of Economics
    • I20 - Health, Education, and Welfare - - Education - - - General


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions; when requesting a correction, please mention this item's handle: RePEc:eee:ecolet:v:132:y:2015:i:c:p:24-27.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu. General contact details of provider: http://www.elsevier.com/locate/ecolet

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.