
Elimination testing with adapted scoring reduces guessing and anxiety in multiple-choice assessments, but does not increase grade average in comparison with negative marking

Author

Listed:
  • Jef Vanderoost
  • Rianne Janssen
  • Jan Eggermont
  • Riet Callens
  • Tinne De Laet

Abstract

Background and hypotheses: This study is the first to offer an in-depth comparison of elimination testing with the scoring rule of Arnold & Arnold (hereafter referred to as elimination testing with adapted scoring) and negative marking. It is motivated by the search for an alternative to negative marking that still discourages guessing, but is less disadvantageous with respect to non-relevant student characteristics such as risk aversion and does not result in grade inflation. The comparison is structured around seven hypotheses: in comparison with negative marking, elimination testing with adapted scoring leads to (1) a similar average score (no grade inflation); (2) students expressing their partial knowledge; (3) a decrease in the number of blank answers; (4) no gender bias in the number of blank answers; (5) a reduction in guessing; (6) a decrease in self-reported test anxiety; and (7) students preferring elimination testing with adapted scoring over negative marking.

Methodology: To investigate these hypotheses, this study implemented elimination testing with adapted scoring and negative marking in real exam settings in two courses in a Faculty of Medicine at a large university. Owing to changes in the Master of Medicine programme, the same two courses were taught to both first- and second-year master's students in the same semester. Because both student groups could take the same exam with different test instructions and scoring methods, this provided a unique opportunity to compare elimination testing with adapted scoring and negative marking in a high-stakes testing situation. After receiving their exam grades, students completed a questionnaire about their experiences.

Findings: A statistical analysis taking student ability and gender into account showed that elimination testing with adapted scoring is a valuable alternative to negative marking when a scoring method that discourages guessing is sought. In contrast to traditional scoring of elimination testing, elimination testing with adapted scoring does not result in grade inflation in comparison with negative marking. The study showed that elimination testing with adapted scoring reduces blank answers, and strong indications were found that it reduces guessing in comparison with negative marking. Finally, students preferred elimination testing with adapted scoring over negative marking and reported lower stress levels under elimination testing with adapted scoring.
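The abstract contrasts two families of scoring rules. As a rough illustration only, the sketch below implements classic formula scoring (the usual form of negative marking) next to a Coombs-style elimination score; the exact Arnold & Arnold adaptation studied in the paper is not reproduced on this page, so the elimination rule here is a stand-in, and the four-option item format, function names, and example answers are all assumptions.

```python
# Illustrative sketch only, not the paper's method: classic formula scoring
# (negative marking) versus a Coombs-style elimination score. The Arnold &
# Arnold adaptation compared in the study uses different weights that are
# not given on this page. A four-option item (K = 4) is assumed throughout.

K = 4  # assumed number of answer options per item

def negative_marking(chosen, correct):
    """Formula scoring: +1 if correct, -1/(K-1) if wrong, 0 if blank,
    so the expected score of a blind guess is exactly zero."""
    if chosen is None:  # blank answer
        return 0.0
    return 1.0 if chosen == correct else -1.0 / (K - 1)

def elimination_score(eliminated, correct):
    """Coombs-style elimination scoring (stand-in): credit of 1/(K-1)
    per distractor eliminated, -1 if the correct option is eliminated.
    Eliminating all K-1 distractors (full knowledge) scores 1."""
    if correct in eliminated:
        return -1.0
    return len(eliminated) / (K - 1)

# A student who is sure only that "C" and "D" are wrong can express that
# partial knowledge under elimination scoring:
print(elimination_score({"C", "D"}, correct="A"))  # 0.666...

# Under negative marking the same student must either leave the item blank
# (score 0) or guess between "A" and "B", with expected score
# 0.5 * 1 + 0.5 * (-1/3) = 1/3.
print(negative_marking(None, correct="A"))  # 0.0
```

The contrast shows why elimination formats can reduce both blank answers and guessing: partial knowledge earns partial credit directly, so there is less incentive either to gamble or to omit.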

Suggested Citation

  • Jef Vanderoost & Rianne Janssen & Jan Eggermont & Riet Callens & Tinne De Laet, 2018. "Elimination testing with adapted scoring reduces guessing and anxiety in multiple-choice assessments, but does not increase grade average in comparison with negative marking," PLOS ONE, Public Library of Science, vol. 13(10), pages 1-27, October.
  • Handle: RePEc:plo:pone00:0203931
    DOI: 10.1371/journal.pone.0203931

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0203931
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0203931&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0203931?utm_source=ideas
LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription.

    References listed on IDEAS

    1. Pelin Akyol & James Key & Kala Krishna, 2022. "Hit or Miss? Test Taking Behavior in Multiple Choice Exams," Annals of Economics and Statistics, GENES, issue 147, pages 3-50.
    2. Riener, Gerhard & Wagner, Valentin, 2017. "Shying away from demanding tasks? Experimental evidence on gender differences in answering multiple-choice questions," Economics of Education Review, Elsevier, vol. 59(C), pages 43-62.
    3. Pekkarinen, Tuomas, 2015. "Gender differences in behaviour under competitive pressure: Evidence on omission patterns in university entrance examinations," Journal of Economic Behavior & Organization, Elsevier, vol. 115(C), pages 94-110.
    4. Eckel, Catherine C. & Grossman, Philip J., 2008. "Men, Women and Risk Aversion: Experimental Evidence," Handbook of Experimental Economics Results, in: Charles R. Plott & Vernon L. Smith (ed.), Handbook of Experimental Economics Results, edition 1, volume 1, chapter 113, pages 1061-1073, Elsevier.
    5. A Elizabeth Bond & Owen Bodger & David O F Skibinski & D Hugh Jones & Colin J Restall & Edward Dudley & Geertje van Keulen, 2013. "Negatively-Marked MCQ Assessments That Reward Partial Knowledge Do Not Introduce Gender Bias Yet Increase Student Performance and Satisfaction and Reduce Anxiety," PLOS ONE, Public Library of Science, vol. 8(2), pages 1-10, February.
    6. Maya Bar-Hillel & David Budescu & Yigal Attali, 2005. "Scoring and keying multiple choice tests: A case study in irrationality," Mind & Society: Cognitive Studies in Economics and Social Sciences, Springer;Fondazione Rosselli, vol. 4(1), pages 3-12, June.
    7. Katherine Baldiga, 2014. "Gender Differences in Willingness to Guess," Management Science, INFORMS, vol. 60(2), pages 434-448, February.

    Citations

Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Espinosa Maria Paz & Gardeazabal Javier, 2020. "The Gender-bias Effect of Test Scoring and Framing: A Concern for Personnel Selection and College Admission," The B.E. Journal of Economic Analysis & Policy, De Gruyter, vol. 20(3), pages 1-23, July.
    2. Rasmus A. X. Persson, 2023. "Theoretical evaluation of partial credit scoring of the multiple-choice test item," METRON, Springer;Sapienza Università di Roma, vol. 81(2), pages 143-161, August.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Iriberri, Nagore & Rey-Biel, Pedro, 2021. "Brave boys and play-it-safe girls: Gender differences in willingness to guess in a large scale natural field experiment," European Economic Review, Elsevier, vol. 131(C).
    2. Anaya, Lina & Iriberri, Nagore & Rey-Biel, Pedro & Zamarro, Gema, 2022. "Understanding performance in test taking: The role of question difficulty order," Economics of Education Review, Elsevier, vol. 90(C).
    3. Montolio, Daniel & Taberner, Pere A., 2021. "Gender differences under test pressure and their impact on academic performance: A quasi-experimental design," Journal of Economic Behavior & Organization, Elsevier, vol. 191(C), pages 1065-1090.
    4. Saygin, Perihan O. & Atwater, Ann, 2021. "Gender differences in leaving questions blank on high-stakes standardized tests," Economics of Education Review, Elsevier, vol. 84(C).
    5. Qian Wu & Monique Vanerum & Anouk Agten & Andrés Christiansen & Frank Vandenabeele & Jean-Michel Rigo & Rianne Janssen, 2021. "Certainty-Based Marking on Multiple-Choice Items: Psychometrics Meets Decision Theory," Psychometrika, Springer;The Psychometric Society, vol. 86(2), pages 518-543, June.
    6. Riener, Gerhard & Wagner, Valentin, 2018. "Gender differences in willingness to compete and answering multiple-choice questions—The role of age," Economics Letters, Elsevier, vol. 164(C), pages 86-89.
    7. J. Ignacio Conde-Ruiz & Juan José Ganuza & Manuel García, 2020. "Gender Gap and Multiple Choice Exams in Public Selection Processes," Hacienda Pública Española / Review of Public Economics, IEF, vol. 235(4), pages 11-28, December.
    8. Pau Balart & Lara Ezquerra & Iñigo Hernandez-Arenaz, 2022. "Framing effects on risk-taking behavior: evidence from a field experiment in multiple-choice tests," Experimental Economics, Springer;Economic Science Association, vol. 25(4), pages 1268-1297, September.
    9. Wagner, Valentin, 2016. "Seeking risk or answering smart? Framing in elementary schools," DICE Discussion Papers 227, Heinrich Heine University Düsseldorf, Düsseldorf Institute for Competition Economics (DICE).
    10. Espinosa Maria Paz & Gardeazabal Javier, 2020. "The Gender-bias Effect of Test Scoring and Framing: A Concern for Personnel Selection and College Admission," The B.E. Journal of Economic Analysis & Policy, De Gruyter, vol. 20(3), pages 1-23, July.
    11. Claire Duquennois, 2022. "Fictional Money, Real Costs: Impacts of Financial Salience on Disadvantaged Students," American Economic Review, American Economic Association, vol. 112(3), pages 798-826, March.
    12. Riener, Gerhard & Wagner, Valentin, 2017. "Shying away from demanding tasks? Experimental evidence on gender differences in answering multiple-choice questions," Economics of Education Review, Elsevier, vol. 59(C), pages 43-62.
    13. Maddalena Davoli, 2023. "A, B, or C? Question Format and the Gender Gap in Financial Literacy," Economics of Education Working Paper Series 0206, University of Zurich, Department of Business Administration (IBW).
    14. Wagner, Valentin, 2016. "Seeking Risk or Answering Smart? Experimental Evidence on Framing Effects in Elementary Schools," VfS Annual Conference 2016 (Augsburg): Demographic Change 145678, Verein für Socialpolitik / German Economic Association.
    15. Bottazzi, Laura & Lusardi, Annamaria, 2021. "Stereotypes in financial literacy: Evidence from PISA," Journal of Corporate Finance, Elsevier, vol. 71(C).
    16. Agrawal, Anjali & Green, Ellen P. & Lavergne, Lisa, 2019. "Gender effects in the credence goods market: An experimental study," Economics Letters, Elsevier, vol. 174(C), pages 195-199.
    17. Tabea Bucher-Koenen & Rob Alessie & Annamaria Lusardi & Maarten van Rooij, 2021. "Fearless Woman. Financial Literacy and Stock Market Participation," Working Papers 708, DNB.
    18. Catherine Eckel & Lata Gangadharan & Philip J. Grossman & Nina Xue, 2021. "The gender leadership gap: insights from experiments," Chapters, in: Ananish Chaudhuri (ed.), A Research Agenda for Experimental Economics, chapter 7, pages 137-162, Edward Elgar Publishing.
    19. Pelin Akyol, 2021. "Comparison of Computer-based and Paper-based Exams: Evidence from PISA," Bogazici Journal, Review of Social, Economic and Administrative Studies, Bogazici University, Department of Economics, vol. 35(2), pages 137-150.
    20. Silvia Griselda, 2020. "Different Questions, Different Gender Gap: Can the Format of Questions Explain the Gender Gap in Mathematics?," 2020 Papers pgr710, Job Market Papers.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0203931. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.