IDEAS home Printed from https://ideas.repec.org/a/plo/pone00/0227196.html

Adapting cognitive diagnosis computerized adaptive testing item selection rules to traditional item response theory

Author

Listed:
  • Miguel A Sorrel
  • Juan R Barrada
  • Jimmy de la Torre
  • Francisco José Abad

Abstract

Currently, there are two predominant approaches in adaptive testing. One, referred to as cognitive diagnosis computerized adaptive testing (CD-CAT), is based on cognitive diagnosis models; the other, traditional CAT, is based on item response theory. The present study evaluates the performance of two item selection rules (ISRs) originally developed in the CD-CAT framework, the double Kullback-Leibler information (DKL) and the generalized deterministic inputs, noisy “and” gate model discrimination index (GDI), in the context of traditional CAT. The accuracy and test security associated with these two ISRs are compared to those of point Fisher information and weighted KL in a simulation study. The impact of the trait level estimation method is also investigated. The results show that the new ISRs, particularly DKL, can improve the accuracy of CAT. The better accuracy of DKL, however, comes at the expense of a higher item overlap rate. Differences among the item selection rules shrink as the test gets longer. The two CD-CAT ISRs select different types of items: DKL favors items with the highest possible a parameter, whereas GDI favors items with the lowest possible c parameter. Regarding the trait level estimator, the expected a posteriori (EAP) method is generally better in the early stages of the CAT and converges with the maximum likelihood (ML) method when a medium to large number of items is involved. The use of DKL can be recommended in low-stakes settings where test security is less of a concern.
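The point Fisher information rule that serves as the baseline in the study can be sketched in a few lines. The sketch below is a toy illustration, not code from the article: it assumes a three-parameter logistic (3PL) response model, and the function names and the item-bank layout (tuples of a, b, c parameters) are invented for this example.

```python
import numpy as np

def p3pl(theta, a, b, c):
    """3PL probability of a correct response at ability theta."""
    return c + (1.0 - c) / (1.0 + np.exp(-a * (theta - b)))

def fisher_info(theta, a, b, c):
    """Fisher information of a 3PL item at ability theta."""
    p = p3pl(theta, a, b, c)
    q = 1.0 - p
    return (a ** 2) * (q / p) * ((p - c) / (1.0 - c)) ** 2

def select_item(theta_hat, bank, administered):
    """Point-Fisher-information rule: choose the not-yet-administered
    item with the highest information at the current ability estimate."""
    best, best_info = None, -np.inf
    for idx, (a, b, c) in enumerate(bank):
        if idx in administered:
            continue
        info = fisher_info(theta_hat, a, b, c)
        if info > best_info:
            best, best_info = idx, info
    return best

# Toy bank: three items differing only in difficulty b.
bank = [(1.0, -1.0, 0.0), (1.0, 0.0, 0.0), (1.0, 1.0, 0.0)]
print(select_item(0.0, bank, set()))  # picks the b = 0 item
```

When c = 0 the information reduces to a²pq, which is maximized where p = 0.5, i.e. for items whose difficulty b sits near the current ability estimate; with c &gt; 0 and varying a, the rule tends to favor high-a items, which is the mechanism behind the overlap-rate concerns the study raises for greedy ISRs.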

Suggested Citation

  • Miguel A Sorrel & Juan R Barrada & Jimmy de la Torre & Francisco José Abad, 2020. "Adapting cognitive diagnosis computerized adaptive testing item selection rules to traditional item response theory," PLOS ONE, Public Library of Science, vol. 15(1), pages 1-17, January.
  • Handle: RePEc:plo:pone00:0227196
    DOI: 10.1371/journal.pone.0227196

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0227196
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0227196&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0227196?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Ying Cheng, 2009. "When Cognitive Diagnosis Meets Computerized Adaptive Testing: CD-CAT," Psychometrika, Springer;The Psychometric Society, vol. 74(4), pages 619-632, December.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Qingrong Tan & Yan Cai & Fen Luo & Dongbo Tu, 2023. "Development of a High-Accuracy and Effective Online Calibration Method in CD-CAT Based on Gini Index," Journal of Educational and Behavioral Statistics, , vol. 48(1), pages 103-141, February.
    2. Yan Li & Chao Huang & Jia Liu, 2023. "Diagnosing Primary Students’ Reading Progression: Is Cognitive Diagnostic Computerized Adaptive Testing the Way Forward?," Journal of Educational and Behavioral Statistics, , vol. 48(6), pages 842-865, December.
    3. Chia-Yi Chiu & Yuan-Pei Chang, 2021. "Advances in CD-CAT: The General Nonparametric Item Selection Method," Psychometrika, Springer;The Psychometric Society, vol. 86(4), pages 1039-1057, December.
    4. Hong-Yun Liu & Xiao-Feng You & Wen-Yi Wang & Shu-Liang Ding & Hua-Hua Chang, 2013. "The Development of Computerized Adaptive Testing with Cognitive Diagnosis for an English Achievement Test in China," Journal of Classification, Springer;The Classification Society, vol. 30(2), pages 152-172, July.
    5. Wenyi Wang & Lihong Song & Teng Wang & Peng Gao & Jian Xiong, 2020. "A Note on the Relationship of the Shannon Entropy Procedure and the Jensen–Shannon Divergence in Cognitive Diagnostic Computerized Adaptive Testing," SAGE Open, , vol. 10(1), pages 21582440198, January.
    6. Ragip Terzi & Sedat Sen, 2019. "A Nondiagnostic Assessment for Diagnostic Purposes: Q-Matrix Validation and Item-Based Model Fit Evaluation for the TIMSS 2011 Assessment," SAGE Open, , vol. 9(1), pages 21582440198, February.
    7. Crabbe, Marjolein & Akinc, Deniz & Vandebroek, Martina, 2014. "Fast algorithms to generate individualized designs for the mixed logit choice model," Transportation Research Part B: Methodological, Elsevier, vol. 60(C), pages 1-15.
    8. Ping Chen & Tao Xin & Chun Wang & Hua-Hua Chang, 2012. "Online Calibration Methods for the DINA Model with Independent Attributes in CD-CAT," Psychometrika, Springer;The Psychometric Society, vol. 77(2), pages 201-222, April.
    9. Pasquale Anselmi & Egidio Robusto & Luca Stefanutti & Debora Chiusole, 2016. "An Upgrading Procedure for Adaptive Assessment of Knowledge," Psychometrika, Springer;The Psychometric Society, vol. 81(2), pages 461-482, June.
    10. Hung-Yu Huang, 2018. "Effects of Item Calibration Errors on Computerized Adaptive Testing under Cognitive Diagnosis Models," Journal of Classification, Springer;The Classification Society, vol. 35(3), pages 437-465, October.
    11. Chun Wang & Hua-Hua Chang, 2011. "Item Selection in Multidimensional Computerized Adaptive Testing—Gaining Information from Different Angles," Psychometrika, Springer;The Psychometric Society, vol. 76(3), pages 363-384, July.
    12. Xuliang Gao & Daxun Wang & Yan Cai & Dongbo Tu, 2020. "Cognitive Diagnostic Computerized Adaptive Testing for Polytomously Scored Items," Journal of Classification, Springer;The Classification Society, vol. 37(3), pages 709-729, October.
    13. Jingchen Liu & Zhiliang Ying & Stephanie Zhang, 2015. "A Rate Function Approach to Computerized Adaptive Testing for Cognitive Diagnosis," Psychometrika, Springer;The Psychometric Society, vol. 80(2), pages 468-490, June.
    14. Hua-Hua Chang, 2015. "Psychometrics Behind Computerized Adaptive Testing," Psychometrika, Springer;The Psychometric Society, vol. 80(1), pages 1-20, March.
    15. Magis, David & Barrada, Juan Ramon, 2017. "Computerized Adaptive Testing with R: Recent Updates of the Package catR," Journal of Statistical Software, Foundation for Open Access Statistics, vol. 76(c01).
    16. Nathan D. Minchen & Jimmy de la Torre & Ying Liu, 2017. "A Cognitive Diagnosis Model for Continuous Response," Journal of Educational and Behavioral Statistics, , vol. 42(6), pages 651-677, December.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0227196. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.