Printed from https://ideas.repec.org/a/gam/jscscx/v8y2019i2p46-d203505.html

Measurement Invariance of a Direct Behavior Rating Multi Item Scale across Occasions

Author

Listed:
  • Markus Gebhardt

    (Research in Inclusive Education, Faculty of Rehabilitation Science, Technical University of Dortmund, 44227 Dortmund, Germany)

  • Jeffrey M. DeVries

    (Research in Inclusive Education, Faculty of Rehabilitation Science, Technical University of Dortmund, 44227 Dortmund, Germany)

  • Jana Jungjohann

    (Research in Inclusive Education, Faculty of Rehabilitation Science, Technical University of Dortmund, 44227 Dortmund, Germany)

  • Gino Casale

    (Department of Special Education, University of Cologne, 50931 Cologne, Germany)

  • Andreas Gegenfurtner

    (Deggendorf Institute of Technology, Institute for Quality and Continuing Education, 94469 Deggendorf, Germany)

  • Jörg-Tobias Kuhn

    (Faculty of Rehabilitation Science, Educational Research Methods, Technical University of Dortmund, 44227 Dortmund, Germany)

Abstract

Direct Behavior Rating (DBR) as a behavioral progress monitoring tool can be designed as a longitudinal assessment with only short intervals between measurement points. The reliability of these instruments has mostly been evaluated in observational studies with small samples based on generalizability theory. However, for standardized use in the pedagogical field, a larger and broader sample is required in order to assess measurement invariance between different participant groups and over time. Therefore, we constructed a DBR, the Questionnaire for Monitoring Behavior in Schools (QMBS), with multiple items to measure the occurrence of specific externalizing and internalizing student classroom behaviors on a Likert scale (1 = never to 7 = always). In a pilot study, two trained raters observed 16 primary education students and rated their behavior across all items with satisfactory reliability. In the main study, 108 regular primary school students, 97 regular secondary students, and 14 students in a clinical setting were rated daily over one week (five measurement points). Item response theory (IRT) analyses confirmed the technical adequacy of the instrument, and latent growth models demonstrated its stability over time. Further development of the instrument and of study designs to implement DBRs is discussed.
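As a rough illustration of the IRT machinery behind such analyses (not the authors' actual model: the QMBS uses 7-point Likert items and therefore polytomous IRT, whereas this simplified sketch assumes dichotomous responses under the Rasch/1PL model), the item response probability and a maximum-likelihood ability estimate can be written as:

```python
import math

def rasch_prob(theta, b):
    """Probability of endorsing a dichotomous item with difficulty b
    for a person with ability theta, under the Rasch (1PL) model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def estimate_theta(responses, difficulties, n_iter=50):
    """Maximum-likelihood ability estimate via Newton-Raphson.

    responses    -- list of 0/1 item scores
    difficulties -- list of item difficulty parameters (same length)
    """
    theta = 0.0
    for _ in range(n_iter):
        ps = [rasch_prob(theta, b) for b in difficulties]
        grad = sum(x - p for x, p in zip(responses, ps))   # d log-lik / d theta
        hess = -sum(p * (1.0 - p) for p in ps)             # second derivative
        if abs(grad) < 1e-10:
            break
        theta -= grad / hess                               # Newton step
    return theta

# A mixed response pattern yields a finite, positive ability estimate.
theta_hat = estimate_theta([1, 0, 1], [-1.0, 0.0, 1.0])
```

Note that the plain MLE diverges for all-correct or all-incorrect response patterns and is biased in short tests; this is precisely what the weighted likelihood estimator of Warm (1989), listed in the references below, was designed to correct.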

Suggested Citation

  • Markus Gebhardt & Jeffrey M. DeVries & Jana Jungjohann & Gino Casale & Andreas Gegenfurtner & Jörg-Tobias Kuhn, 2019. "Measurement Invariance of a Direct Behavior Rating Multi Item Scale across Occasions," Social Sciences, MDPI, vol. 8(2), pages 1-14, February.
  • Handle: RePEc:gam:jscscx:v:8:y:2019:i:2:p:46-:d:203505

    Download full text from publisher

    File URL: https://www.mdpi.com/2076-0760/8/2/46/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2076-0760/8/2/46/
    Download Restriction: no

    References listed on IDEAS

    1. Thomas Warm, 1989. "Weighted likelihood estimation of ability in item response theory," Psychometrika, Springer;The Psychometric Society, vol. 54(3), pages 427-450, September.

    Citations

    Citations are extracted by the CitEc Project.

    Cited by:

    1. Cherier Mae G. Logroño & Celso L. Tagadiad, 2023. "Instructional Leadership and Ethical Climate as Determinants of School Connectedness," International Journal of Research and Innovation in Social Science, International Journal of Research and Innovation in Social Science (IJRISS), vol. 7(1), pages 750-769, January.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Torberg Falch & Justina AV Fischer, 2008. "Does a generous welfare state crowd out student achievement? Panel data evidence from international student tests," TWI Research Paper Series 31, Thurgauer Wirtschaftsinstitut, Universität Konstanz.
    2. Steger, Diana & Schroeders, Ulrich & Wilhelm, Oliver, 2019. "On the dimensionality of crystallized intelligence: A smartphone-based assessment," Intelligence, Elsevier, vol. 72(C), pages 76-85.
    3. Michela Battauz & Ruggero Bellio, 2011. "Structural Modeling of Measurement Error in Generalized Linear Models with Rasch Measures as Covariates," Psychometrika, Springer;The Psychometric Society, vol. 76(1), pages 40-56, January.
    4. Xiang Liu & James Yang & Hui Soo Chae & Gary Natriello, 2020. "Power Divergence Family of Statistics for Person Parameters in IRT Models," Psychometrika, Springer;The Psychometric Society, vol. 85(2), pages 502-525, June.
    5. Klaas Sijtsma & Jules L. Ellis & Denny Borsboom, 2024. "Recognize the Value of the Sum Score, Psychometrics’ Greatest Accomplishment," Psychometrika, Springer;The Psychometric Society, vol. 89(1), pages 84-117, March.
    6. Schmidt, Christoph & Fertig, Michael, 2002. "The Role of Background Factors for Reading Literacy: Straight National Scores in the PISA 2000 Study," CEPR Discussion Papers 3544, C.E.P.R. Discussion Papers.
    7. Robitzsch, Alexander, 2020. "About Still Nonignorable Consequences of (Partially) Ignoring Missing Item Responses in Large-scale Assessment," OSF Preprints hmy45, Center for Open Science.
    8. Fertig, Michael, 2003. "Educational Production, Endogenous Peer Group Formation and Class Composition – Evidence from the PISA 2000 Study," IZA Discussion Papers 714, Institute of Labor Economics (IZA).
    9. Elina Tsigeman & Sebastian Silas & Klaus Frieler & Maxim Likhanov & Rebecca Gelding & Yulia Kovas & Daniel Müllensiefen, 2022. "The Jack and Jill Adaptive Working Memory Task: Construction, Calibration and Validation," PLOS ONE, Public Library of Science, vol. 17(1), pages 1-29, January.
    10. Martin Biehler & Heinz Holling & Philipp Doebler, 2015. "Saddlepoint Approximations of the Distribution of the Person Parameter in the Two Parameter Logistic Model," Psychometrika, Springer;The Psychometric Society, vol. 80(3), pages 665-688, September.
    11. Liu, Huacong & Fernandez, Frank & Dutz, Gregor, 2022. "Educational attainment, use of numeracy at work, and gender wage gaps: Evidence from 12 middle-income countries," International Journal of Educational Development, Elsevier, vol. 92(C).
    12. Alexander Robitzsch, 2021. "A Comprehensive Simulation Study of Estimation Methods for the Rasch Model," Stats, MDPI, vol. 4(4), pages 1-23, October.
    13. J. R. Lockwood & Katherine E. Castellano & Benjamin R. Shear, 2018. "Flexible Bayesian Models for Inferences From Coarsened, Group-Level Achievement Data," Journal of Educational and Behavioral Statistics, , vol. 43(6), pages 663-692, December.
    14. J. R. Lockwood & Daniel F. McCaffrey, 2017. "Simulation-Extrapolation with Latent Heteroskedastic Error Variance," Psychometrika, Springer;The Psychometric Society, vol. 82(3), pages 717-736, September.
    15. Sandip Sinharay, 2015. "The Asymptotic Distribution of Ability Estimates," Journal of Educational and Behavioral Statistics, , vol. 40(5), pages 511-528, October.
    16. David Andrich, 2010. "Sufficiency and Conditional Estimation of Person Parameters in the Polytomous Rasch Model," Psychometrika, Springer;The Psychometric Society, vol. 75(2), pages 292-308, June.
    17. Ivo Molenaar, 1998. "Data, model, conclusion, doing it again," Psychometrika, Springer;The Psychometric Society, vol. 63(4), pages 315-340, December.
    18. Michael Fertig, 2002. "Educational Production, Endogenous Peer Group Formation and Class Composition – Evidence From the PISA 2000 Study," RWI Discussion Papers 0002, Rheinisch-Westfälisches Institut für Wirtschaftsforschung.
    19. Sandip Sinharay & Jens Ledet Jensen, 2019. "Higher-Order Asymptotics and Its Application to Testing the Equality of the Examinee Ability Over Two Sets of Items," Psychometrika, Springer;The Psychometric Society, vol. 84(2), pages 484-510, June.
    20. Oberrauch, Luis & Kaiser, Tim, 2020. "Economic competence in early secondary school: Evidence from a large-scale assessment in Germany," International Review of Economics Education, Elsevier, vol. 35(C).


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jscscx:v:8:y:2019:i:2:p:46-:d:203505. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.