
Assessment of observer agreement for matched repeated binary measurements

Author

Listed:
  • Gao, Jingjing
  • Pan, Yi
  • Haber, Michael

Abstract

Agreement is a broad term that covers both the accuracy and the precision of measurements. Assessment of observer agreement is based on the similarity between readings made on the same subject by different observers. Agreement on categorical observations is traditionally assessed with kappa or weighted kappa coefficients. However, kappa statistics have been criticized because they attain implausible values when the marginal distributions are skewed and/or unbalanced. New scaled indices, called coefficients of individual agreement (CIAs), have been developed for assessing individual observer agreement by comparing the observed disagreement between two observers to the disagreement between replicated observations made by the same observer on the same subject. The underlying notion is that, when agreement is good, the disagreement between the two observers is not expected to exceed the disagreement between replicated observations of the same observer; hence, satisfactory agreement is established when these quantities are similar. This idea is extended, and a new method based on a generalized linear mixed model is proposed for estimating the CIAs for binary data consisting of matched sets of repeated measurements made by the same observer under different conditions. The conditions may represent different time points, raters, laboratories, treatments, etc. The new approach allows both the values of the measured variable and the magnitude of agreement to vary across conditions. The reliability of the estimation method is examined via a simulation study. Data from a study aiming to determine the validity of mammography-based diagnosis of breast cancer are used to illustrate the new concepts and methods.
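
To make the comparison at the heart of the CIAs concrete, here is a minimal sketch for binary readings with replicates: it estimates inter-observer disagreement and intra-observer (replication) disagreement by simple empirical proportions and reports their ratio. This is an illustration of the idea only, not the authors' GLMM-based estimator; it ignores the matched-conditions structure of the paper, and the function names, the equal pooling of the two observers' intra-observer disagreements, and the toy data are assumptions made here for clarity.

    import numpy as np

    def inter_disagreement(obs1, obs2):
        # obs1, obs2: (n_subjects, n_replicates) arrays of 0/1 readings by two observers.
        # For each subject, average the indicator "readings differ" over all cross pairs
        # of the two observers' replicates, then average over subjects.
        return float(np.mean([np.mean(a[:, None] != b[None, :]) for a, b in zip(obs1, obs2)]))

    def intra_disagreement(obs):
        # Average disagreement between distinct replicated readings of the same observer.
        n_rep = obs.shape[1]
        pairs = [(j, k) for j in range(n_rep) for k in range(j + 1, n_rep)]
        return float(np.mean([np.mean([row[j] != row[k] for j, k in pairs]) for row in obs]))

    def cia(obs1, obs2):
        # Coefficient of individual agreement (illustrative form): ratio of pooled
        # intra-observer disagreement to inter-observer disagreement.
        intra = 0.5 * (intra_disagreement(obs1) + intra_disagreement(obs2))
        inter = inter_disagreement(obs1, obs2)
        return intra / inter if inter > 0 else float("nan")

    # Hypothetical toy data: 4 subjects, 2 replicate readings per observer.
    obs1 = np.array([[1, 1], [0, 0], [1, 0], [0, 0]])
    obs2 = np.array([[1, 1], [0, 1], [1, 1], [0, 0]])
    print(round(cia(obs1, obs2), 3))  # a ratio near 1 suggests good individual agreement

A ratio near 1 says the two observers disagree about as often as replicates of the same observer do, which is the criterion for satisfactory agreement described in the abstract; in the article itself the CIAs are instead estimated through a generalized linear mixed model, which also lets the prevalence and the magnitude of agreement vary across conditions.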

Suggested Citation

  • Gao, Jingjing & Pan, Yi & Haber, Michael, 2012. "Assessment of observer agreement for matched repeated binary measurements," Computational Statistics & Data Analysis, Elsevier, vol. 56(5), pages 1052-1060.
  • Handle: RePEc:eee:csdana:v:56:y:2012:i:5:p:1052-1060
    DOI: 10.1016/j.csda.2011.11.005

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0167947311004026
    Download Restriction: Full text for ScienceDirect subscribers only.

    File URL: https://libkey.io/10.1016/j.csda.2011.11.005?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    As access to this document is restricted, you may want to look for a different version of it.

    References listed on IDEAS

    1. Huiman X. Barnhart & Michael Haber & Jingli Song, 2002. "Overall Concordance Correlation Coefficient for Evaluating Agreement Among Multiple Observers," Biometrics, The International Biometric Society, vol. 58(4), pages 1020-1027, December.
    2. Helena Kraemer, 1979. "Ramifications of a population model for κ as a coefficient of reliability," Psychometrika, Springer;The Psychometric Society, vol. 44(4), pages 461-472, December.
    3. Josep L. Carrasco, 2010. "A Generalized Concordance Correlation Coefficient Based on the Variance Components Generalized Linear Mixed Models for Overdispersed Count Data," Biometrics, The International Biometric Society, vol. 66(3), pages 897-904, September.
    4. Lin L. & Hedayat A. S. & Sinha B. & Yang M., 2002. "Statistical Methods in Assessing Agreement: Models, Issues, and Tools," Journal of the American Statistical Association, American Statistical Association, vol. 97, pages 257-270, March.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Tsai, Miao-Yu, 2015. "Comparison of concordance correlation coefficient via variance components, generalized estimating equations and weighted approaches with model selection," Computational Statistics & Data Analysis, Elsevier, vol. 82(C), pages 47-58.
    2. Chen, Chia-Cheng & Barnhart, Huiman X., 2008. "Comparison of ICC and CCC for assessing agreement for data without and with replications," Computational Statistics & Data Analysis, Elsevier, vol. 53(2), pages 554-564, December.
    3. Tsai, Miao-Yu & Lin, Chao-Chun, 2018. "Concordance correlation coefficients estimated by variance components for longitudinal normal and Poisson data," Computational Statistics & Data Analysis, Elsevier, vol. 121(C), pages 57-70.
    4. Admassu N. Lamu, 2020. "Does linear equating improve prediction in mapping? Crosswalking MacNew onto EQ-5D-5L value sets," The European Journal of Health Economics, Springer;Deutsche Gesellschaft für Gesundheitsökonomie (DGGÖ), vol. 21(6), pages 903-915, August.
    5. Matthijs Warrens, 2010. "A Formal Proof of a Paradox Associated with Cohen’s Kappa," Journal of Classification, Springer;The Classification Society, vol. 27(3), pages 322-332, November.
    6. Jose M. Jimenez-Olmedo & Alfonso Penichet-Tomas & Basilio Pueo & Lamberto Villalon-Gasch, 2023. "Reliability of ADR Jumping Photocell: Comparison of Beam Cut at Forefoot and Midfoot," IJERPH, MDPI, vol. 20(11), pages 1-13, May.
    7. John J. Chen & Guangxiang Zhang & Chen Ji & George F. Steinhardt, 2011. "Simple moment-based inferences of generalized concordance correlation," Journal of Applied Statistics, Taylor & Francis Journals, vol. 38(9), pages 1867-1882, October.
    8. Yan Ma & Wan Tang & Changyong Feng & Xin M. Tu, 2008. "Inference for Kappas for Longitudinal Study Data: Applications to Sexual Health Research," Biometrics, The International Biometric Society, vol. 64(3), pages 781-789, September.
    9. repec:plo:pone00:0181918 is not listed on IDEAS
    10. Choudhary, Pankaj K., 2007. "Semiparametric regression for assessing agreement using tolerance bands," Computational Statistics & Data Analysis, Elsevier, vol. 51(12), pages 6229-6241, August.
    11. Matthijs J. Warrens, 2021. "Kappa coefficients for dichotomous-nominal classifications," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 15(1), pages 193-208, March.
    12. Helenowski Irene B & Vonesh Edward F & Demirtas Hakan & Rademaker Alfred W & Ananthanarayanan Vijayalakshmi & Gann Peter H & Jovanovic Borko D, 2011. "Defining Reproducibility Statistics as a Function of the Spatial Covariance Structures in Biomarker Studies," The International Journal of Biostatistics, De Gruyter, vol. 7(1), pages 1-21, January.
    13. Choudhary Pankaj K, 2010. "A Unified Approach for Nonparametric Evaluation of Agreement in Method Comparison Studies," The International Journal of Biostatistics, De Gruyter, vol. 6(1), pages 1-26, June.
    14. Ying Guo & Amita K. Manatunga, 2007. "Nonparametric Estimation of the Concordance Correlation Coefficient under Univariate Censoring," Biometrics, The International Biometric Society, vol. 63(1), pages 164-172, March.
    15. Tarald O. Kvålseth, 2018. "An Alternative Interpretation of the Linearly Weighted Kappa Coefficients for Ordinal Data," Psychometrika, Springer;The Psychometric Society, vol. 83(3), pages 618-627, September.
    16. Candelaria de la Merced Díaz‐González & Milagros de la Rosa‐Hormiga & Josefa M. Ramal‐López & Juan José González‐Henríquez & María Sandra Marrero‐Morales, 2018. "Factors which influence concordance among measurements obtained by different pulse oximeters currently used in some clinical situations," Journal of Clinical Nursing, John Wiley & Sons, vol. 27(3-4), pages 677-683, February.
    17. Yang, Zhao & Zhou, Ming, 2015. "Kappa statistic for clustered physician–patients polytomous data," Computational Statistics & Data Analysis, Elsevier, vol. 87(C), pages 1-17.
    18. Liao Jason J. Z. & Capen Robert, 2011. "An Improved Bland-Altman Method for Concordance Assessment," The International Journal of Biostatistics, De Gruyter, vol. 7(1), pages 1-17, January.
    19. Jordan A. Carlson & Bo Liu & James F. Sallis & Jacqueline Kerr & J. Aaron Hipp & Vincent S. Staggs & Amy Papa & Kelsey Dean & Nuno M. Vasconcelos, 2017. "Automated Ecological Assessment of Physical Activity: Advancing Direct Observation," IJERPH, MDPI, vol. 14(12), pages 1-15, December.
    20. Hutson, Alan D., 2010. "A multi-rater nonparametric test of agreement and corresponding agreement plot," Computational Statistics & Data Analysis, Elsevier, vol. 54(1), pages 109-119, January.
    21. Matthijs Warrens, 2011. "Cohen’s Linearly Weighted Kappa is a Weighted Average of 2×2 Kappas," Psychometrika, Springer;The Psychometric Society, vol. 76(3), pages 471-486, July.

    More about this item

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:csdana:v:56:y:2012:i:5:p:1052-1060. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/csda.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.