
Psychometric properties of the item-reduced version of the comprehensive general parenting questionnaire for caregivers of preschoolers in a Finnish context

Author

Listed:
  • Carola Ray
  • Ester van der Borgh-Sleddens
  • Rejane Augusta de Oliveira Figueiredo
  • Jessica Gubbels
  • Mona Bjelland
  • Eva Roos

Abstract

Introduction: Many instruments for assessing general parenting have been reported as burdensome and are therefore seldom used in studies exploring children’s energy balance-related behaviors or weight. This study evaluates the factorial structure of the item-reduced version of the Comprehensive General Parenting Questionnaire (CGPQ), which assesses five constructs of general parenting.

Methods: The study uses data from two cross-sectional studies: Study 1 in 2014 (n = 173) and Study 2 in 2015–16 (n = 805). Parents of children aged three to six answered the CGPQ: the 69-item version in Study 1 and the 29-item version in Study 2. The reduction was based on the results of confirmatory factor analyses (CFA) in Study 1. In both datasets, internal consistency was tested with Cronbach’s alphas and intraclass correlations between the items of each construct. A combined assessment of CFA and item response theory evaluated construct validity and item importance for the 29-item version and for a further reduced 22-item version.

Results: In Study 1, the five constructs showed the highest Cronbach’s alphas in the 69-item version. Intraclass correlations between constructs were higher between the 69- and 29-item versions than between the 69- and 22-item versions. However, concordance between the constructs in the 29- and 22-item versions was high in both Study 1 and Study 2 (0.76–1.00). Goodness-of-fit testing of the CFA models showed that the 22-item model fulfilled all the criteria, indicating a better factorial structure than the 29-item model. Standardized estimates ranged from 0.20 to 0.76 in the 22-item version.

Conclusion: The reduced 29- and 22-item versions of the 69-item CGPQ showed good model fit, with the 22-item version the better of the two. These short versions can be used to assess general parenting without overburdening respondents.
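The internal-consistency check described in the Methods (Cronbach’s alpha per construct) follows the standard formula α = k/(k−1) · (1 − Σσ²_item / σ²_total). Below is a minimal sketch of that computation in Python; the CSV file name, the DataFrame, and the item column names are illustrative assumptions only, not the paper’s actual data or item labels, and the authors’ own IRT analysis relied on other tooling (the mirt R package cited in the references).

    import numpy as np
    import pandas as pd

    def cronbach_alpha(items: pd.DataFrame) -> float:
        # Cronbach's alpha for one construct: columns = items, rows = respondents.
        items = items.dropna()
        k = items.shape[1]                               # number of items in the construct
        item_variances = items.var(axis=0, ddof=1)       # variance of each item
        total_variance = items.sum(axis=1).var(ddof=1)   # variance of the summed scale score
        return (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)

    # Illustrative usage only: file and column names are hypothetical placeholders.
    # df = pd.read_csv("cgpq_responses.csv")
    # nurturance_items = ["nurt_1", "nurt_2", "nurt_3", "nurt_4"]
    # print(round(cronbach_alpha(df[nurturance_items]), 2))

Alphas near or above 0.70 are conventionally read as acceptable internal consistency, which is the kind of benchmark underlying the comparison of the 69-, 29-, and 22-item versions.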

Suggested Citation

  • Carola Ray & Ester van der Borgh-Sleddens & Rejane Augusta de Oliveira Figueiredo & Jessica Gubbels & Mona Bjelland & Eva Roos, 2022. "Psychometric properties of the item-reduced version of the comprehensive general parenting questionnaire for caregivers of preschoolers in a Finnish context," PLOS ONE, Public Library of Science, vol. 17(8), pages 1-15, August.
  • Handle: RePEc:plo:pone00:0270869
    DOI: 10.1371/journal.pone.0270869

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0270869
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0270869&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0270869?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Chalmers, R. Philip, 2012. "mirt: A Multidimensional Item Response Theory Package for the R Environment," Journal of Statistical Software, Foundation for Open Access Statistics, vol. 48(i06).
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Luo, Nanyu & Ji, Feng & Han, Yuting & He, Jinbo & Zhang, Xiaoya, 2024. "Fitting item response theory models using deep learning computational frameworks," OSF Preprints tjxab, Center for Open Science.
    2. Hoben, Matthias & Kilmen, Sevilay & Keefe, Janice & O'Rourke, Hannah M. & Banerjee, Sube & Estabrooks, Carole A., 2025. "Measurement invariance and differential item functioning of a care staff proxy measure of nursing home resident dementia-specific quality of life (DEMQOL-CH): do care aides' first language, and care a," Social Science & Medicine, Elsevier, vol. 375(C).
    3. Melissa Gladstone & Gillian Lancaster & Gareth McCray & Vanessa Cavallera & Claudia R. L. Alves & Limbika Maliwichi & Muneera A. Rasheed & Tarun Dua & Magdalena Janus & Patricia Kariger, 2021. "Validation of the Infant and Young Child Development (IYCD) Indicators in Three Countries: Brazil, Malawi and Pakistan," IJERPH, MDPI, vol. 18(11), pages 1-19, June.
    4. Björn Andersson & Tao Xin, 2021. "Estimation of Latent Regression Item Response Theory Models Using a Second-Order Laplace Approximation," Journal of Educational and Behavioral Statistics, , vol. 46(2), pages 244-265, April.
    5. Victoria T. Tanaka & George Engelhard & Matthew P. Rabbitt, 2020. "Using a Bifactor Model to Measure Food Insecurity in Households with Children," Journal of Family and Economic Issues, Springer, vol. 41(3), pages 492-504, September.
    6. Klaas Sijtsma & Jules L. Ellis & Denny Borsboom, 2024. "Recognize the Value of the Sum Score, Psychometrics’ Greatest Accomplishment," Psychometrika, Springer;The Psychometric Society, vol. 89(1), pages 84-117, March.
    7. Çetin Toraman & Güneş Korkmaz, 2023. "What is the “Meaning of School” to High School Students? A Scale Development and Implementation Study Based on IRT and CTT," SAGE Open, , vol. 13(3), pages 21582440231, September.
    8. Yikun Luo & Qipeng Chen & Jianyong Chen & Peida Zhan, 2024. "Development and validation of two shortened anxiety sensitive index-3 scales based on item response theory," Palgrave Communications, Palgrave Macmillan, vol. 11(1), pages 1-7, December.
    9. Cervantes, Víctor H., 2017. "DFIT: An R Package for Raju's Differential Functioning of Items and Tests Framework," Journal of Statistical Software, Foundation for Open Access Statistics, vol. 76(i05).
    10. Elina Tsigeman & Sebastian Silas & Klaus Frieler & Maxim Likhanov & Rebecca Gelding & Yulia Kovas & Daniel Müllensiefen, 2022. "The Jack and Jill Adaptive Working Memory Task: Construction, Calibration and Validation," PLOS ONE, Public Library of Science, vol. 17(1), pages 1-29, January.
    11. Joshua B. Gilbert & James S. Kim & Luke W. Miratrix, 2023. "Modeling Item-Level Heterogeneous Treatment Effects With the Explanatory Item Response Model: Leveraging Large-Scale Online Assessments to Pinpoint the Impact of Educational Interventions," Journal of Educational and Behavioral Statistics, , vol. 48(6), pages 889-913, December.
    12. Bing Li & Cody Ding & Huiying Shi & Fenghui Fan & Liya Guo, 2023. "Assessment of Psychological Zone of Optimal Performance among Professional Athletes: EGA and Item Response Theory Analysis," Sustainability, MDPI, vol. 15(10), pages 1-15, May.
    13. Joakim Wallmark & James O. Ramsay & Juan Li & Marie Wiberg, 2024. "Analyzing Polytomous Test Data: A Comparison Between an Information-Based IRT Model and the Generalized Partial Credit Model," Journal of Educational and Behavioral Statistics, , vol. 49(5), pages 753-779, October.
    14. Ick Hoon Jin & Minjeong Jeon, 2019. "A Doubly Latent Space Joint Model for Local Item and Person Dependence in the Analysis of Item Response Data," Psychometrika, Springer;The Psychometric Society, vol. 84(1), pages 236-260, March.
    15. Michela Gnaldi & Silvia Bacci & Thiemo Kunze & Samuel Greiff, 2020. "Students’ Complex Problem Solving Profiles," Psychometrika, Springer;The Psychometric Society, vol. 85(2), pages 469-501, June.
    16. Alessandro Chiarotto & Annette Bishop & Nadine E Foster & Kirsty Duncan & Ebenezer Afolabi & Raymond W Ostelo & Muirne C S Paap, 2018. "Item response theory evaluation of the biomedical scale of the Pain Attitudes and Beliefs Scale," PLOS ONE, Public Library of Science, vol. 13(9), pages 1-17, September.
    17. Alexander Robitzsch, 2021. "A Comprehensive Simulation Study of Estimation Methods for the Rasch Model," Stats, MDPI, vol. 4(4), pages 1-23, October.
    18. repec:osf:osfxxx:493x7_v3 is not listed on IDEAS
    19. Harald Hruschka, 2021. "Comparing unsupervised probabilistic machine learning methods for market basket analysis," Review of Managerial Science, Springer, vol. 15(2), pages 497-527, February.
    20. Sanny D Afable & Grace T Cruz & Yasuhiko Saito, 2023. "Sex differences in the psychometric properties of the Center for Epidemiological Studies–Depression (CES-D) Scale in older Filipinos," PLOS ONE, Public Library of Science, vol. 18(6), pages 1-14, June.
    21. Harold Doran, 2023. "A Collection of Numerical Recipes Useful for Building Scalable Psychometric Applications," Journal of Educational and Behavioral Statistics, , vol. 48(1), pages 37-69, February.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0270869. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.