
Critical appraisal of methodological quality and completeness of reporting in Chinese social science systematic reviews with meta‐analysis: A systematic review

Authors

  • Liping Guo
  • Sarah Miller
  • Wenjie Zhou
  • Zhipeng Wei
  • Junjie Ren
  • Xinyu Huang
  • Xin Xing
  • Howard White
  • Kehu Yang

Abstract

Background: A systematic review is a type of literature review that uses rigorous methods to synthesize evidence from multiple studies on a specific topic. It is widely used across academia, including medical and social science research. Social science is the group of academic disciplines concerned with human behaviour and society. However, there is no consensus on standards and criteria for conducting and reporting systematic reviews in social science, and previous studies have found that their quality varies by topic, database, and country.

Objectives: This study evaluates the completeness of reporting and the methodological quality of intervention and non‐intervention systematic reviews in social science in China, and explores factors that may influence quality.

Search Methods: We searched three major Chinese electronic databases (CNKI, VIP, and Wanfang) for intervention and non‐intervention reviews in social science published in Chinese journals from 1 January 2009 to 2 December 2022.

Selection Criteria: We included intervention and non‐intervention reviews but excluded overviews, qualitative syntheses, integrative reviews, rapid reviews, and evidence syntheses/summaries. We also excluded meta‐analyses that used advanced methods (e.g., cross‐sectional, cumulative, Bayesian, structural equation, or network meta‐analyses) or that focused on instrument validation.

Data Collection and Analysis: We extracted data using a coding form covering publication information and study content characteristics. Four authors piloted the data extraction and quality assessment; formal extraction and assessment were then carried out by two groups of four authors each. PRISMA 2020 and MOOSE were used to evaluate the reporting completeness of intervention and non‐intervention reviews, respectively, and AMSTAR‐2 and DART were used to assess their methodological quality. We described the characteristics of the included reviews with frequencies and percentages, and we used SPSS (version 26.0) to conduct linear regression and ANOVA to explore factors that may influence both completeness of reporting and methodological quality.

Main Results: We included 1176 systematic reviews with meta‐analyses published in Chinese journals between 2009 and 2022. The top three fields of publication were psychology (417, 35.5%), education (388, 33.0%), and management science (264, 22.4%).

Of these, 432 were intervention reviews. Their overall completeness of reporting against PRISMA 2020 was 49.9%, and their compliance rate with the AMSTAR‐2 methodological items was 45.5%. Intervention reviews published in Chinese Social Science Citation Index (CSSCI) journals had lower reporting completeness than those published in non‐CSSCI journals (46.7% vs. 51.1%), and likewise lower methodological quality (39.6% vs. 47.9%). Few reviews reported details of registration (0.2%), the rationale for study selection criteria (1.6%), sources of funding for primary studies (0.2%), reporting bias assessment (2.8%), or certainty of evidence assessment (1.2%), and only 107 (24.8%) reported a sensitivity analysis.

The remaining 744 were non‐intervention reviews. Their overall completeness of reporting against MOOSE was 51.8%, and their compliance rate with the DART methodological items was 50.5%. Non‐intervention reviews published in CSSCI journals had higher reporting completeness than those published in non‐CSSCI journals (53.3% vs. 50.3%); however, there was no difference in methodological quality (51.0% vs. 50.0%). Most reviews did not report the process and results of study selection (80.8%), 58.9% did not describe the data extraction process, only 9.5% assessed the quality of included studies, and none examined bias from confounding, outcome reporting bias, or loss to follow‐up.

An improving trend over time was observed for both intervention and non‐intervention reviews in completeness of reporting and methodological quality (PRISMA: β = 0.24, p
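As an illustration of the analysis described above, the sketch below is a minimal Python analogue of the workflow (the authors report using SPSS 26.0, not Python): it scores each review's reporting completeness as the share of checklist items reported, then fits a linear time trend and a one‐way ANOVA across publication fields. All data, the field groupings, and the variable names are simulated or assumed for demonstration; only the 27‐item PRISMA 2020 checklist length reflects the real instrument.

```python
# Illustrative sketch only -- simulated data, not the authors' dataset or SPSS code.
import numpy as np
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(42)
n_reviews, n_items = 300, 27          # 27 PRISMA 2020 checklist items

# Simulated coding form: 1 = item reported, 0 = not reported.
items = rng.integers(0, 2, size=(n_reviews, n_items))
completeness = items.mean(axis=1)     # per-review share of items reported

years = rng.integers(2009, 2023, size=n_reviews).astype(float)
fields = rng.choice(["psychology", "education", "management"], size=n_reviews)

# Linear regression of completeness on publication year (the "improving trend").
X = sm.add_constant(years)
trend = sm.OLS(completeness, X).fit()
print(f"slope per year = {trend.params[1]:.4f}, p = {trend.pvalues[1]:.4f}")

# One-way ANOVA: does completeness differ across publication fields?
f_stat, p_val = stats.f_oneway(*[completeness[fields == g] for g in np.unique(fields)])
print(f"ANOVA across fields: F = {f_stat:.2f}, p = {p_val:.4f}")
```

The overall figures quoted in the abstract (e.g., 49.9% completeness against PRISMA 2020) correspond to averaging such per‐review shares across all included reviews; the β reported for the time trend comes from the authors' own regression, which this simulated sketch does not attempt to reproduce.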

Suggested Citation

  • Liping Guo & Sarah Miller & Wenjie Zhou & Zhipeng Wei & Junjie Ren & Xinyu Huang & Xin Xing & Howard White & Kehu Yang, 2025. "Critical appraisal of methodological quality and completeness of reporting in Chinese social science systematic reviews with meta‐analysis: A systematic review," Campbell Systematic Reviews, John Wiley & Sons, vol. 21(1), March.
  • Handle: RePEc:wly:camsys:v:21:y:2025:i:1:n:e70014
    DOI: 10.1002/cl2.70014

    Download full text from publisher

    File URL: https://doi.org/10.1002/cl2.70014
    Download Restriction: no



    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Constanza Gonzalez Parrao & Marta Moratti & Shannon Shisler & Birte Snilstveit & John Eyers, 2021. "PROTOCOL: Aquaculture for improving productivity, income, nutrition and women's empowerment in low‐ and middle‐income countries: A systematic review and meta‐analysis," Campbell Systematic Reviews, John Wiley & Sons, vol. 17(3), September.
    2. Bei Pan & Long Ge & Xiaoman Wang & Ning Ma & Zhipeng Wei & Lai Honghao & Liangying Hou & Kehu Yang, 2024. "Assessment of publication time in Campbell Systematic Reviews: A cross‐sectional survey," Campbell Systematic Reviews, John Wiley & Sons, vol. 20(4), December.
    3. Laura Cruz-Castro & Clara Casado & Luis Sanz-Menéndez, 2025. "Merit, competition and gender: scientific promotion in public research organisations," Palgrave Communications, Palgrave Macmillan, vol. 12(1), pages 1-16, December.
    4. Liping Guo & Wenjie Zhou & Xin Xing & Zhipeng Wei & Minyan Yang & Mina Ma & Kehu Yang & Howard White, 2022. "PROTOCOL: Critical appraisal of methodological quality and reporting items of systematic reviews with meta‐analysis in evidence‐based social science in China: A systematic review," Campbell Systematic Reviews, John Wiley & Sons, vol. 18(4), December.
    5. Bei Pan & Long Ge & Zhipeng Wei & Liangying Hou & Honghao Lai & Kehu Yang, 2023. "PROTOCOL: Assessment of publication time in Campbell systematic reviews: A cross‐sectional survey," Campbell Systematic Reviews, John Wiley & Sons, vol. 19(1), March.
    6. Sarah Young & Alison Bethel & Ciara Keenan & Kate Ghezzi‐Kopel & Elizabeth Moreton & David Pickup & Zahra A. Premji & Morwenna Rogers & Bjørn C. A. Viinholt, 2021. "PROTOCOL: Searching and reporting in Campbell Collaboration systematic reviews: An assessment of current methods," Campbell Systematic Reviews, John Wiley & Sons, vol. 17(4), December.
    7. Yanfei Li & Omar Dewidar & Xiaoqin Wang & Elizabeth Ghogomu & Arpana Wadhwani & Ke Guo & Mina Ma & Victoria Barbeau & Bei Pan & Leenah Abdelrazeq & Zijun Li & Amjad Alghamyan & Liping Guo & Fatima Jah, 2023. "Methodological quality of Campbell Systematic Reviews has improved over the past decade," Campbell Systematic Reviews, John Wiley & Sons, vol. 19(4), December.
    8. Ariel M. Aloe & Ruth Garside, 2021. "Editorial: Types of methods research papers in the journal Campbell Systematic Reviews," Campbell Systematic Reviews, John Wiley & Sons, vol. 17(2), June.
    9. Sarah Young & Heather MacDonald & Diana Louden & Ursula M. Ellis & Zahra Premji & Morwenna Rogers & Alison Bethel & David Pickup, 2024. "Searching and reporting in Campbell Collaboration systematic reviews: A systematic assessment of current methods," Campbell Systematic Reviews, John Wiley & Sons, vol. 20(3), September.
    10. Vivian A. Welch, 2021. "Campbell Collaboration: Reflection on growth and cultivation from 2017 to 2021," Campbell Systematic Reviews, John Wiley & Sons, vol. 17(4), December.
    11. Natalie Rebelo Da Silva & Hazel Zaranyika & Laurenz Langer & Nicola Randall & Evans Muchiri & Ruth Stewart, 2017. "Making the Most of What We Already Know," Evaluation Review, vol. 41(2), pages 155-172, April.
    12. Du, Anbang & Head, Michael & Brede, Markus, 2025. "Integration vs segregation: Network analysis of interdisciplinarity in funded and unfunded research on infectious diseases," Journal of Informetrics, Elsevier, vol. 19(1).
    13. Sara Stevano & Suneetha Kadiyala & Deborah Johnston & Hazel Malapit & Elizabeth Hull & Sofia Kalamatianou, 2019. "Time-Use Analytics: An Improved Way of Understanding Gendered Agriculture-Nutrition Pathways," Feminist Economics, Taylor & Francis Journals, vol. 25(3), pages 1-22, July.
