
Reported credibility techniques in higher education evaluation studies that use qualitative methods: A research synthesis

Author

Listed:
  • Liao, Hongjing
  • Hitchcock, John

Abstract

This synthesis study examined the reported use of credibility techniques in higher education evaluation articles that use qualitative methods. The sample included 118 articles published in six leading higher education evaluation journals from 2003 to 2012. Mixed methods approaches were used to identify key credibility techniques reported across the articles, document the frequency of these techniques, and describe their use and properties. Two broad sets of techniques were of interest: primary (i.e., basic) design techniques, such as sampling/participant recruitment strategies, data collection methods, and analytic details; and additional qualitative credibility techniques (e.g., member checking, negative case analysis, peer debriefing). The majority of evaluation articles reported the use of primary techniques, although there was wide variation in the amount of supporting detail; most of the articles did not describe the use of additional credibility techniques. This suggests that editors of evaluation journals should encourage the reporting of qualitative design details and that authors should develop strategies yielding fuller methodological description.
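The frequency-documenting step described in the abstract can be pictured with a small sketch. The following Python snippet is purely illustrative and is not drawn from the article: the article identifiers, technique labels, and record structure are assumptions, not the authors' actual coding scheme.

    from collections import Counter

    # Hypothetical coding records: each article is tagged with the credibility
    # techniques its methods section reports (identifiers and labels are
    # illustrative only, not the authors' data).
    coded_articles = [
        {"id": "A001", "techniques": ["purposive sampling", "member checking"]},
        {"id": "A002", "techniques": ["interview protocol detail", "peer debriefing"]},
        {"id": "A003", "techniques": ["purposive sampling"]},
    ]

    # Tally how often each reported technique appears across the sample.
    technique_counts = Counter(
        technique
        for article in coded_articles
        for technique in article["techniques"]
    )

    # Report each technique's count and share of the sample.
    for technique, count in technique_counts.most_common():
        share = count / len(coded_articles)
        print(f"{technique}: reported in {count} of {len(coded_articles)} articles ({share:.0%})")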

Suggested Citation

  • Liao, Hongjing & Hitchcock, John, 2018. "Reported credibility techniques in higher education evaluation studies that use qualitative methods: A research synthesis," Evaluation and Program Planning, Elsevier, vol. 68(C), pages 157-165.
  • Handle: RePEc:eee:epplan:v:68:y:2018:i:c:p:157-165
    DOI: 10.1016/j.evalprogplan.2018.03.005

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0149718917302410
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.evalprogplan.2018.03.005?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Michèle Lamont & Grégoire Mallard & Joshua Guetzkow, 2006. "Beyond blind faith: overcoming the obstacles to interdisciplinary evaluation," Research Evaluation, Oxford University Press, vol. 15(1), pages 43-55, April.
    2. Ulrich Schmoch & Torben Schubert & Dorothea Jansen & Richard Heidler & Regina von Görtz, 2010. "How to use indicators to measure scientific performance: a balanced approach," Research Evaluation, Oxford University Press, vol. 19(1), pages 2-18, March.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Yang Yang & Liran Ma, 2025. "Artificial intelligence in qualitative analysis: a practical guide and reflections based on results from using GPT to analyze interview data in a substance use program," Quality & Quantity: International Journal of Methodology, Springer, vol. 59(3), pages 2511-2534, June.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Francesco Giovanni Avallone & Alberto Quagli & Paola Ramassa, 2022. "Interdisciplinary research by accounting scholars: An exploratory study," FINANCIAL REPORTING, FrancoAngeli Editore, vol. 2022(2), pages 5-34.
    2. Giulio Giacomo Cantone, 2024. "How to measure interdisciplinary research? A systemic design for the model of measurement," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(8), pages 4937-4982, August.
    3. Antonio Fernández-Cano & Manuel Torralbo & Mónica Vallejo, 2012. "Time series of scientific growth in Spanish doctoral theses (1848–2009)," Scientometrics, Springer;Akadémiai Kiadó, vol. 91(1), pages 15-36, April.
    4. repec:oup:rseval:v:32:y:2024:i:2:p:213-227 is not listed on IDEAS.
    5. Gibson, Elizabeth & Daim, Tugrul U. & Dabic, Marina, 2019. "Evaluating university industry collaborative research centers," Technological Forecasting and Social Change, Elsevier, vol. 146(C), pages 181-202.
    6. Torben Schubert & Henning Kroll, 2016. "Universities’ effects on regional GDP and unemployment: The case of Germany," Papers in Regional Science, Wiley Blackwell, vol. 95(3), pages 467-489, August.
    7. Schubert , Torben, 2013. "Are there Scale Economies in Scientific Production? On the Topic of Locally Increasing Returns to Scale," Papers in Innovation Studies 2013/43, Lund University, CIRCLE - Centre for Innovation Research.
    8. Torben Schubert, 2014. "Are there scale economies in scientific production? On the topic of locally increasing returns to scale," Scientometrics, Springer;Akadémiai Kiadó, vol. 99(2), pages 393-408, May.
    9. Corey J A Bradshaw & Justin M Chalker & Stefani A Crabtree & Bart A Eijkelkamp & John A Long & Justine R Smith & Kate Trinajstic & Vera Weisbecker, 2021. "A fairer way to compare researchers at any career stage and in any discipline using open-access citation data," PLOS ONE, Public Library of Science, vol. 16(9), pages 1-15, September.
    10. Akshaya Kumar Biswal, 2013. "An Absolute Index (Ab-index) to Measure a Researcher’s Useful Contributions and Productivity," PLOS ONE, Public Library of Science, vol. 8(12), pages 1-10, December.
    11. Oviedo-García, M. Ángeles, 2016. "Tourism research quality: Reviewing and assessing interdisciplinarity," Tourism Management, Elsevier, vol. 52(C), pages 586-592.
    12. Tasso Brandt & Torben Schubert, 2014. "Is the university model an organizational necessity? Scale and agglomeration effects in science," Chapters, in: Andrea Bonaccorsi (ed.), Knowledge, Diversity and Performance in European Higher Education, chapter 8, pages iii-iii, Edward Elgar Publishing.
    13. Franc Mali, 2013. "Why an Unbiased External R&D Evaluation System is Important for the Progress of Social Sciences—the Case of a Small Social Science Community," Social Sciences, MDPI, vol. 2(4), pages 1-14, December.
    14. Pimentel, Erica & Cho, Charles & Bothello, Joel, 2022. "The blind spots of interdisciplinarity in addressing grand challenges," MPRA Paper 114562, University Library of Munich, Germany.
    15. Charlotte Rungius & Tim Flink, 2020. "Romancing science for global solutions: on narratives and interpretative schemas of science diplomacy," Humanities and Social Sciences Communications, Palgrave Macmillan, vol. 7(1), pages 1-10, December.
    16. Julian Hamann & Frerk Blome & Anna Kosmützky, 2022. "Devices of evaluation: Institutionalization and impact—Introduction to the special issue," Research Evaluation, Oxford University Press, vol. 31(4), pages 423-428.
    17. Nabil Amara & Réjean Landry, 2012. "Counting citations in the field of business and management: why use Google Scholar rather than the Web of Science," Scientometrics, Springer;Akadémiai Kiadó, vol. 93(3), pages 553-581, December.
    18. Tasso Brandt & Torben Schubert, 2013. "Is the university model an organizational necessity? Scale and agglomeration effects in science," Scientometrics, Springer;Akadémiai Kiadó, vol. 94(2), pages 541-565, February.
    19. Zeug, Walther & Bezama, Alberto & Thrän, Daniela, 2020. "Towards a holistic and integrated Life Cycle Sustainability Assessment of the bioeconomy: Background on concepts, visions and measurements," UFZ Discussion Papers 7/2020, Helmholtz Centre for Environmental Research (UFZ), Division of Social Sciences (ÖKUS).
    20. Lo, Jade Y. & Li, Haiyang, 2018. "In the eyes of the beholder: The effect of participant diversity on perceived merits of collaborative innovations," Research Policy, Elsevier, vol. 47(7), pages 1229-1242.
    21. Biancani, Susan & Dahlander, Linus & McFarland, Daniel A. & Smith, Sanne, 2018. "Superstars in the making? The broad effects of interdisciplinary centers," Research Policy, Elsevier, vol. 47(3), pages 543-557.

    More about this item


    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:epplan:v:68:y:2018:i:c:p:157-165. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/evalprogplan.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.