
Fact-checking in the age of AI: Reducing biases with non-human information sources

Author

Listed:
  • Moon, Won-Ki
  • Kahlor, Lee Ann

Abstract

This study examines the obstacles to the effectiveness of fact-checking, focusing primarily on the pervasive impact of entrenched biases. Fact-checking efforts often face resistance when linked to mistrusted sources, leading to cognitive dissonance and the rejection of messages in favor of pre-existing beliefs, a phenomenon known as motivated reasoning. This resistance hinders organizations’ ability to correct misconceptions surrounding social issues and entities. The research delves into whether non-human entities such as AI can facilitate less biased information processing due to their perceived impartiality. Applying a moderated mediation model in experimental settings, we found that labeling a source as artificial intelligence plays a pivotal role in how fact-checks are evaluated. Compared with a human source, AI labels moderate the impact of partisan biases on the persuasive outcomes of fact-checks, such as message credibility and acceptance. This study offers valuable insights for enhancing the effectiveness of fact-checking in the context of cognitive and psychological biases by highlighting the critical influence of information sources in reducing polarization in public perceptions of scientific issues.
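The "moderated mediation model" mentioned in the abstract can be made concrete with a small sketch. The code below is a hypothetical illustration, not the authors' analysis: the variable names (partisan_bias, ai_label, credibility, acceptance) and the simulated data are assumptions, and the regressions simply show how an AI-label moderator of the bias-to-credibility path, carried through to message acceptance, could be estimated.

    # Hypothetical sketch of a moderated mediation analysis (not the authors' code).
    # X = partisan_bias, W = ai_label (1 = source labeled as AI),
    # M = credibility (mediator), Y = acceptance (outcome).
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 500
    df = pd.DataFrame({
        "partisan_bias": rng.normal(size=n),
        "ai_label": rng.integers(0, 2, size=n),
    })
    # Simulated mediator and outcome, purely for demonstration
    df["credibility"] = (-0.5 * df["partisan_bias"]
                         + 0.4 * df["partisan_bias"] * df["ai_label"]
                         + rng.normal(size=n))
    df["acceptance"] = 0.6 * df["credibility"] + rng.normal(size=n)

    # Mediator model: does the AI label weaken the effect of bias on credibility?
    med = smf.ols("credibility ~ partisan_bias * ai_label", data=df).fit()
    # Outcome model: credibility carries the effect through to acceptance
    out = smf.ols("acceptance ~ credibility + partisan_bias * ai_label", data=df).fit()

    # Index of moderated mediation: the interaction effect on the mediator
    # multiplied by the mediator's effect on the outcome
    index = med.params["partisan_bias:ai_label"] * out.params["credibility"]
    print("Index of moderated mediation:", round(index, 3))

In practice the index would be assessed with bootstrapped confidence intervals (PROCESS-style resampling); the point here is only to show where an AI-label moderator would enter the two models.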

Suggested Citation

  • Moon, Won-Ki & Kahlor, Lee Ann, 2025. "Fact-checking in the age of AI: Reducing biases with non-human information sources," Technology in Society, Elsevier, vol. 80(C).
  • Handle: RePEc:eee:teinso:v:80:y:2025:i:c:s0160791x24003087
    DOI: 10.1016/j.techsoc.2024.102760

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0160791X24003087
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.techsoc.2024.102760?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Dan M. Kahan & Ellen Peters & Maggie Wittlin & Paul Slovic & Lisa Larrimore Ouellette & Donald Braman & Gregory Mandel, 2012. "The polarizing impact of science literacy and numeracy on perceived climate change risks," Nature Climate Change, Nature, vol. 2(10), pages 732-735, October.
    2. Nicole M. Krause & Isabelle Freiling & Becca Beets & Dominique Brossard, 2020. "Fact-checking as risk communication: the multi-layered risk of misinformation in times of COVID-19," Journal of Risk Research, Taylor & Francis Journals, vol. 23(7-8), pages 1052-1059, August.
    3. Charles S. Taber & Milton Lodge, 2006. "Motivated Skepticism in the Evaluation of Political Beliefs," American Journal of Political Science, John Wiley & Sons, vol. 50(3), pages 755-769, July.
    4. Dietram A. Scheufele & Nicole M. Krause, 2019. "Science audiences, misinformation, and fake news," Proceedings of the National Academy of Sciences, Proceedings of the National Academy of Sciences, vol. 116(16), pages 7662-7669, April.
    5. Richard Van Noorden & Jeffrey M. Perkel, 2023. "AI and science: what 1,600 researchers think," Nature, Nature, vol. 621(7980), pages 672-675, September.
    6. Shanto Iyengar & Douglas S. Massey, 2019. "Scientific communication in a post-truth society," Proceedings of the National Academy of Sciences, Proceedings of the National Academy of Sciences, vol. 116(16), pages 7656-7661, April.
    7. Shanto Iyengar & Sean J. Westwood, 2015. "Fear and Loathing Across Party Lines: New Evidence on Group Polarization," American Journal of Political Science, John Wiley & Sons, vol. 59(3), pages 690-707, July.
    8. Katherine A. McComas & Craig W. Trumbo, 2001. "Source Credibility in Environmental Health–Risk Controversies: Application of Meyer's Credibility Index," Risk Analysis, John Wiley & Sons, vol. 21(3), pages 467-480, June.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Ya Yang & Lichao Xiu & Xuejiao Chen & Guoming Yu, 2023. "Do emotions conquer facts? A CCME model for the impact of emotional information on implicit attitudes in the post-truth era," Palgrave Communications, Palgrave Macmillan, vol. 10(1), pages 1-7, December.
    2. Bertoli, Paola & Grembi, Veronica & Morelli, Massimo & Rosso, Anna Cecilia, 2023. "In medio stat virtus? Effective communication and preferences for redistribution in hard times," Journal of Economic Behavior & Organization, Elsevier, vol. 214(C), pages 105-147.
    3. Soojong Kim, 2019. "Directionality of information flow and echoes without chambers," PLOS ONE, Public Library of Science, vol. 14(5), pages 1-22, May.
    4. Carisa Bergner & Bruce A. Desmarais & John Hird, 2019. "Speaking truth in power: Scientific evidence as motivation for policy activism," Journal of Behavioral Public Administration, Center for Experimental and Behavioral Public Administration, vol. 2(1).
    5. Golman, Russell, 2023. "Acceptable discourse: Social norms of beliefs and opinions," European Economic Review, Elsevier, vol. 160(C).
    6. Dieter Dekeyser & Henk Roose, 2022. "Polarizing policy opinions with conflict framed information: activating negative views of political parties in a multi-party system," Quality & Quantity: International Journal of Methodology, Springer, vol. 56(3), pages 1121-1138, June.
    7. Ester Faia & Andreas Fuster & Vincenzo Pezone & Basit Zafar, 2024. "Biases in Information Selection and Processing: Survey Evidence from the Pandemic," The Review of Economics and Statistics, MIT Press, vol. 106(3), pages 829-847, May.
    8. Laura N. Rickard, 2021. "Pragmatic and (or) Constitutive? On the Foundations of Contemporary Risk Communication Research," Risk Analysis, John Wiley & Sons, vol. 41(3), pages 466-479, March.
    9. Toby Bolsen & James N. Druckman & Fay Lomax Cook, 2015. "Citizens’, Scientists’, and Policy Advisors’ Beliefs about Global Warming," The ANNALS of the American Academy of Political and Social Science, vol. 658(1), pages 271-295, March.
    10. Shuyuan Yu & John E Opfer, 2024. "Cognitive support for political partisans’ understanding of policy data," PLOS ONE, Public Library of Science, vol. 19(10), pages 1-23, October.
    11. Abhishek Samantray & Paolo Pin, 2019. "Credibility of climate change denial in social media," Palgrave Communications, Palgrave Macmillan, vol. 5(1), pages 1-8, December.
    12. Nicole M. Krause & Isabelle Freiling & Dietram A. Scheufele, 2022. "The “Infodemic” Infodemic: Toward a More Nuanced Understanding of Truth-Claims and the Need for (Not) Combatting Misinformation," The ANNALS of the American Academy of Political and Social Science, vol. 700(1), pages 112-123, March.
    13. Maciel, Marcelo V. & Martins, André C.R., 2020. "Ideologically motivated biases in a multiple issues opinion model," Physica A: Statistical Mechanics and its Applications, Elsevier, vol. 553(C).
    14. Michael Hannon, 2022. "Are knowledgeable voters better voters?," Politics, Philosophy & Economics, vol. 21(1), pages 29-54, February.
    15. Michael Nicholas Stagnaro & Eran Amsalem, 2025. "Factual knowledge can reduce attitude polarization," Nature Communications, Nature, vol. 16(1), pages 1-10, December.
    16. Erik C. Nisbet & Kathryn E. Cooper & R. Kelly Garrett, 2015. "The Partisan Brain," The ANNALS of the American Academy of Political and Social Science, vol. 658(1), pages 36-66, March.
    17. Mohamed Mostagir & James Siderius, 2022. "Learning in a Post-Truth World," Management Science, INFORMS, vol. 68(4), pages 2860-2868, April.
    18. Michael M. Lokshin & Michael Hannon & Miguel Purroy & Ivan Torre, 2024. "Do More Informed Citizens Make Better Climate Policy Decisions?," Policy Research Working Paper Series 10921, The World Bank.
    19. Lawrence C. Hamilton, 2018. "Self-assessed understanding of climate change," Climatic Change, Springer, vol. 151(2), pages 349-362, November.
    20. repec:cup:judgdm:v:8:y:2013:i:4:p:407-424 is not listed on IDEAS
    21. Shoots-Reinhard, Brittany & Goodwin, Raleigh & Bjälkebring, Pär & Markowitz, David M. & Silverstein, Michael C. & Peters, Ellen, 2021. "Ability-related political polarization in the COVID-19 pandemic," Intelligence, Elsevier, vol. 88(C).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:teinso:v:80:y:2025:i:c:s0160791x24003087. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: https://www.journals.elsevier.com/technology-in-society .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.