Author
Listed:
- Elina H. Hwang
(Michael G. Foster School of Business, University of Washington, Seattle, Washington 98195)
- Stephanie Lee
(Michael G. Foster School of Business, University of Washington, Seattle, Washington 98195)
Abstract
Fueled by social media, health misinformation is spreading rapidly across online platforms. Myths, rumors, and false information about vaccines are flourishing, and the aftermath can be disastrous. Even more concerning, people increasingly rely on social media to obtain healthcare information and tend to believe what they read there. Given the serious consequences of misinformation, this study explores the efficacy of a potential cure for the infodemic we face. Specifically, we focus on a countermeasure that Twitter used: nudging users toward credible information when they search for topics on which erroneous information is rampant. Twitter's policy is unique in that the intervention does not rely on censorship but instead redirects users away from false information and toward facts. Our analysis uses 1,468 news articles that contain misinformation about health topics such as measles, vaccines, and cancer. It reveals that Twitter's nudging policy reduces misinformation diffusion. After the policy's introduction, a news article containing misinformation is less likely to start a diffusion process on Twitter. In addition, tweets that link to misinformation articles are less likely to be retweeted, quoted, or replied to, which leads to a significant reduction in the aggregate number of tweets each misinformation article attracts. We further uncover that the observed reduction is driven by decreases both in original tweets (those that first introduce misinformation news articles to the Twitter platform) and in posts resharing the misinformation, although the reduction is more pronounced for resharing posts. Finally, we find that the effect is driven primarily by a decrease in human-like accounts sharing links to unverified claims rather than by a decrease in activity by bot-like accounts. Our findings suggest that a misinformation policy that relies on a nudge toward a credible source, rather than on censorship, can suppress misinformation diffusion.
Suggested Citation
Elina H. Hwang & Stephanie Lee, 2025.
"A Nudge to Credible Information as a Countermeasure to Misinformation: Evidence from Twitter,"
Information Systems Research, INFORMS, vol. 36(1), pages 621-636, March.
Handle:
RePEc:inm:orisre:v:36:y:2025:i:1:p:621-636
DOI: 10.1287/isre.2021.0491