Author
Listed:
- Bo Huang
(University of Nottingham Ningbo [China])
- Sandra Laporte
(TSM - Toulouse School of Management Research - UT Capitole - Université Toulouse Capitole - UT - Université de Toulouse - CNRS - Centre National de la Recherche Scientifique)
- Sylvain Sénécal
(HEC Montréal)
- Kamila Sobol
(John Molson School of Business - Concordia University [Montreal])
Abstract
The swift integration of artificial intelligence (AI)-driven tools across industries, such as virtual assistants, chatbots, and service robots, raises questions about how consumers react to these emerging technologies. To promote acceptance and enhance service interactions, companies frequently market these technologies by fostering parasocial and anthropomorphic relationships, with the roles of partner and servant among the most prevalent. Yet the precise influence these relationship roles have on consumer responses remains uncertain. While extant literature primarily shows a positive effect of treating AI as a partner, in the current research we find a multifaceted adverse effect of anthropomorphic partner (versus servant) relationships in the context of service failure. Across four studies, the results demonstrate that when consumers perceive an AI assistant as a relational partner, they are more inclined to attribute the failure to themselves because of elevated perceptions of self-expansion with the AI. Furthermore, within this relationship dynamic, users exhibit reduced intentions to use the AI agent again, as a result of a decreased sense of self-efficacy. Finally, the undesirable effects of a partner relationship following a service failure can be mitigated by drawing attention to the AI's learning capabilities. These findings highlight a potential caveat of an AI-as-partner relationship, thus advancing our understanding of consumer interaction with AI from a relational perspective.
Suggested Citation
Bo Huang & Sandra Laporte & Sylvain Sénécal & Kamila Sobol, 2025.
"Falling out with AI-buddies: The hidden costs of treating AI as a partner versus servant during service failure,"
Post-Print
hal-05228448, HAL.
Handle:
RePEc:hal:journl:hal-05228448
DOI: 10.1016/j.techfore.2025.124279
Download full text from publisher
To our knowledge, this item is not available for download. To find out whether it is available, there are three options:
1. Check below whether another version of this item is available online.
2. Check on the provider's web page whether it is in fact available.
3. Perform a search for a similarly titled item that would be available.
Corrections
All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:hal:journl:hal-05228448. See general information about how to correct material in RePEc.
If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.
We have no bibliographic references for this item. You can help add them by using this form.
If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.
For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: CCSD (email available below). General contact details of provider: https://hal.archives-ouvertes.fr/.
Please note that corrections may take a couple of weeks to filter through
the various RePEc services.