Author
Listed:
- Tontrup, Stephan
- Sprigman, Christopher Jon
Abstract
Our study examines how individuals perceive the moral agency of artificial intelligence (AI) and, specifically, whether individuals believe that by involving AI as their agent they can offload to the AI some of their responsibility for a morally sensitive decision. Existing literature shows that people often delegate self-interested decisions to human agents to mitigate their moral responsibility for unethical outcomes. This research explores whether individuals will similarly delegate such decisions to AI to reduce moral costs. Our study shows that many individuals perceive the AI as capable of assuming moral responsibility. These individuals delegate to the AI, and delegating leads them to act more assertively in their self-interest while experiencing lower moral costs. Participants (hereinafter, "Allocators") took part in a dictator game, allocating a $10 endowment between themselves and a Recipient. In the experimental treatment, Allocators could involve ChatGPT in their allocation decision at the cost of added time to complete the experiment. When engaged, the AI executed the transfer by informing the Recipient of a necessary payment code. Around 35% of Allocators chose to involve the AI despite the opportunity costs of a much-prolonged process. To isolate the effect of the AI's perceived responsibility, a control condition replaced the AI with a non-agentive computer program while maintaining identical decision protocols. This design controlled for factors such as social distance and substantive influence by the AI. Allocators who involved the AI transferred significantly less money to the Recipient, suggesting that delegating the transfer to the AI reduced the moral costs associated with self-interested decisions. Consistent with this interpretation, prosocial individuals, who face higher moral costs from violating a norm and would therefore, absent delegation, transfer more than proself individuals, were significantly more likely to involve the AI. A responsibility measure indicates that Allocators who attributed more responsibility for the transfer to the AI were also more likely to involve it. The study suggests that AI systems provide human actors with an easily accessible, low-cost, and hard-to-monitor means of offloading personal moral responsibility, highlighting the need for AI regulation to consider not only the inherent risks of AI output but also how AI's perceived moral agency can shape human behavior and ethical accountability in human-AI interaction.
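The selection logic in the abstract can be made explicit with a minimal moral-cost sketch. This formalization is our illustration, not a model taken from the paper; all notation (k for the dollars the Allocator keeps, \mu_i for an individual's moral-cost sensitivity, \delta for the responsibility discount from delegating, c for the time cost of involving the AI) is assumed.

\[ U_i(k) = k - \mu_i \,\max(k - 5,\; 0) \qquad \text{(deciding alone)} \]
\[ U_i^{\mathrm{AI}}(k) = k - (1 - \delta)\,\mu_i \,\max(k - 5,\; 0) - c \qquad \text{(delegating to the AI)} \]

Here 5 represents the equal-split norm for the $10 endowment. At any self-interested allocation k > 5, delegation saves moral cost \delta \mu_i (k - 5), which grows with \mu_i, so norm-sensitive (prosocial) Allocators have the most to gain from offloading responsibility, and delegation makes keeping more cheaper in moral terms. Under these assumptions the sketch reproduces both reported patterns: prosocial self-selection into delegation, and lower transfers once the AI is involved.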
Suggested Citation
Tontrup, Stephan & Sprigman, Christopher Jon, 2025. "Strategic Delegation of Moral Decisions to AI," EconStor Preprints 335206, ZBW - Leibniz Information Centre for Economics.
Handle: RePEc:zbw:esprep:335206
DOI: 10.2139/ssrn.5696827
More about this item
JEL classification:
- C91 - Mathematical and Quantitative Methods - - Design of Experiments - - - Laboratory, Individual Behavior
- D91 - Microeconomics - - Micro-Based Behavioral Economics - - - Role and Effects of Psychological, Emotional, Social, and Cognitive Factors on Decision Making
- O33 - Economic Development, Innovation, Technological Change, and Growth - - Innovation; Research and Development; Technological Change; Intellectual Property Rights - - - Technological Change: Choices and Consequences; Diffusion Processes
- D63 - Microeconomics - - Welfare Economics - - - Equity, Justice, Inequality, and Other Normative Criteria and Measurement