Authors
Listed:
- Byung-Jik Kim (University of Ulsan)
- Min-Jik Kim (Korea University of Technology and Education)
- Julak Lee (Chung-Ang University, Department of Industrial Security)
Abstract
Artificial intelligence (AI) is increasingly integrated into business practices, fundamentally altering workplace dynamics and employee experiences. While AI adoption brings numerous benefits, it also introduces negative aspects that may adversely affect employee well-being, including psychological distress and depression. Drawing on a range of theoretical perspectives, this study examines the association between organizational AI adoption and employee depression, investigating how psychological safety mediates this relationship and how ethical leadership moderates it. Using an online survey platform, we conducted a three-wave time-lagged study of 381 employees at South Korean companies. Data were analyzed using SPSS 28 for preliminary analyses and AMOS 28 for structural equation modeling with maximum likelihood estimation. The structural equation modeling analysis revealed that AI adoption has a significant negative effect on psychological safety, which in turn increases levels of depression. Bootstrapping analyses further confirmed that psychological safety mediates the relationship between AI adoption and employee depression. The study also found that ethical leadership can mitigate the adverse effect of AI adoption on psychological safety by moderating the relationship between these variables. These findings highlight the critical importance of fostering a psychologically safe work environment and promoting ethical leadership practices to protect employee well-being amid rapid technological change. Contributing to the growing body of literature on the psychological effects of AI adoption in the workplace, this research offers valuable insights for organizations seeking to address the human implications of AI integration. The paper discusses the practical and theoretical implications of the results and suggests directions for future research.
Suggested Citation
Byung-Jik Kim & Min-Jik Kim & Julak Lee, 2025.
"The dark side of artificial intelligence adoption: linking artificial intelligence adoption to employee depression via psychological safety and ethical leadership,"
Palgrave Communications, Palgrave Macmillan, vol. 12(1), pages 1-14, December.
Handle: RePEc:pal:palcom:v:12:y:2025:i:1:d:10.1057_s41599-025-05040-2
DOI: 10.1057/s41599-025-05040-2