
Effects of algorithmic control on power asymmetry and inequality within organizations

Authors

Listed:
  • Mehdi Barati (University at Albany-State University of New York)
  • Bahareh Ansari (University at Albany)

Abstract

Algorithmic control is expanding across domains, driven by advances in programming algorithms, continuous growth in hardware computing power, larger amounts of available fine-grained data, and the increasing number of organizations adopting remote work. Scholars and practitioners in human resource management posit that organizations’ adoption of algorithms as a substitute for or supplement to traditional rational control mechanisms to direct, discipline, and evaluate workers might increase the objectivity and transparency of worker-related decision-making processes and, therefore, reduce power asymmetry and inequality within organizations. This discussion commentary argues that the underlying assumptions of higher objectivity and transparency of algorithms in organizational control are very strong and are not supported by current evidence. There is also evidence of large variation in organizations’ adoption of algorithmic control, driven by differences in their technical, structural, and human capital resources, which further blurs the predicted outcomes. Evidence also exists of managers over-relying on algorithmic suggestions to circumvent accountability. The adoption of algorithmic control must therefore be undertaken with serious precautions. This article proposes that the overestimation of objectivity and transparency, together with the large variation in how organizations adopt algorithmic control (including the lack of technical and managerial knowledge of the underlying mechanisms of learning algorithms in some organizations, and the complete abandonment of human intuitive judgment and reasoning in others), could worsen power asymmetry and inequality within organizations by increasing the opacity of decisions, systematic biases, discriminatory classification, and violations of worker privacy.

Suggested Citation

  • Mehdi Barati & Bahareh Ansari, 2022. "Effects of algorithmic control on power asymmetry and inequality within organizations," Journal of Management Control: Zeitschrift für Planung und Unternehmenssteuerung, Springer, vol. 33(4), pages 525-544, December.
  • Handle: RePEc:spr:jmgtco:v:33:y:2022:i:4:d:10.1007_s00187-022-00347-6
    DOI: 10.1007/s00187-022-00347-6

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s00187-022-00347-6
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s00187-022-00347-6?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Alina Köchling & Marius Claus Wehner, 2020. "Discriminated by an algorithm: a systematic review of discrimination and fairness by algorithmic decision-making in the context of HR recruitment and HR development," Business Research, Springer;German Academic Association for Business Research, vol. 13(3), pages 795-848, November.
    2. Edwards, Lilian & Veale, Michael, 2017. "Slave to the Algorithm? Why a 'right to an explanation' is probably not the remedy you are looking for," LawRxiv 97upg, Center for Open Science.
    3. Wiener, Martin & Cram, W. Alec & Benlian, Alexander, 2023. "Algorithmic control and gig workers: A legitimacy perspective of Uber drivers," Publications of Darmstadt Technical University, Institute for Business Studies (BWL) 128415, Darmstadt Technical University, Department of Business Administration, Economics and Law, Institute for Business Studies (BWL).
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Niilo Noponen & Polina Feshchenko & Tommi Auvinen & Vilma Luoma-aho & Pekka Abrahamsson, 2024. "Taylorism on steroids or enabling autonomy? A systematic review of algorithmic management," Management Review Quarterly, Springer, vol. 74(3), pages 1695-1721, September.
    2. Ernest Kumi & George Kofi Amoako & Thomas Appiah & Kwasi Dartey-Baah, 2025. "The impact of digital transformation on organisational dynamics, HR practices, and wellbeing in Ghana's healthcare sector: a social exchange perspective," Future Business Journal, Springer, vol. 11(1), pages 1-25, December.
    3. Ayaz, Ozlem & Tabaghdehi, Seyedeh Asieh Hosseini & Rosli, Ainurul & Tambay, Prerna, 2025. "Ethical implications of employee and customer digital footprint: SMEs perspective," Journal of Business Research, Elsevier, vol. 188(C).
    4. Kekez, Ivan & Lauwaert, Lode & Begičević Ređep, Nina, 2025. "Is artificial intelligence (AI) research biased and conceptually vague? A systematic review of research on bias and discrimination in the context of using AI in human resource management," Technology in Society, Elsevier, vol. 81(C).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. König, Pascal D. & Wenzelburger, Georg, 2021. "The legitimacy gap of algorithmic decision-making in the public sector: Why it arises and how to address it," Technology in Society, Elsevier, vol. 67(C).
    2. Yang, Yihao & Chi, Ming & Bi, Xinhua & Xu, Yongshun, 2025. "Dark sides of algorithmic control in app-based gig work: An objectification perspective," Journal of Business Research, Elsevier, vol. 195(C).
    3. Suen, Hung-Yue & Hung, Kuo-En, 2024. "Revealing the influence of AI and its interfaces on job candidates' honest and deceptive impression management in asynchronous video interviews," Technological Forecasting and Social Change, Elsevier, vol. 198(C).
    4. Hazel Si Min Lim & Araz Taeihagh, 2019. "Algorithmic Decision-Making in AVs: Understanding Ethical and Technical Concerns for Smart Cities," Sustainability, MDPI, vol. 11(20), pages 1-28, October.
    5. Chenfeng Yan & Quan Chen & Xinyue Zhou & Xin Dai & Zhilin Yang, 2024. "When the Automated fire Backfires: The Adoption of Algorithm-based HR Decision-making Could Induce Consumer’s Unfavorable Ethicality Inferences of the Company," Journal of Business Ethics, Springer, vol. 190(4), pages 841-859, April.
    6. van de Kerkhof, Jacob, 2025. "Article 22 Digital Services Act: Building trust with trusted flaggers," Internet Policy Review: Journal on Internet Regulation, Alexander von Humboldt Institute for Internet and Society (HIIG), Berlin, vol. 14(1), pages 1-26.
    7. Buhmann, Alexander & Fieseler, Christian, 2021. "Towards a deliberative framework for responsible innovation in artificial intelligence," Technology in Society, Elsevier, vol. 64(C).
    8. Colak Murat & Saridogan Berkay C., 2023. "Exploring the Remote Work Revolution: A Managerial View of the Tech Sector’s Response to the New Normal," International Journal of Contemporary Management, Sciendo, vol. 59(4), pages 18-33, December.
    9. Mahmoud Abdulhadi Alabdali & Sami A. Khan & Muhammad Zafar Yaqub & Mohammed Awad Alshahrani, 2024. "Harnessing the Power of Algorithmic Human Resource Management and Human Resource Strategic Decision-Making for Achieving Organizational Success: An Empirical Analysis," Sustainability, MDPI, vol. 16(11), pages 1-30, June.
    10. Cobbe, Jennifer & Veale, Michael & Singh, Jatinder, 2023. "Understanding Accountability in Algorithmic Supply Chains," SocArXiv p4sey, Center for Open Science.
    11. Zhang, Lixuan & Yencha, Christopher, 2022. "Examining perceptions towards hiring algorithms," Technology in Society, Elsevier, vol. 68(C).
    12. Kirsten Martin & Ari Waldman, 2023. "Are Algorithmic Decisions Legitimate? The Effect of Process and Outcomes on Perceptions of Legitimacy of AI Decisions," Journal of Business Ethics, Springer, vol. 183(3), pages 653-670, March.
    13. Veale, Michael & Binns, Reuben & Van Kleek, Max, 2018. "Some HCI Priorities for GDPR-Compliant Machine Learning," LawArchive wm6yk_v1, Center for Open Science.
    14. Vesnic-Alujevic, Lucia & Nascimento, Susana & Pólvora, Alexandre, 2020. "Societal and ethical impacts of artificial intelligence: Critical notes on European policy frameworks," Telecommunications Policy, Elsevier, vol. 44(6).
    15. Veale, Michael, 2017. "Logics and practices of transparency and opacity in real-world applications of public sector machine learning," SocArXiv 6cdhe, Center for Open Science.
    16. Trautwein, Yannik & Zechiel, Felix & Coussement, Kristof & Meire, Matthijs & Büttgen, Marion, 2025. "Opening the ‘black box’ of HRM algorithmic biases – How hiring practices induce discrimination on freelancing platforms," Journal of Business Research, Elsevier, vol. 192(C).
    17. Söderlund, Kasia & Engström, Emma & Haresamudram, Kashyap & Larsson, Stefan & Strimling, Pontus, 2024. "Regulating high-reach AI: On transparency directions in the Digital Services Act," Internet Policy Review: Journal on Internet Regulation, Alexander von Humboldt Institute for Internet and Society (HIIG), Berlin, vol. 13(1), pages 1-31.
    18. Mazur Joanna, 2019. "Automated Decision-Making and the Precautionary Principle in EU Law," TalTech Journal of European Studies, Sciendo, vol. 9(4), pages 3-18, December.
    19. Daniela Sele & Marina Chugunova, 2023. "Putting a Human in the Loop: Increasing Uptake, but Decreasing Accuracy of Automated Decision-Making," Rationality and Competition Discussion Paper Series 438, CRC TRR 190 Rationality and Competition.
    20. Frederik Zuiderveen Borgesius & Joost Poort, 2017. "Online Price Discrimination and EU Data Privacy Law," Journal of Consumer Policy, Springer, vol. 40(3), pages 347-366, September.

    More about this item


    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:jmgtco:v:33:y:2022:i:4:d:10.1007_s00187-022-00347-6. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.