
Using AI to minimise bias in an employee performance review

Authors

Listed:
  • Melton, Liz

    (Strategic Partnerships Manager, USA)

  • Riewe, Grant

    (Chief Technology Officer, USA)

Abstract

Performance reviews are intended to be objective, but all humans experience bias. While many companies opt for group reviews as a way to de-bias and challenge the status quo, what is said in those meetings, how those comments are delivered and the context for those remarks are just as important. At the same time, most people’s attention span is shorter than a review, and being promoted depends on what bosses remember about their direct reports, their subjective measure of employee success, and their ability to convince others that employee accomplishments deserve a reward. As a result of these compounding factors, meta-bias patterns emerge in company culture. Combine those limitations with the fact that reviews are often a breeding ground for subtle (and not-so-subtle) bias, and the question arises: why are we not using technology to help? With developments in natural language processing (NLP) and conversational AI (CAI), computers can identify biased phrases in real time. Although these technologies have a long way to go to match human nuance, they can at least flag problematic phrases during something as significant as a performance review. And with the right inputs rooted in social science and normalised for geography, contextual relationships and culture, they could surface insidious bias throughout organisations. This paper examines how a future CAI tool could reduce bias and, eventually, teach people to re-evaluate and reframe their thinking. In a performance review setting, the system would flag problematic phrases as they are said, and committee heads would stop the conversation. The committee would then evaluate the comment, ask the presenter for further information, and only continue once there is sufficient clarity. Once the discussion concludes, the review cycle would continue until another phrase is flagged. The system remains persistently aware throughout all conversations and highlights potential bias for everyone to learn from. Beyond pointing out biased phrases during a performance review, a combination of NLP and CAI can serve as a foundation for company-wide analytics. Organisations can track who speaks in the majority of meetings, what was said, who challenges biased phrases, whether certain types of people are misrepresented in reviews more or less frequently, and so on. All this information gives a fundamentally new picture of what is happening inside a company, laying the groundwork for human resources (HR) metrics that individuals (and the company as a whole) can improve over time.
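
The article describes this workflow conceptually rather than as code, but the real-time flagging loop it envisions can be illustrated with a short sketch. The Python example below is a minimal illustration, assuming a transcribed remark arrives as plain text; the phrase patterns and function names are hypothetical stand-ins for the trained NLP/CAI model (normalised for geography, relationships and culture) that the authors propose, not an implementation taken from the paper.

    import re
    from dataclasses import dataclass
    from typing import List

    # Hypothetical patterns standing in for a trained bias classifier; in the
    # proposed system these judgments would come from an NLP/CAI model rather
    # than a fixed keyword list.
    BIAS_PATTERNS = {
        r"\bnot a culture fit\b": "vague culture-fit judgment",
        r"\btoo (aggressive|emotional)\b": "trait language often applied unevenly",
        r"\bfor (his|her) age\b": "age-referenced comparison",
    }

    @dataclass
    class Flag:
        phrase: str
        reason: str
        start: int
        end: int

    def flag_possible_bias(utterance: str) -> List[Flag]:
        """Return phrases in a transcribed remark that may warrant committee review."""
        hits: List[Flag] = []
        for pattern, reason in BIAS_PATTERNS.items():
            for match in re.finditer(pattern, utterance, flags=re.IGNORECASE):
                hits.append(Flag(match.group(0), reason, match.start(), match.end()))
        return hits

    if __name__ == "__main__":
        remark = "She is talented, but too emotional to lead the team."
        for f in flag_possible_bias(remark):
            # In the review setting the abstract describes, a flag like this would
            # pause the conversation so the committee can ask for more context.
            print(f"Flagged '{f.phrase}' ({f.reason}) at characters {f.start}-{f.end}")

Each flag could also be logged per speaker and per meeting, which is the kind of record that the company-wide analytics described above (who speaks, what was said, who challenges biased phrases) would be built on.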

Suggested Citation

  • Melton, Liz & Riewe, Grant, 2022. "Using AI to minimise bias in an employee performance review," Journal of AI, Robotics & Workplace Automation, Henry Stewart Publications, vol. 2(1), pages 17-23, September.
  • Handle: RePEc:aza:airwa0:y:2022:v:2:i:1:p:17-23

    Download full text from publisher

    File URL: https://hstalks.com/article/7358/download/
    Download Restriction: Requires a paid subscription for full access.

    File URL: https://hstalks.com/article/7358/
    Download Restriction: Requires a paid subscription for full access.

As access to this document is restricted, you may want to search for a different version of it.

    More about this item

    Keywords

    bias; bias detection tool; bias detection system; AI; performance evaluations; review process; performance reviews; performance review;

    JEL classification:

    • M15 - Business Administration and Business Economics; Marketing; Accounting; Personnel Economics - - Business Administration - - - IT Management
    • G2 - Financial Economics - - Financial Institutions and Services


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:aza:airwa0:y:2022:v:2:i:1:p:17-23. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

We have no bibliographic references for this item. You can help add them by using this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact Henry Stewart Talks.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.