Using People Analytics Tools To Increase Pay Equity: A Four-Step Bias-Aware Process
People analytics tools have the potential to improve pay equity, yet they can also deepen existing bias. This article outlines a four-step process to help organizations avoid this pitfall and become bias-aware.
Becoming Bias-Aware
Employers who want to increase pay equity within their companies are increasingly turning to people analytics tools: software to optimize the employee value chain, powered by sophisticated algorithms. These tools offer real promise – but they can also reinforce a harmful status quo.
For example, Amazon famously built an experimental resume screener that de-prioritized female applicants for tech jobs. Algorithms use data about past decisions to make present decisions – and because most of Amazon’s tech employees had been male, the algorithm “thought” that maleness = success.
Even if someone tries to make an algorithm “gender-blind” or “race-blind” by excluding the gender and race variables, bias can creep in through proxy variables. In many areas, for example, an employee’s postal code can reveal their likely race. (A quick check for such proxies is sketched below.)
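Here is a minimal sketch of such a proxy check, assuming a pandas DataFrame with hypothetical `postal_code` and `race` columns; it computes Cramér’s V, a 0-to-1 measure of association between two categorical variables.

```python
# A minimal sketch of a proxy check. The CSV file and column names
# are hypothetical stand-ins for a real HR dataset.
import pandas as pd
from scipy.stats import chi2_contingency

df = pd.read_csv("employees.csv")  # hypothetical HR dataset

# Cross-tabulate postal code against race and compute Cramér's V.
table = pd.crosstab(df["postal_code"], df["race"])
chi2, _, _, _ = chi2_contingency(table)
n = table.to_numpy().sum()
cramers_v = (chi2 / (n * (min(table.shape) - 1))) ** 0.5

# A value near 1 means postal code largely reveals race, so dropping
# the race column alone would not make the model "race-blind".
print(f"Postal code vs. race association (Cramér's V): {cramers_v:.2f}")
```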
To harness people analytics to improve pay equity, then, the key is developing or choosing tools that can handle potential bias in the data, and then using those tools in a bias-aware way: applying human judgment to detect, compensate, and adjust for that bias.
Four Steps to Bias-Awareness
A bias-aware data-driven decision-making process has four key steps. It can be used by companies designing their own people analytics tools or those looking for an external solution.
1. Locate and mitigate bias in the data
Some questions to consider are:
Who does the data include/not include? Does it reflect all demographic groups? (A minimal representation audit is sketched after this list.)
Does the data use any features that are surrogates for underlying qualities? (Like using college GPA to represent intelligence.)
Is there anything the data doesn’t capture? (For example, an employee’s personality can affect job performance.)
For external tools: How does the application react to biased data? What factors does it consider when analyzing bias?
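To illustrate the first question, here is a minimal representation audit, assuming a hypothetical dataset and benchmark shares; the 80% cutoff loosely echoes the four-fifths rule and is an illustrative choice, not a standard.

```python
# A minimal sketch of a representation audit. The dataset, the "gender"
# column, and the benchmark shares are all hypothetical.
import pandas as pd

df = pd.read_csv("employees.csv")          # hypothetical HR dataset
benchmark = {"women": 0.50, "men": 0.50}   # e.g., relevant labor-market shares

observed = df["gender"].value_counts(normalize=True)
for group, expected in benchmark.items():
    share = observed.get(group, 0.0)
    # Flag groups whose share falls below 80% of the benchmark -- a cutoff
    # loosely inspired by the four-fifths rule, chosen here for illustration.
    status = "UNDERREPRESENTED" if share < 0.8 * expected else "ok"
    print(f"{group}: {share:.1%} of data vs. {expected:.1%} benchmark -> {status}")
```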
2. Locate and correct bias in the tool’s algorithmic modeling
It is important to ensure the tool works as well for minority groups as it does for majority groups. A bias dashboard can be helpful here, as can third-party bias evaluation toolkits like Fairlearn or AI Fairness 360; a small Fairlearn example follows below. If there is bias in the model, developers can pinpoint where it is and then correct for it. (For instance, if the model gives female employees a 23% lower chance of promotion than comparable male employees, it can add 23% to female employees’ scores.)
For external tools: How does the tool test for and address bias? How will it be retrained to fit the company’s employees?
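To make this concrete, here is a minimal disparity check using Fairlearn, one of the toolkits mentioned above, run on synthetic data; in practice the labels and predictions would come from the company’s own records and the tool’s output.

```python
# A minimal sketch of a group-disparity check with Fairlearn, run on
# synthetic data; a real audit would use the company's own records.
import numpy as np
from fairlearn.metrics import (
    MetricFrame, selection_rate, demographic_parity_difference,
)

rng = np.random.default_rng(0)
gender = rng.choice(["women", "men"], size=500)
y_true = rng.integers(0, 2, size=500)  # actual promotion outcomes (0/1)
# Toy predictions that favor men, to show what the metrics reveal.
y_pred = (rng.random(500) < np.where(gender == "men", 0.5, 0.3)).astype(int)

mf = MetricFrame(
    metrics=selection_rate,     # share of each group the model recommends
    y_true=y_true,
    y_pred=y_pred,
    sensitive_features=gender,  # protected attribute, used only for auditing
)
print(mf.by_group)              # selection rate per gender
print("Demographic parity gap:",
      demographic_parity_difference(y_true, y_pred, sensitive_features=gender))
```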
3. Understand how the tool makes its recommendations – and how they’ll be monitored
Before implementing any tool, managers should decide on the metrics for evaluating its performance and determining whether it is actually improving pay equity. (One candidate metric is sketched below.)
For external tools: Is the tool transparent about what drives its decisions? How will those decisions be monitored?
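One widely used monitoring metric is the adjusted pay gap: the pay difference between groups after controlling for legitimate pay drivers. A minimal sketch with statsmodels follows; the dataset and column names are hypothetical.

```python
# A minimal sketch of an adjusted-pay-gap metric using statsmodels.
# The dataset and columns (salary, gender, job_level, tenure_years)
# are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("employees.csv")  # hypothetical HR dataset

# Regress log salary on gender plus legitimate pay drivers; the gender
# coefficient estimates the residual pay gap the tool should be shrinking.
model = smf.ols(
    "np.log(salary) ~ C(gender) + C(job_level) + tenure_years", data=df
).fit()
print(model.params.filter(like="gender"))  # gap (in log points) after controls
```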
4. Make the final decision by weighing the tool’s recommendation – and carefully incorporating any factors it may not capture
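What that weighing can look like in software is sketched below; the fields, thresholds, and the notion of a confidence score are illustrative assumptions, not features of any particular tool.

```python
# A hypothetical human-in-the-loop gate; thresholds and fields are
# illustrative assumptions, not any particular tool's API.
from dataclasses import dataclass

@dataclass
class RaiseRecommendation:
    employee_id: str
    suggested_raise_pct: float
    model_confidence: float  # 0..1, assumed to be reported by the tool

def needs_manager_review(rec: RaiseRecommendation) -> bool:
    """Route low-confidence or unusually large recommendations to a
    manager, who can weigh factors the model may not capture."""
    return rec.model_confidence < 0.7 or abs(rec.suggested_raise_pct) > 10.0

print(needs_manager_review(
    RaiseRecommendation("E123", suggested_raise_pct=12.0, model_confidence=0.9)
))  # True: the raise is large, so a manager signs off
```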
Note that this process includes a human in the loop at each stage. We can’t simply leave a people analytics tool to its own devices and trust it to do the right thing. But by choosing or designing tools thoughtfully, tailoring them to correct for bias, and monitoring them carefully, managers can use people analytics to genuinely improve equity.