
Employers duty-bound to mitigate risk of automated bias

Ahead of Australia's anticipated regulation of AI in recruitment and HR decision-making, there are steps employers can take now to get on the front foot, an expert says.

Mitigating the risk of systemic bias in hiring and promotion processes is not the responsibility of developers alone, says organisational psychologist Dr Matthew Neale – a VP at employment screening company Criteria Corp, which submitted expert testimony when New York City was developing its automated employment decision tools (AEDT) law.

The New York City law, which took effect in July, imposes significant compliance obligations on employers, Neale says. Organisations there must now inform job applicants when AI systems are being used to evaluate them, so that individuals can opt out if they wish. Employers must also engage an independent bias auditor to examine their AI system and report on the extent to which it classifies people differently based on race and gender, and they must publish the results. If, for example, the system is recommending 60% of male applicants but only 40% of female applicants, that must be disclosed on the employer's careers website...
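The kind of disparity the audit surfaces can be sketched as a simple selection-rate comparison. The snippet below is an illustration only, assuming hypothetical applicant counts that mirror the article's 60%/40% example; the function name and the impact-ratio calculation are not drawn from the NYC law's text.

```python
def selection_rate(recommended, applicants):
    """Fraction of a group's applicants that the tool recommends."""
    return recommended / applicants

# Hypothetical figures matching the article's example:
# 60 of 100 male applicants recommended vs 40 of 100 female applicants.
male_rate = selection_rate(60, 100)    # 0.6
female_rate = selection_rate(40, 100)  # 0.4

# One common way to express the gap: each group's rate relative
# to the highest-rate group (an "impact ratio").
impact_ratio = female_rate / max(male_rate, female_rate)
print(f"male rate={male_rate:.0%}, female rate={female_rate:.0%}, "
      f"impact ratio={impact_ratio:.2f}")
```

A ratio well below 1.0, as here, is the sort of figure an independent bias audit would flag and that an employer would then have to disclose.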
