Accenture unveils tool to help firms eliminate ‘bias’

Bloomberg

Consulting firm Accenture has a new tool to help businesses detect and eliminate gender, racial and ethnic bias in artificial intelligence software.
Companies and governments are increasingly turning to machine-learning algorithms to help make critical decisions, including who to hire, who gets insurance or a mortgage, who receives government benefits and even whether to grant a prisoner parole.
One of the arguments for using such software is that, if correctly designed and trained, it can potentially make decisions free from the prejudices that often impact human choices.
But, in a number of well-publicized examples, algorithms have been found to discriminate against minorities and women. For instance, an algorithm many US cities and states used to help make bail decisions was twice as likely to falsely label black prisoners as being at high risk of re-offending as white prisoners, according to a 2016 investigation by ProPublica.
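
The disparity ProPublica described can be expressed as a gap in false positive rates between groups. Below is a minimal sketch of that calculation; the records, field layout and numbers are invented for illustration and are not drawn from ProPublica's data or from Accenture's tool.

```python
# Illustrative sketch only: toy data, not ProPublica's analysis code.
# Each record: (group, predicted_high_risk, actually_reoffended)
records = [
    ("black", True,  False), ("black", True,  True),
    ("black", True,  False), ("black", False, False),
    ("white", True,  False), ("white", False, False),
    ("white", False, True),  ("white", False, False),
]

def false_positive_rate(rows):
    """Share of people who did NOT re-offend but were labeled high-risk."""
    non_reoffenders = [r for r in rows if not r[2]]
    if not non_reoffenders:
        return float("nan")
    flagged = sum(1 for r in non_reoffenders if r[1])
    return flagged / len(non_reoffenders)

for group in ("black", "white"):
    rows = [r for r in records if r[0] == group]
    print(f"{group}: false positive rate = {false_positive_rate(rows):.2f}")
```

With these toy records, black non-re-offenders are flagged at twice the rate of white non-re-offenders (0.67 versus 0.33), mirroring the ratio reported above.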
Such cases have raised awareness about the dangers of biased algorithms, but companies have struggled to respond. “Our clients are telling us they are not equipped to think about the economic, social and political outcomes of their algorithms and are coming to us for help with checks and balances,” said Rumman Chowdhury, a data scientist who leads an area of Accenture’s business called Responsible AI.
