Giving Compass' Take:

• Edmund L. Andrews reports that automated speech recognition systems make roughly twice as many errors for Black speakers as for white speakers.

• Failure to recognize Black speech can further marginalize already disadvantaged groups.

• Learn about three ways to combat AI bias.


The technology behind the United States’ leading automated speech recognition systems makes twice as many errors when interpreting words spoken by African Americans as when interpreting the same words spoken by whites, according to new research.

While the study focused exclusively on disparities between Black and white Americans, similar problems could affect people who speak with regional and non-native English accents, the researchers conclude.

If not addressed, this imbalance in transcription accuracy could have serious consequences for people’s careers and even their lives. Many companies now screen job applicants with automated online interviews that employ speech recognition. Courts use the technology to help transcribe hearings. For people who can’t use their hands, moreover, speech recognition is crucial for accessing computers.

Read the full article about automated voice recognition and Black speakers by Edmund L. Andrews at Futurity.