Giving Compass' Take:
• At NPR, Bobby Allyn recounts the wrongful arrest of Robert Williams, a Black man misidentified by facial recognition technology.
• How might the creators of facial recognition technology influence its tendency towards racial bias? How does this example of racist facial recognition technology reflect discriminatory tendencies within the criminal justice system?
• Learn more about how facial recognition technology misidentifies people of color.
Police in Detroit were trying to figure out who stole five watches from a Shinola retail store. Authorities say the thief took off with an estimated $3,800 worth of merchandise.
Investigators pulled a security video that had recorded the incident. Detectives zoomed in on the grainy footage and ran the person who appeared to be the suspect through facial recognition software.
A hit came back: Robert Julian-Borchak Williams, 42, of Farmington Hills, Mich., about 25 miles northwest of Detroit.
Robert Williams was led to an interrogation room, and police put three photos in front of him: two taken from the surveillance camera in the store and one of Williams' state-issued driver's license.
"When I look at the picture of the guy, I just see a big Black guy. I don't see a resemblance," Williams said in an interview with NPR.
"[The detective] flips the third page over and says, 'So I guess the computer got it wrong, too.'" Williams said of the interrogation with detectives.
Williams was detained for 30 hours and then released on bail until a court hearing on the case, his lawyers say.
Academic and government studies have demonstrated that facial recognition systems misidentify people of color more often than white people.
One of the leading studies on bias in face recognition was conducted by Joy Buolamwini, an MIT researcher and founder of the Algorithmic Justice League.
"This egregious mismatch shows just one of the dangers of facial recognition technology which has already been shown in study after study to fail people of color, people with dark skin more than white counterparts generally speaking," Buolamwini said.
"You cannot erase the experience of 30 hours detained, the memories of children seeing their father arrested, or the stigma of being labeled criminal."
Read the full article about racial bias in facial recognition technology by Bobby Allyn at NPR.