Giving Compass' Take:

• Rachel Kraus reports that Microsoft declined to sell its facial recognition technology to law enforcement in California, though it has sold the tech for use in a U.S. prison. Amazon, meanwhile, is battling claims of racial bias in its own facial recognition tech. 

• How can the damage of this type of technology be limited? How can companies work to eliminate bias in technology? 

• Learn about racial disparities in stop-and-frisk policing. 


Speaking recently at Stanford University, Microsoft President Brad Smith delivered some surprisingly principled news about his company. Smith said that Microsoft declined to sell its facial recognition technology to both a California law enforcement agency and an unnamed capital city because of human rights concerns, according to Reuters.

That's in contrast to Amazon, which defends its contracts with law enforcement agencies that use its Rekognition software, and has sought to discredit an ACLU study that showed racial bias in Rekognition. The ACLU study and others have found that facial recognition AI is less accurate at identifying women and minorities than white men. Because of this bias, Smith said that use by law enforcement could disproportionately harm these groups.

"Anytime they pulled anyone over, they wanted to run a face scan," Smith said. "We said this technology is not your answer."

That is, the risk of misidentifying someone at a simple traffic stop as a suspect was too great for Microsoft to sell the agency the technology. Smith also said that Microsoft denied the capital city in the unnamed country the technology because the blanket surveillance it wanted to implement would impede freedom of assembly.

However, Microsoft has sold the technology to a U.S. prison. Smith said that the limited scope of use there assuaged these concerns, and that the technology had the potential to improve safety inside the prison.

Read the full article about concerns about facial recognition tech by Rachel Kraus at Mashable.