Giving Compass' Take:
• Cameron F. Kerry explains how AI and analytics can reinforce racial bias through subtle algorithmic discrimination, strengthening the case for privacy legislation.
• How do technologies such as predictive analytics inherit the biases of their creators and their training data? What can be done to expose and correct the discriminatory effects of AI and analytics?
• Learn more about how machine learning and analytics reinforce racial bias within the criminal justice system.
Concerns that personal information might be used in ways that perpetuate or exacerbate bias have become more prominent as predictive analytics, machine learning and artificial intelligence magnify the power to use personal information in granular ways. As a White House task force concluded in 2014, while data can be used for good, it “can also be used in ways that perpetuate social harms or render outcomes that have inequitable impacts, even when discrimination is not intended.”
In the contexts of policing and criminal justice, this concern intensifies. The death of George Floyd is only one of the most recent examples of the disproportionate impact of systemic discrimination on the lives of disadvantaged people, especially Black Americans.
This concern also applies to more subtle impacts in the commercial arena where hidden proxies can operate to reduce opportunities for Black Americans and other disadvantaged populations. In a stark example, Latanya Sweeney, a Harvard professor and former Federal Trade Commission (FTC) chief technology officer, demonstrated that online searches using names associated with African Americans were more likely to generate advertisements relating to arrest records and less favorable credit cards.
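To make the proxy mechanism concrete, the short Python sketch below (not from Kerry's article; the group labels, ZIP codes, and rates are all synthetic) shows how a scoring model that never sees race can still produce racially skewed outcomes when residential segregation makes ZIP code a stand-in for race.

```python
# Hypothetical sketch of "hidden proxy" discrimination. The model is
# "race-blind": it sees only ZIP code. But because residential segregation
# ties ZIP code to race, outcomes still diverge by race. All data synthetic.
import random

random.seed(42)

def synthetic_applicant():
    """One applicant; race is never shown to the model."""
    race = random.choice(["group_a", "group_b"])
    # Residential segregation: group_b lives mostly in zip "02".
    if race == "group_b":
        zip_code = "02" if random.random() < 0.8 else "01"
    else:
        zip_code = "01" if random.random() < 0.8 else "02"
    return race, zip_code

def model_approves(zip_code):
    # A rule learned from historical data in which zip "02" had worse
    # recorded outcomes (itself a legacy of past discrimination).
    approval_rate = {"01": 0.75, "02": 0.35}[zip_code]
    return random.random() < approval_rate

by_race = {"group_a": [], "group_b": []}
for _ in range(100_000):
    race, zip_code = synthetic_applicant()
    by_race[race].append(model_approves(zip_code))

for race, outcomes in by_race.items():
    print(f"{race}: approval rate = {sum(outcomes) / len(outcomes):.2%}")
# Approval rates differ sharply by race (~67% vs ~43%) even though race
# never enters the model: ZIP code does the discriminating work.
```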
Moreover, the Civil Rights Act of 1964 and its progeny envisioned discrimination involving human agency: decisions by restaurant owners, landlords and other people. But in the 21st century, decisions can be made by machines or software, without a human in the loop. Deconstructing the basis for these decisions is a difficult undertaking of a different order from traditional employment or housing discrimination cases. This task requires new tools, and privacy legislation can help supply them.
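As one illustration of what such a tool might look like, the hypothetical Python sketch below applies the four-fifths (80%) rule of thumb drawn from U.S. EEOC guidance to a synthetic log of automated decisions. The decision records and the flagging logic are illustrative assumptions, not a prescribed legal test and not a method described in Kerry's article.

```python
# Hypothetical disparate-impact audit of automated decisions, using the
# four-fifths (80%) rule of thumb from U.S. EEOC guidance. The decision
# log below is synthetic; a real audit would pull actual model outputs.
decisions = [
    # (protected_group_member, approved)
    (False, True), (False, True), (False, True), (False, False),
    (True, True), (True, False), (True, False), (True, False),
]

def selection_rate(records, in_group):
    group = [approved for member, approved in records if member == in_group]
    return sum(group) / len(group)

rate_protected = selection_rate(decisions, True)
rate_other = selection_rate(decisions, False)
impact_ratio = rate_protected / rate_other

print(f"protected-group selection rate: {rate_protected:.0%}")
print(f"other-group selection rate:     {rate_other:.0%}")
print(f"impact ratio: {impact_ratio:.2f}")
if impact_ratio < 0.8:
    print("Below the four-fifths threshold: flag for disparate-impact review.")
```

The point of the sketch is that an outcome-level audit needs no access to the model's internals, which is exactly why it is a useful complement to the kind of transparency and accountability obligations privacy legislation could impose.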
Read the full article about how analytics reinforce racial bias by Cameron F. Kerry at Brookings.