Giving Compass' Take:

· To produce more inclusive hiring software, researchers urge developers to integrate “feminist design thinking” into their design process, Matt Shipman reports.

· How can donors help remove these biases in the employment process?

· Read more about reducing inequality and bias in employment.


“There seem to be countless stories of ways that bias in AI is manifesting itself, and there are many thought pieces out there on what contributes to this bias,” says co-lead author Fay Payton, a professor of information systems/technology at North Carolina State University.

“Our goal here was to put forward guidelines that can be used to develop workable solutions to algorithm bias against women, African American, and Latinx professionals in the IT workforce.

“Too many existing hiring algorithms incorporate de facto identity markers that exclude qualified candidates because of their gender, race, ethnicity, age, and so on,” says Payton. “We are simply looking for equity—that job candidates be able to participate in the hiring process on an equal footing.”

Payton and her collaborators argue that an approach called feminist design thinking could serve as a valuable framework for developing software that reduces algorithmic bias in a meaningful way. In this context, the application of feminist design thinking would mean incorporating the idea of equity into the design of the algorithm itself.

“Compounding the effects of algorithmic bias is the historical underrepresentation of women, Black, and Latinx software engineers to provide novel insights regarding equitable design approaches based on their lived experiences,” says co-lead author Lynette Yarger, an associate professor of information sciences and technology at Penn State.

Read the full article about feminist design and hiring bias by Matt Shipman at Futurity.