Laura Kalbag

How Algorithmic Bias Hurts People With Disabilities

Written by Alexandra Reeve Givens on Slate.

“In hiring, for example, new algorithm-driven tools will identify characteristics shared by a company’s ‘successful’ existing employees, then look for those traits when they evaluate new hires. But as the model treats underrepresented traits as undesired traits to receive less weighting, people with disabilities—like other marginalized groups—risk being excluded as a matter of course.

While some have called to fix this data problem by collecting more detailed information about job candidates’ disabilities, further collection raises its own distinct and very real concerns about privacy and discrimination.

These problems exist for others, too: people who have marginalized sexual orientations or nonbinary gender identities, those who fall outside U.S. definitions of race and ethnicity, and for people who are members of multiple, intersecting marginalized communities.”

Read ‘How Algorithmic Bias Hurts People With Disabilities’ on the Slate site.

Tagged with: disability, discrimination, algorithms.