Tag: discrimination
-
‘They track every move’: how US parole apps created digital prisoners
Written by Todd Feathers on The Guardian.
“Critics also argue that the data-gathering and experimental predictive analytics incorporated into some tracking apps are bound to generate false positives that lead to arrests for technical violations of probation or parole conditions.”
…
“Often it’s people of colour who are having their data extracted from them. This valuable commodity is literally the body of black individuals” - Prof Chaz Arnett, University of Maryland
Read ‘‘They track every move’: how US parole apps created digital prisoners’ on the Guardian site.
Tagged with: tracking, discrimination, prisons.
-
Can Auditing Eliminate Bias from Algorithms?
Written by Alfred Ng on The Markup.
“Increasingly, companies are turning to these firms to review their algorithms, particularly when they’ve faced criticism for biased outcomes, but it’s not clear whether such audits are actually making algorithms less biased—or if they’re simply good PR.”
Read ‘Can Auditing Eliminate Bias from Algorithms?’ on The Markup site.
Tagged with: algorithmic bias, auditing, discrimination.
-
Humans are not the virus: don’t be an eco-fascist
Written by Sherronda J. Brown on gal-dem.
“Eco-fascist rhetoric works to obscure the responsibility of white colonialism and its long history of destruction, as well as imperialist presences in predominantly black and brown countries”
Read ‘Humans are not the virus: don’t be an eco-fascist’ on the gal-dem site.
Tagged with: eco-fascism, coronavirus, discrimination.
-
OK Google, Black History Month Is Over. What Now?
Written by Gabrielle Rejouis on OneZero.
“Despite the benefits Google has received from the Black community, the company has refused to or has been slow to correct the discriminatory algorithmic practices at YouTube, such as its language filter, ads, and its search algorithms. Whether intentional or unconscious, all of these biases have harmed the Black community. For some people, Google is the internet. Civil rights considerations must be central to big data and the platforms they drive. Google should not celebrate the contributions of Black people without also making their platforms welcoming to them.
…
Technology will not be the silver bullet solving the problem of content moderation. Neither will sensitivity training nor diverse hiring. Dismantling these structures will require racial literacy and more multifaceted changes.
…
To say the internet has a huge impact on our society is an understatement. And the data and privacy missteps committed by Big Tech disproportionately affect historically marginalized communities.”
Read ‘OK Google, Black History Month Is Over. What Now?’ on the OneZero site.
Tagged with: Google, discrimination, algorithms.
-
How Algorithmic Bias Hurts People With Disabilities
Written by Alexandra Reeve Givens on Slate.
“In hiring, for example, new algorithm-driven tools will identify characteristics shared by a company’s “successful” existing employees, then look for those traits when they evaluate new hires. But as the model treats underrepresented traits as undesired traits to receive less weighting, people with disabilities—like other marginalized groups—risk being excluded as a matter of course.
…
While some have called to fix this data problem by collecting more detailed information about job candidates’ disabilities, further collection raises its own distinct and very real concerns about privacy and discrimination.
These problems exist for others, too: people who have marginalized sexual orientations or nonbinary gender identities, those who fall outside U.S. definitions of race and ethnicity, and for people who are members of multiple, intersecting marginalized communities.”
Read ‘How Algorithmic Bias Hurts People With Disabilities’ on the Slate site.
Tagged with: disability, discrimination, algorithms.
-
I’m a trans woman. Google Photos doesn’t know how to categorize me
Written by Cara Esten Hustle on Fast Company.
“The same data set that could be used to build a system to prevent showing trans folks photos from before they started transition could be trivially used and weaponized by an authoritarian state to identify trans people from street cameras,” [Penelope] Phippen says.
“With this dystopian future in mind, coupled with the fact that federal agencies like ICE already use facial recognition technology for immigration enforcement, do we even want machine learning to piece together a coherent identity from both pre- and post-transition images?
…
With trans people facing daily harassment simply for existing as ourselves, the stakes seem too high to risk teaching these systems how to recognize us”
This made me think of Tatiana Mac’s brilliant ‘The Banal Binary’ talk at the New Adventures conference two weeks ago.
Read ‘I’m a trans woman. Google Photos doesn’t know how to categorize me’ on the Fast Company site.
Tagged with: facial recognition, discrimination, systems.
-
Technology Can’t Fix Algorithmic Injustice
Written by Annette Zimmermann, Elena Di Rosa, and Hochan Kim on Boston Review.
“Some contend that strong AI may be only decades away, but this focus obscures the reality that “weak” (or “narrow”) AI is already reshaping existing social and political institutions. Algorithmic decision making and decision support systems are currently being deployed in many high-stakes domains, from criminal justice, law enforcement, and employment decisions to credit scoring, school assignment mechanisms, health care, and public benefits eligibility assessments. Never mind the far-off specter of doomsday; AI is already here, working behind the scenes of many of our social systems.
What responsibilities and obligations do we bear for AI’s social consequences in the present—not just in the distant future? To answer this question, we must resist the learned helplessness that has come to see AI development as inevitable. Instead, we should recognize that developing and deploying weak AI involves making consequential choices—choices that demand greater democratic oversight not just from AI developers and designers, but from all members of society.
…
There may be some machine learning systems that should not be deployed in the first place, no matter how much we can optimize them.”
Read ‘Technology Can’t Fix Algorithmic Injustice’ on the Boston Review site.
Tagged with: algorithms, artificial intelligence, discrimination.
-
How “Good Intent” Undermines Diversity and Inclusion
Written by Annalee on The Bias.
“‘Assume good intent’ is a particularly pernicious positive expectation that will undermine your code of conduct. The implied inverse of this is that not assuming good intent is against the rules.
…
The harm is that telling people to ‘assume good intent’ is a sign that if they come to you with a concern, you will minimize their feelings, police their reactions, and question their perceptions. It tells marginalized people that you don’t see codes of conduct as tools to address systemic discrimination, but as tools to manage personal conflicts without taking power differences into account. Telling people to ‘assume good intent’ sends a message about whose feelings you plan to center when an issue arises in your community.
…
If you want to build a culture of ‘assuming good intent,’ start by assuming good intent in marginalized people.”
Read ‘How “Good Intent” Undermines Diversity and Inclusion’ on The Bias site.
Tagged with: inclusion, intent, discrimination.
-
Big Data and the Underground Railroad
Written by Alvaro M. Bedoya on Slate.
“Far too often, today’s discrimination was yesterday’s national security or public health necessity. An approach that advocates ubiquitous data collection and protects privacy solely through post-collection use restrictions doesn’t account for that.”
Read ‘Big Data and the Underground Railroad’ on the Slate site.
Tagged with: big data, discrimination, privacy.
-
The biggest myths about the next billion internet users
Written by Payal Arora on Quartz.
“We need to de-exoticize these users if we are going to genuinely have a healthy global digital culture. They need to be humanized, understood, and kept in mind when designing inclusive platforms. The internet is a critical public resource that is meant for all users—and that includes the world’s poor.”
Read ‘The biggest myths about the next billion internet users’ on the Quartz site.
Tagged with: society, poverty, discrimination.