Tag: discrimination
-
AI thinks like a corporation—and that’s worrying
Written by Jonnie Penn on The Economist.
“After the 2010 BP oil spill, for example, which killed 11 people and devastated the Gulf of Mexico, no one went to jail. The threat that Mr Runciman cautions against is that AI techniques, like playbooks for escaping corporate liability, will be used with impunity.
Today, pioneering researchers such as Julia Angwin, Virginia Eubanks and Cathy O’Neil reveal how various algorithmic systems calcify oppression, erode human dignity and undermine basic democratic mechanisms like accountability when engineered irresponsibly. Harm need not be deliberate; biased data-sets used to train predictive models also wreak havoc.
…
A central promise of AI is that it enables large-scale automated categorisation… This “promise” becomes a menace when directed at the complexities of everyday life. Careless labels can oppress and do harm when they assert false authority.”
Read ‘AI thinks like a corporation—and that’s worrying’ on the Economist site.
Tagged with: artificial intelligence, corporation, discrimination.
-
These Black Women Are Fighting For Justice In A World Of Biased Algorithms
Written by Sherrell Dorsey on Essence.
“By rooting out bias in technology, these Black women engineers, professors and government experts are on the front lines of the civil rights movement of our time.”
Read ‘These Black Women Are Fighting For Justice In A World Of Biased Algorithms’ on the Essence site.
Tagged with: algorithms, discrimination, facial recognition.
-
Smart home tech can help evict renters, surveillance company tells landlords
Written by Alfred Ng on CNET.
“While the features that come with smart locks or doorbell cameras offer conveniences for homeowners, they open up concerns about privacy for renters – who might not have signed on for constant surveillance.”
…
“Facial recognition and emerging forms of AI give landlords alarming power to harass rent-stabilized tenants.”
Read ‘Smart home tech can help evict renters, surveillance company tells landlords’ on the CNET site.
Tagged with: smart home, surveillance, discrimination.
-
When Binary Code Won’t Accommodate Nonbinary People
Written by Meredith Broussard on Slate.
“This is not about math, but about human social values being superimposed on a mathematical system. The question becomes: Whose values are encoded in the system?”
…
“That trans and gender nonconforming people are excluded from or subjugated to information systems is a phenomenon [Anna Lauren Hoffmann] labels data violence, or ‘Harm inflicted on trans and gender nonconforming people not only by government-run systems, but also the information systems that permeate our everyday social lives.’”
Read ‘When Binary Code Won’t Accommodate Nonbinary People’ on the Slate site.
Tagged with: discrimination, gender, systemic issues.
-
Sorry, But Male Geniuses Are Replaceable
Written by Sarah Olson on Medium.
“Perhaps it’s easier to separate the scientist from his science when you’re not, and never will be, affected personally by misogyny and sexism.”
“Unfortunately, many women in STEM are adversely affected by misogynists and sexists when those men are highly regarded and respected within the scientific community. After all, sexual harassment isn’t really about sex — it’s about power.”
“[W]hat’s worth more, the contributions of a lone male genius who assaults and harasses and discriminates against women, or the contributions of a large scientific community unhindered by a misogynistic and unsafe environment?”
The same goes for the tech community… and for the racist and white supremacist ideologies harbored within it.
Read ‘Sorry, But Male Geniuses Are Replaceable’ on the Medium site.
Tagged with: misogyny, community, discrimination.
-
At the Border of Europe’s Surveillance State
Written by Nicole Chi on Are We Europe.
“[E]ven if not lawless, historically, borders have been vulnerable places for human rights—particularly the right to privacy—as border guards extend government intrusion into our private lives with the authority of upholding national security. Now, data collection and artificial intelligence are threatening to turn borders into an underregulated free-for-all.”
Read ‘At the Border of Europe’s Surveillance State’ on the Are We Europe site.
Tagged with: privacy, surveillance, discrimination.
-
The creeping threat of facial recognition
Written by S.A. Applin on Fast Company.
“Once facial recognition and other AI becomes pervasive—and in the absence of serious enforceable laws that can put guardrails on the technology—we will be unprotected, and as such will be subjected to any purpose to which the government or business wants to put our identities and locations. This is where greed, profit, and power come into play as motivators.”
Read ‘The creeping threat of facial recognition’ on the Fast Company site.
Tagged with: facial recognition, surveillance, discrimination.
-
The Devastating Consequences of Being Poor in the Digital Age
Written by Mary Madden on The New York Times.
“The poor experience these two extremes — hypervisibility and invisibility — while often lacking the agency or resources to challenge unfair outcomes. For instance, they may be unfairly targeted by predictive policing tools designed with biased training data or unfairly excluded from hiring algorithms that scour social media networks to make determinations about potential candidates. In this increasingly complex ecosystem of “networked privacy harms,” one-size-fits-all privacy solutions will not serve all communities equally. Efforts to create a more ethical technology sector must take the unique experiences of vulnerable and marginalized users into account.”
Read ‘The Devastating Consequences of Being Poor in the Digital Age’ on the New York Times site.
Tagged with: privacy, surveillance, discrimination.
-
Communities at risk: How security fails are endangering the LGBTIQ+ community
Written by Privacy International staff on Privacy International.
“This enables governments and companies to construct profiles of them, using these highly sensitive details to make inferences or predictions that may or may not be accurate. Increasingly, profiles are being used to make or inform consequential decisions, from credit scoring, to hiring, to policing.”
Read ‘Communities at risk: How security fails are endangering the LGBTIQ+ community’ on the Privacy International site.
Tagged with: discrimination, security, profiling.
-
Discrimination’s Digital Frontier
Written by Aaron Rieke and Corrine Yu on The Atlantic.
“A recent study led by researchers at Northeastern University and the University of Southern California shows that, given a large group of people who might be eligible to see an advertisement, Facebook will pick among them based on its own profit-maximizing calculations, sometimes serving ads to audiences that are skewed heavily by race and gender.”
“An ad system that is designed to maximize clicks, and to maximize profits for Facebook, will naturally reinforce these social inequities and so serve as a barrier to equal opportunity.”
Read ‘Discrimination’s Digital Frontier’ on the Atlantic site.
Tagged with: discrimination, ads, Facebook.