A reading list of articles and other links I use to inform my work at Small Technology Foundation, posted every weekday. Continued from the Ind.ie Radar, and Ind.ie’s Weekly Roundups. Subscribe to the Laura’s Lens RSS feed.
Written by Joshua Benton on The Atlantic.
“News organizations have multiple and sometimes conflicting incentives that might affect how they present the local police blotter. A company that sells security-optimized doorbells has only one incentive: emphasizing that the world is a scary place, and you need to buy our products to protect you.”
Written by Olivia Solon and Cyrus Farivar on NBC News.
“Ever AI promises prospective military clients that it can ‘enhance surveillance capabilities’ and ‘identify and act on threats.’ It offers law enforcement the ability to identify faces in body-cam recordings or live video feeds.”
Written by S.A. Applin on Fast Company.
“Once facial recognition and other AI becomes pervasive—and in the absence of serious enforceable laws that can put guardrails on the technology—we will be unprotected, and as such will be subjected to any purpose to which the government or business wants to put our identities and locations. This is where greed, profit, and power come into play as motivators.”
Written by Stuart A. Thompson on The New York Times.
“Today’s data providers can receive information from almost every imaginable part of your life: your activity on the internet, the places you visit, the stores you walk through, the things you buy, the things you like, who your friends are, the places your friends go, the things your friends do, and on and on.”
Written by Mary Madden on The New York Times.
“The poor experience these two extremes — hypervisibility and invisibility — while often lacking the agency or resources to challenge unfair outcomes. For instance, they may be unfairly targeted by predictive policing tools designed with biased training data or unfairly excluded from hiring algorithms that scour social media networks to make determinations about potential candidates. In this increasingly complex ecosystem of “networked privacy harms,” one-size-fits-all privacy solutions will not serve all communities equally. Efforts to create a more ethical technology sector must take the unique experiences of vulnerable and marginalized users into account.”
Written by Jessica Guynn on USA Today.
“Facebook is not looking to protect me or any other person of color or any other marginalized citizen who are being attacked by hate speech,” [Carolyn Wysinger] says. “We get trolls all the time. People who troll your page and say hateful things. But nobody is looking to protect us from it. They are just looking to protect their bottom line.”
Written by Nicholas Vinocur on Politico.
“Ireland’s failure to safeguard huge stores of personal information looms larger now that the country is the primary regulator responsible for protecting the health information, email addresses, financial records, relationship status, search histories and friend lists for hundreds of millions of Americans, Europeans and other users around the globe.”
“Despite its vows to beef up its threadbare regulatory apparatus, Ireland has a long history of catering to the very companies it is supposed to oversee…”
Written by Rachel Becker on The Verge.
“33 of the 36 apps shared information that could give advertisers or data analytics companies insights into people’s digital behavior. And a few shared very sensitive information, like health diary entries, self reports about substance use, and usernames.”
“Potentially advertisers could use this to compromise someone’s privacy and sway their treatment decisions…”
Written by Woodrow Hartzog and Evan Selinger on The New York Times.
“Obscurity bridges this privacy gap with the idea that the parts of our lives that are hard or unlikely to be found or understood are relatively safe. It is a combination of the privacy you have in public and the privacy you have in groups. Obscurity is a barrier that can shield you from government, corporate and social snoops. And until lawmakers, corporate leaders and citizens embrace obscurity and move to protect it, your freedom and opportunities to flourish will be in jeopardy.”
Written by Privacy International staff on Privacy International.
“This enables governments and companies to construct profiles of them, using these highly sensitive details to make inferences or predictions that may or may not be accurate. Increasingly, profiles are being used to make or inform consequential decisions, from credit scoring, to hiring, to policing.”