A reading list of articles and other links I use to inform my work at Small Technology Foundation, posted every weekday. Continued from the Ind.ie Radar and Ind.ie’s Weekly Roundups. Subscribe to the Laura’s Lens RSS feed.
I worked on political ads at Facebook. They profit by manipulating us.
Written by Yaël Eisenstat on The Washington Post.
“[T]rue transparency would include information about the tools that differentiate advertising on Facebook from traditional print and television, and in fact make it more dangerous: Can I see if a political advertiser used the custom audience tool, and if so, if my email address was uploaded? Can I see what look-alike audience advertisers are seeking? Can I see a true, verified name of the advertiser in the disclaimer? Can I see if and how your algorithms amplified the ad? If not, the claim that Facebook is simply providing a level playing field for free expression is a myth.
“Free political speech is core to our democratic principles, and it’s true that social media companies should not be the arbiters of truth. But the only way Facebook or other companies that use our behavioral data to potentially manipulate us through targeted advertising can prevent abuse of their platform to harm our electoral process is to end their most egregious targeting and amplification practices and provide real transparency.
“We need lawmakers and regulators to help protect our children, our cognitive capabilities, our public square and our democracy by creating guardrails and rules to deal directly with the incentives and business models of these platforms and the societal harms they are causing.”
Smart home tech can help evict renters, surveillance company tells landlords
Written by Alfred Ng on CNET.
“While the features that come with smart locks or doorbell cameras offer conveniences for homeowners, they open up concerns about privacy for renters – who might not have signed on for constant surveillance.”
“Facial recognition and emerging forms of AI give landlords alarming power to harass rent-stabilized tenants.”
‘Alexa, are you invading my privacy?’ – the dark side of our voice assistants
Written by Dorian Lynskey on The Guardian.
“You are building an infrastructure that can be later co-opted in undesirable ways by large multinationals and state surveillance apparatus, and compromised by malicious hackers,” says Dr Michael Veale, a lecturer in digital rights and regulation at the Faculty of Laws, University College London.
When Binary Code Won’t Accommodate Nonbinary People
Written by Meredith Broussard on Slate.
“This is not about math, but about human social values being superimposed on a mathematical system. The question becomes: Whose values are encoded in the system?”
“That trans and gender nonconforming people are excluded from or subjugated to information systems is a phenomenon [Anna Lauren Hoffmann] labels data violence, or ‘Harm inflicted on trans and gender nonconforming people not only by government-run systems, but also the information systems that permeate our everyday social lives.’”
Don’t Regulate Facial Recognition. Ban It.
Written by Evan Greer on Buzzfeed News.
“The surveillance dystopia is on the horizon, and companies like Microsoft and Amazon are helping build it. Despite their platitudes of caution and ethics, we’ve seen the consequences of Silicon Valley’s ‘move fast and break things’ ethos. And if we don’t stop the spread of facial recognition, its latest lucrative surveillance product, we’ll soon count our most basic freedoms among the things they’ve broken.”
“Company after company in Silicon Valley has been pushing furiously ahead with the development of face-scanning surveillance tools. They see money to be made selling this tech to governments, airlines, and other private businesses. Facing growing concern from the public and lawmakers, the industry has disingenuously asked for ‘regulation.’ This is straight out of Big Tech’s lobbying playbook — asking Congress to pass laws and then swooping in to help write them. By doing so, they hope to avoid the real debate: whether facial recognition surveillance should be allowed at all.”
“There is no time to waste. Authoritarian surveillance programs are always used to target the most vulnerable and marginalized, and facial recognition enables the automation of oppression.”
Under digital surveillance: how American schools spy on millions of kids
Written by Lois Beckett on The Guardian.
“Unlike gun control, Marlow said, ‘Surveillance is politically palatable, and so they’re pursuing surveillance as a way you can demonstrate action, even though there’s no evidence that it will positively impact the problem.’” … “Some people think that technology is magic, that artificial intelligence will save us,” Vance said. “A lot of the questions and a lot of the privacy concerns haven’t [been] thought of, let alone addressed.” … “For black students, and students with disabilities, who already face a disproportionate amount of harsh disciplinary measures, the introduction of new kinds of surveillance may be especially harmful,” privacy experts said.
The Creators Of Pokémon Go Mapped The World. Now They’re Mapping You
Written by Cecilia D’Anastasio and Dhruv Mehrotra on Kotaku.
“Ubiquitous computing is still a fantasy, but not because the technology isn’t ready. It is. The fantasy is that any system mediating someone’s personal experience of the physical world that uses a modern corporation’s digital infrastructure would be objective or neutral. Humans are data and data is money, and this is the business model of many of the technology firms up to the task of ubiquitous computing.”
Is a Tech Company Ever Neutral?
Written by April Glaser on Slate.
“Yes, all markets require a level of privacy in order to operate. You can’t know the political leaning of everyone you buy a sandwich from. Vendors can decide what they do or don’t want to disclose or ask of their customers. But when they do know, they have no obligation to proceed with that business. Activists and tech critics sometimes use the word complicit when talking about companies that look the other way when their inventions are causing harm. Assistive might be more accurate. Providing database and web services—even just email—to a cruel immigration regime assists in the cruelty.”
“These companies can do what they want with the software they sell. But they should stop pretending that what they sell is neutral.”
Tech platforms are where public life is increasingly constructed, and their motivations are far from neutral
Written by Mike Ananny on Nieman Lab.
“Note that I haven’t asked: ‘What’s the impact of technology on society?’ That’s the wrong question. Platforms are societies of intertwined people and machines. There is no such thing as ‘online life’ versus ‘real life.’ We give massive ground if we pretend that these companies are simply having an ‘effect’ or ‘impact’ on some separate society.”
The biggest lie tech people tell themselves — and the rest of us
Written by Rose Eveleth on Vox.
“[T]he assertion that technology companies can’t possibly be shaped or restrained with the public’s interest in mind is to argue that they are fundamentally different from any other industry. They’re not.”
“There’s a growing chasm between how everyday users feel about the technology around them and how companies decide what to make. And yet, these companies say they have our best interests in mind. We can’t go back, they say. We can’t stop the ‘natural evolution of technology.’ But the ‘natural evolution of technology’ was never a thing to begin with, and it’s time to question what ‘progress’ actually means.”