A reading list of articles and other links I use to inform my work at Small Technology Foundation, posted every weekday. Continued from the Ind.ie Radar and Ind.ie’s Weekly Roundups. Subscribe to the Laura’s Lens RSS feed.
Written by Alfred Ng on CNET.
“While the features that come with smart locks or doorbell cameras offer conveniences for homeowners, they open up concerns about privacy for renters – who might not have signed on for constant surveillance.”
“Facial recognition and emerging forms of AI give landlords alarming power to harass rent-stabilized tenants.”
Written by Dorian Lynskey on The Guardian.
“You are building an infrastructure that can be later co-opted in undesirable ways by large multinationals and state surveillance apparatus, and compromised by malicious hackers,” says Dr Michael Veale, a lecturer in digital rights and regulation at UCL Faculty of Laws at University College London.
Written by Meredith Broussard on Slate.
“This is not about math, but about human social values being superimposed on a mathematical system. The question becomes: Whose values are encoded in the system?”
“That trans and gender nonconforming people are excluded from or subjugated to information systems is a phenomenon [Anna Lauren Hoffmann] labels data violence, or ‘Harm inflicted on trans and gender nonconforming people not only by government-run systems, but also the information systems that permeate our everyday social lives.’”
Written by Evan Greer on Buzzfeed News.
“The surveillance dystopia is on the horizon, and companies like Microsoft and Amazon are helping build it. Despite their platitudes of caution and ethics, we’ve seen the consequences of Silicon Valley’s ‘move fast and break things’ ethos. And if we don’t stop the spread of facial recognition, its latest lucrative surveillance product, we’ll soon count our most basic freedoms among the things they’ve broken.”
“Company after company in Silicon Valley has been pushing furiously ahead with the development of face-scanning surveillance tools. They see money to be made selling this tech to governments, airlines, and other private businesses. Facing growing concern from the public and lawmakers, the industry has disingenuously asked for ‘regulation.’ This is straight out of Big Tech’s lobbying playbook — asking Congress to pass laws and then swooping in to help write them. By doing so, they hope to avoid the real debate: whether facial recognition surveillance should be allowed at all.”
“There is no time to waste. Authoritarian surveillance programs are always used to target the most vulnerable and marginalized, and facial recognition enables the automation of oppression.”
Written by Lois Beckett on The Guardian.
“Unlike gun control, Marlow said, ‘Surveillance is politically palatable, and so they’re pursuing surveillance as a way you can demonstrate action, even though there’s no evidence that it will positively impact the problem.’” … “Some people think that technology is magic, that artificial intelligence will save us,” Vance said. “A lot of the questions and a lot of the privacy concerns haven’t [been] thought of, let alone addressed.” … “For black students, and students with disabilities, who already face a disproportionate amount of harsh disciplinary measures, the introduction of new kinds of surveillance may be especially harmful, privacy experts said.”
Written by Cecilia D’Anastasio and Dhruv Mehrotra on Kotaku.
“Ubiquitous computing is still a fantasy, but not because the technology isn’t ready. It is. The fantasy is that any system mediating someone’s personal experience of the physical world that uses a modern corporation’s digital infrastructure would be objective or neutral. Humans are data and data is money, and this is the business model of many of the technology firms up to the task of ubiquitous computing.”
Written by April Glaser on Slate.
“Yes, all markets require a level of privacy in order to operate. You can’t know the political leaning of everyone you buy a sandwich from. Vendors can decide what they do or don’t want to disclose or ask of their customers. But when they do know, they have no obligation to proceed with that business. Activists and tech critics sometimes use the word complicit when talking about companies that look the other way when their inventions are causing harm. Assistive might be more accurate. Providing database and web services—even just email—to a cruel immigration regime assists in the cruelty.”
“These companies can do what they want with the software they sell. But they should stop pretending that what they sell is neutral.”
Written by Mike Ananny on Nieman Lab.
“Note that I haven’t asked: “What’s the impact of technology on society?” That’s the wrong question. Platforms are societies of intertwined people and machines. There is no such thing as “online life” versus “real life.” We give massive ground if we pretend that these companies are simply having an “effect” or “impact” on some separate society.”
Written by Rose Eveleth on Vox.
“[T]he assertion that technology companies can’t possibly be shaped or restrained with the public’s interest in mind is to argue that they are fundamentally different from any other industry. They’re not.”
“There’s a growing chasm between how everyday users feel about the technology around them and how companies decide what to make. And yet, these companies say they have our best interests in mind. We can’t go back, they say. We can’t stop the “natural evolution of technology.” But the “natural evolution of technology” was never a thing to begin with, and it’s time to question what “progress” actually means.”
Written by Hannah Devlin on The Guardian.
“Out in the wider world, anonymity is no longer guaranteed. Facial recognition gives police and companies the means of identifying and tracking people of interest, while others are free to go about their business. The real question is: who gets that privilege?”