A reading list of articles and other links I use to inform my work at Small Technology Foundation, posted every weekday. Continued from the Ind.ie Radar and Ind.ie’s Weekly Roundups. Subscribe to the Laura’s Lens RSS feed.
Written by Sherrell Dorsey on Essence.
“By rooting out bias in technology, these Black women engineers, professors and government experts are on the front lines of the civil rights movement of our time.”
Written by Megan Wollerton on CNET.
“It’s a complete anomaly – a solidly performing, decently priced device that just isn’t suited for anyone because of the privacy concerns and increasingly alarming issues plaguing the social networking site.”
Written by Yaël Eisenstat on The Washington Post.
“[T]rue transparency would include information about the tools that differentiate advertising on Facebook from traditional print and television, and in fact make it more dangerous: Can I see if a political advertiser used the custom audience tool, and if so, if my email address was uploaded? Can I see what look-alike audience advertisers are seeking? Can I see a true, verified name of the advertiser in the disclaimer? Can I see if and how your algorithms amplified the ad? If not, the claim that Facebook is simply providing a level playing field for free expression is a myth.
“Free political speech is core to our democratic principles, and it’s true that social media companies should not be the arbiters of truth. But the only way Facebook or other companies that use our behavioral data to potentially manipulate us through targeted advertising can prevent abuse of their platform to harm our electoral process is to end their most egregious targeting and amplification practices and provide real transparency.
“We need lawmakers and regulators to help protect our children, our cognitive capabilities, our public square and our democracy by creating guardrails and rules to deal directly with the incentives and business models of these platforms and the societal harms they are causing.”
Written by Alfred Ng on CNET.
“While the features that come with smart locks or doorbell cameras offer conveniences for homeowners, they open up concerns about privacy for renters – who might not have signed on for constant surveillance.”
“Facial recognition and emerging forms of AI give landlords alarming power to harass rent-stabilized tenants.”
Written by Dorian Lynskey on The Guardian.
“You are building an infrastructure that can be later co-opted in undesirable ways by large multinationals and state surveillance apparatus, and compromised by malicious hackers,” says Dr Michael Veale, a lecturer in digital rights and regulation at UCL Faculty of Laws at University College London.
Written by Meredith Broussard on Slate.
“This is not about math, but about human social values being superimposed on a mathematical system. The question becomes: Whose values are encoded in the system?”
“That trans and gender nonconforming people are excluded from or subjugated to information systems is a phenomenon [Anna Lauren Hoffmann] labels data violence, or ‘Harm inflicted on trans and gender nonconforming people not only by government-run systems, but also the information systems that permeate our everyday social lives.’”
Written by Evan Greer on BuzzFeed News.
“The surveillance dystopia is on the horizon, and companies like Microsoft and Amazon are helping build it. Despite their platitudes of caution and ethics, we’ve seen the consequences of Silicon Valley’s ‘move fast and break things’ ethos. And if we don’t stop the spread of facial recognition, its latest lucrative surveillance product, we’ll soon count our most basic freedoms among the things they’ve broken.”
“Company after company in Silicon Valley has been pushing furiously ahead with the development of face-scanning surveillance tools. They see money to be made selling this tech to governments, airlines, and other private businesses. Facing growing concern from the public and lawmakers, the industry has disingenuously asked for ‘regulation.’ This is straight out of Big Tech’s lobbying playbook — asking Congress to pass laws and then swooping in to help write them. By doing so, they hope to avoid the real debate: whether facial recognition surveillance should be allowed at all.”
“There is no time to waste. Authoritarian surveillance programs are always used to target the most vulnerable and marginalized, and facial recognition enables the automation of oppression.”
Written by Lois Beckett on The Guardian.
“Unlike gun control, Marlow said, ‘Surveillance is politically palatable, and so they’re pursuing surveillance as a way you can demonstrate action, even though there’s no evidence that it will positively impact the problem.’” … “Some people think that technology is magic, that artificial intelligence will save us,” Vance said. “A lot of the questions and a lot of the privacy concerns haven’t [been] thought of, let alone addressed.” … “For black students, and students with disabilities, who already face a disproportionate amount of harsh disciplinary measures, the introduction of new kinds of surveillance may be especially harmful, privacy experts said.”
Written by Cecilia D’Anastasio and Dhruv Mehrotra on Kotaku.
“Ubiquitous computing is still a fantasy, but not because the technology isn’t ready. It is. The fantasy is that any system mediating someone’s personal experience of the physical world that uses a modern corporation’s digital infrastructure would be objective or neutral. Humans are data and data is money, and this is the business model of many of the technology firms up to the task of ubiquitous computing.”
Written by April Glaser on Slate.
“Yes, all markets require a level of privacy in order to operate. You can’t know the political leaning of everyone you buy a sandwich from. Vendors can decide what they do or don’t want to disclose or ask of their customers. But when they do know, they have no obligation to proceed with that business. Activists and tech critics sometimes use the word complicit when talking about companies that look the other way when their inventions are causing harm. Assistive might be more accurate. Providing database and web services—even just email—to a cruel immigration regime assists in the cruelty.”
“These companies can do what they want with the software they sell. But they should stop pretending that what they sell is neutral.”