A reading list of articles and other links I use to inform my work at Small Technology Foundation, posted every weekday. Continued from the Ind.ie Radar and Ind.ie’s Weekly Roundups. Subscribe to the Laura’s Lens RSS feed.
Written by Michael Grothaus on Fast Company.
“Seventeen dollars for a smartphone sounds like a great deal, especially for people living in poverty who can barely afford rent.
But there’s a problem: low-cost smartphones are privacy nightmares.”
“The MYA2 also has apps that can’t be updated or deleted, and those apps contain multiple security and privacy flaws. One of those pre-installed apps that can’t be removed, Facebook Lite, gets default permission to track everywhere you go, upload all your contacts, and read your phone’s calendar.”
“While companies like Apple are to be lauded for prioritizing privacy protections, people around the world should not be reliant on tech giants building privacy safeguards for only a population that can afford it.”
Written by McKenzie Funk on The New York Times Magazine.
“For decades, the overriding objective of American business and government has been to remove friction from the tracking system, by linking networks, by speeding connections, by eliminating barriers. But friction is the only thing that has ever made privacy, let alone obscurity, possible. If there’s no friction, if we can all be profiled instantly and intimately, then there’s nothing to stop any of our neighbors from being targeted — nothing, that is, except our priorities.”
A long, sickening read.
Written by Natasha Lomas on TechCrunch.
“Europe’s top court has ruled that pre-checked consent boxes for dropping cookies are not legally valid.
Consent must be obtained prior to storing or accessing non-essential cookies, such as tracking cookies for targeted advertising. Consent cannot be implied or assumed.”
“Sites that have relied upon opting EU users into ad-tracking cookies in the hopes they’ll just click okay to make the cookie banner go away are in for a rude awakening.”
Written by Sarah Olson on Medium.
“Perhaps it’s easier to separate the scientist from his science when you’re not, and never will be, affected personally by misogyny and sexism.”
“Unfortunately, many women in STEM are adversely affected by misogynists and sexists when those men are highly regarded and respected within the scientific community. After all, sexual harassment isn’t really about sex — it’s about power.”
“[W]hat’s worth more, the contributions of a lone male genius who assaults and harasses and discriminates against women, or the contributions of a large scientific community unhindered by a misogynistic and unsafe environment?”
The same goes for the tech community… and for the racist and white supremacist ideologies harboured within it.
Written by Ben Tarnoff on The Guardian.
“We are often sold a similar bill of goods: big tech companies talk incessantly about how ‘AI’ and digitization will bring a better future. In the present tense, however, putting computers everywhere is bad for most people. It enables advertisers, employers and cops to exercise more control over us – in addition to helping heat the planet.”
“Training models isn’t the only way [machine learning] contributes to the cooking of our planet. It has also stimulated a hunger for data that is probably the single biggest driver of the digitization of everything. Corporations and governments now have an incentive to acquire as much data as possible, because that data, with the help of [machine learning], might yield valuable patterns. It might tell them who to fire, who to arrest, when to perform maintenance on a machine or how to promote a new product.”
Written by Elizabeth Joh on Slate.
“Neighborhoods armed with Ring videos, Flock readers, and NextDoor posts have the power to create networked engines of suspicion, sometimes ill-founded or erroneous, that may embolden residents to take actions they should not.”
Written by Privacy International on Privacy International.
“Feeling anxious? Got lucky last night? Having some health issues? Tell Maya and they’ll let Facebook and others know (oh, and they’ll share your diary too!)”
“There is a reason why advertisers are so interested in your mood; understanding when a person is in a vulnerable state of mind means you can strategically target them. Knowing when a teenager is feeling low means an advertiser might try and sell them a food supplement that is supposed to make them feel strong and focused. Understanding people’s mood is an entry point for manipulating them. And that is all the more worrying in an age when Facebook is having so much impact on our democracies, as the Cambridge Analytica scandal revealed. Indeed, it is not just advertisers that will want to know how we feel; as elections approach, political parties may want to know if we feel anxious, stressed or excited so that they can adapt their narratives accordingly.”
Written by Nicole Chi on Are We Europe.
“[E]ven if not lawless, historically, borders have been vulnerable places for human rights—particularly the right to privacy—as border guards extend government intrusion into our private lives with the authority of upholding national security. Now, data collection and artificial intelligence are threatening to turn borders into an underregulated free-for-all.”
Written by Bryan Menegus on Gizmodo.
“A lot of people think there’s an easy solution, and that the [solution is] for the platforms to ‘do something.’ Social media companies do not have a good history in this arena, and there are so many reasons not to trust these giant companies, why should we trust them to decide what speech is acceptable?”
Written by Joshua Benton on The Atlantic.
“News organizations have multiple and sometimes conflicting incentives that might affect how they present the local police blotter. A company that sells security-optimized doorbells has only one incentive: emphasizing that the world is a scary place, and you need to buy our products to protect you.”