A reading list of articles and other links I use to inform my work at Small Technology Foundation, posted every weekday. Continued from the Ind.ie Radar, and Ind.ie’s Weekly Roundups. Subscribe to the Laura’s Lens RSS feed.
Chinese Hacking Is Alarming. So Are Data Brokers.
Written by Charlie Warzel on New York Times.
“Using the personal data of millions of Americans against their will is certainly alarming. But what’s the difference between the Chinese government stealing all that information and a data broker amassing it legally without user consent and selling it on the open market?”
Mental health websites don’t have to sell your data. Most still do.
Written by Privacy International on Privacy International.
“In other words, whenever you visit a number of websites dedicated to mental health to read about depression or take a test, dozens of third-parties may receive this information and bid money to show you a targeted ad. Interestingly, some of these websites seem to include marketing trackers without displaying any ads, meaning they simply allow data collection on their site, which in turn may be used for advanced profiling of their users.
It is highly disturbing that we still have to say this, but websites dealing with such sensitive topics should not track their users for marketing purposes. Your mental health is not and should never be for sale.”
I went to add this article to the lens, then saw it goes on to recommend our tracker blocker, Better Blocker. Kismet!
Why you can’t escape dark patterns
Written by Lilly Smith on Fast Company.
“[N]ew research suggests that only 11% of major sites are designing these so-called consent notices to meet the minimum requirements set by law.”
“So are design patterns that prevent the user from making an easy and clear privacy decision examples of simply poor design, or are these design patterns intentionally nudging users to share data?” “It has to be intentional because anyone who’s actually read the GDPR in an honest way would know that it’s not right,” says [David] Carroll. “Both the design and the functionality of them are very manipulative in favor of the first- and third-party collectors where possible.”
“It’s a design problem,” Carroll says, “but it’s a business model problem first and foremost.”
Related: as Tatiana Mac points out, we need to stop using “dark pattern”. For the practices in this particular article, I prefer “deceptive pattern”, “malicious pattern” or “anti-consent pattern” 🙃
How Algorithmic Bias Hurts People With Disabilities
Written by Alexandra Reeve Givens on Slate.
“In hiring, for example, new algorithm-driven tools will identify characteristics shared by a company’s “successful” existing employees, then look for those traits when they evaluate new hires. But as the model treats underrepresented traits as undesired traits to receive less weighting, people with disabilities—like other marginalized groups—risk being excluded as a matter of course.
While some have called to fix this data problem by collecting more detailed information about job candidates’ disabilities, further collection raises its own distinct and very real concerns about privacy and discrimination.
These problems exist for others, too: people who have marginalized sexual orientations or nonbinary gender identities, those who fall outside U.S. definitions of race and ethnicity, and for people who are members of multiple, intersecting marginalized communities.”
Teens have figured out how to mess with Instagram’s tracking algorithm
Written by Alfred Ng on CNET.
“These teenagers are relying on a sophisticated network of trusted Instagram users to post content from multiple different devices, from multiple different locations.
Teens shouldn’t have to go to those lengths to socialize privately on Instagram, said Liz O’Sullivan, technology director at the Surveillance Technology Oversight Project. … ‘I love that the younger generation is thinking along these lines, but it bothers me when we have to come up with these strategies to avoid being tracked,’ O’Sullivan said. ‘She shouldn’t have to have these psyop [psychological operations] networks with multiple people working to hide her identity from Instagram.’”
Researchers Find ‘Anonymized’ Data Is Even Less Anonymous Than We Thought
Written by Karl Bode on Motherboard.
“They told Motherboard their tool analyzed thousands of datasets from data scandals ranging from the 2015 hack of Experian, to the hacks and breaches that have plagued services from MyHeritage to porn websites. Despite many of these datasets containing “anonymized” data, the students say that identifying actual users wasn’t all that difficult.
For example, while one company might only store usernames, passwords, email addresses, and other basic account information, another company may have stored information on your browsing or location data. Independently they may not identify you, but collectively they reveal numerous intimate details even your closest friends and family may not know.
The problem is compounded by the fact that the United States still doesn’t have even a basic privacy law for the internet era, thanks in part to relentless lobbying from a cross-industry coalition of corporations eager to keep this profitable status quo intact. As a result, penalties for data breaches and lax security are often too pathetic to drive meaningful change.”
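The “individually harmless, collectively revealing” point above is the core of linkage re-identification: records that carry no names can still be matched across datasets on shared quasi-identifiers. A minimal sketch, using entirely hypothetical data and field names (nothing here comes from the datasets the researchers analysed):

```python
# A toy illustration of linkage re-identification: two "anonymized"
# datasets, each seemingly harmless alone, joined on shared
# quasi-identifiers. All records and field names are invented.

# Dataset A: a breached account dump (usernames, no real names).
accounts = [
    {"username": "sk8r_jo", "zip": "94110", "birth_year": 1991},
    {"username": "quietfan", "zip": "10002", "birth_year": 1978},
]

# Dataset B: a data-broker browsing log keyed only by an "anonymous" ad ID.
browsing = [
    {"ad_id": "x7f3", "zip": "94110", "birth_year": 1991,
     "sites": ["mentalhealth.example", "clinic.example"]},
    {"ad_id": "p9k1", "zip": "60614", "birth_year": 2001,
     "sites": ["news.example"]},
]

def link(a_rows, b_rows, keys=("zip", "birth_year")):
    """Merge records from both datasets that share the same quasi-identifiers."""
    index = {tuple(r[k] for k in keys): r for r in b_rows}
    matches = []
    for row in a_rows:
        hit = index.get(tuple(row[k] for k in keys))
        if hit:
            matches.append({**row, **hit})  # username now tied to browsing history
    return matches

for m in link(accounts, browsing):
    print(m["username"], "→", m["sites"])
# → sk8r_jo → ['mentalhealth.example', 'clinic.example']
```

With only two fields in common, the “anonymous” browsing record is pinned to a username; real attacks use richer quasi-identifiers (and fuzzier matching), which is why the researchers found de-anonymization “wasn’t all that difficult.”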
I’m a trans woman. Google Photos doesn’t know how to categorize me
Written by Cara Esten Hustle on Fast Company.
“The same data set that could be used to build a system to prevent showing trans folks photos from before they started transition could be trivially used and weaponized by an authoritarian state to identify trans people from street cameras,” [Penelope] Phippen says.
With this dystopian future in mind, coupled with the fact that federal agencies like ICE already use facial recognition technology for immigration enforcement, do we even want machine learning to piece together a coherent identity from both pre- and post-transition images?
With trans people facing daily harassment simply for existing as ourselves, the stakes seem too high to risk teaching these systems how to recognize us.”
This made me think of Tatiana Mac’s brilliant ‘The Banal Binary’ talk at New Adventures conference two weeks ago.
Leaked Documents Expose the Secretive Market for Your Web Browsing Data
Written by Joseph Cox on Motherboard.
“The data obtained by Motherboard and PCMag includes Google searches, lookups of locations and GPS coordinates on Google Maps, people visiting companies’ LinkedIn pages, particular YouTube videos, and people visiting porn websites. It is possible to determine from the collected data what date and time the anonymized user visited YouPorn and PornHub, and in some cases what search term they entered into the porn site and which specific video they watched.”
LK: I read all claims of “anonymised”/“can’t be de-anonymised” with skepticism.
Tinder's New Panic Button Is Sharing Your Data With Ad-Tech Companies
Written by Shoshana Wodinsky on Gizmodo.
“The kinds of people that are gonna be coerced into downloading [the safety app] are exactly the kind of people that are put most at risk by the data that they’re sharing…”
You Are Now Remotely Controlled
Written by Shoshana Zuboff on New York Times.
“All of these delusions rest on the most treacherous hallucination of them all: the belief that privacy is private. We have imagined that we can choose our degree of privacy with an individual calculation in which a bit of personal information is traded for valued services — a reasonable quid pro quo.
The lesson is that privacy is public — it is a collective good that is logically and morally inseparable from the values of human autonomy and self-determination upon which privacy depends and without which a democratic society is unimaginable.”