A reading list of articles and other links I use to inform my work at Small Technology Foundation, posted every weekday. Continued from the Ind.ie Radar, and Ind.ie’s Weekly Roundups. Subscribe to the Laura’s Lens RSS feed.
Facebook didn’t mark ads as ads for blind people for almost 2 years
Written by Jeremy B. Merrill on Quartz.
“Americans with disabilities should not be an afterthought for tech companies. There is no justification for forcing them to spend extra time and effort to navigate past online ads,” said Wyden, a Democrat from Oregon. “And they should be able to easily learn why they were targeted by those ads, just like everyone else.”
Not including legible labels on ads “certainly violates the spirit if not the letter of the ADA [Americans with Disabilities Act] and raises questions about whether Facebook is engaging in deceptive practices under the FTC Act,” said Blake Reid, a law professor at the University of Colorado Boulder who studies accessibility and technology law.
A Letter from the President (at The Markup)
Written by Nabiha Syed on The Markup.
“You also deserve to hear these facts from an independent source. We want to investigate the ecosystem of data exploitation, and we don’t think we can do that while shackled to it. And so we make a privacy promise to you, our readers: We will not track you. Unlike many companies, we put your privacy first. We collect the minimum amount of data possible when you visit our site, and we will never monetize this data. We won’t display advertisements on our site, because they too often contain tracking technology. This makes our work more complicated and more expensive—but your privacy is worth it.”
A media organisation that’s leading on privacy. This is SO COOL.
Chrome is ditching third-party cookies because Google wants your data all to itself
Written by Maya Shwayder on Digital Trends.
“They’re not really changing underlying tactics [of how they track us], they’re just channeling it all through Google,” [Elizabeth] Renieris told Digital Trends.
“At least we knew how cookies worked. Instead, Google will shore up its surveillance power with even less oversight and accountability, black-boxed behind its proprietary technology. Not good news at all.” [— Christopher Chan]
This means Google will now have full, functional, filled-out profiles of every single movement and purchase that every one of its billions of users makes across the internet.
Google’s decision to shift control of UK user data to the US looks like a calculated political bet that Brexit will be a privacy disaster
Written by Isobel Asher Hamilton on Business Insider.
“UK users remain protected by Europe’s strict privacy rules for now, even if their data is legally controlled by a US entity. It does, however, raise the specter of reduced privacy in future if a post-Brexit UK alters its laws to become less privacy-oriented.”
“That political context is critical to understanding Google’s decision. This is not the action of a company which believes the UK will secure an adequacy agreement or intends to continue aligning itself with the European data protection framework and its user rights. They are moving fast on that belief, and it’s safe to say they are not engaging in this work out of a concern for UK citizens’ human rights,” said [Heather] Burns.
“Google mentioning law enforcement at all in the Reuters announcement was a bit of a red herring, in other words, to distract from the everyday user data at stake,” Burns added.
Every tech policy article needs Heather Burns doing bullshit detection.
Chinese Hacking Is Alarming. So Are Data Brokers.
Written by Charlie Warzel on The New York Times.
“Using the personal data of millions of Americans against their will is certainly alarming. But what’s the difference between the Chinese government stealing all that information and a data broker amassing it legally without user consent and selling it on the open market?”
Mental health websites don’t have to sell your data. Most still do.
Written by Privacy International on Privacy International.
“In other words, whenever you visit a number of websites dedicated to mental health to read about depression or take a test, dozens of third-parties may receive this information and bid money to show you a targeted ad. Interestingly, some of these websites seem to include marketing trackers without displaying any ads, meaning they simply allow data collection on their site, which in turn may be used for advanced profiling of their users.
It is highly disturbing that we still have to say this, but websites dealing with such sensitive topics should not track their users for marketing purposes. Your mental health is not and should never be for sale.”
I went to add this article to the lens, then saw it goes on to recommend our tracker blocker, Better Blocker. Kismet!
Why you can’t escape dark patterns
Written by Lilly Smith on Fast Company.
“[N]ew research suggests that only 11% of major sites are designing these so-called consent notices to meet the minimum requirements set by law.”
“So are design patterns that prevent the user from making an easy and clear privacy decision examples of simply poor design, or are these design patterns intentionally nudging users to share data?” “It has to be intentional because anyone who’s actually read the GDPR in an honest way would know that it’s not right,” says [David] Carroll. “Both the design and the functionality of them are very manipulative in favor of the first- and third-party collectors where possible.”
“It’s a design problem,” Carroll says, “but it’s a business model problem first and foremost.”
Related: as Tatiana Mac points out, we need to stop using “dark pattern”. For the practices in this particular article, I prefer “deceptive pattern”, “malicious pattern” or “anti-consent pattern” 🙃
How Algorithmic Bias Hurts People With Disabilities
Written by Alexandra Reeve Givens on Slate.
“In hiring, for example, new algorithm-driven tools will identify characteristics shared by a company’s “successful” existing employees, then look for those traits when they evaluate new hires. But as the model treats underrepresented traits as undesired traits to receive less weighting, people with disabilities—like other marginalized groups—risk being excluded as a matter of course.
While some have called to fix this data problem by collecting more detailed information about job candidates’ disabilities, further collection raises its own distinct and very real concerns about privacy and discrimination.
These problems exist for others, too: people who have marginalized sexual orientations or nonbinary gender identities, those who fall outside U.S. definitions of race and ethnicity, and for people who are members of multiple, intersecting marginalized communities.”
Teens have figured out how to mess with Instagram’s tracking algorithm
Written by Alfred Ng on CNET.
“These teenagers are relying on a sophisticated network of trusted Instagram users to post content from multiple different devices, from multiple different locations.
Teens shouldn’t have to go to those lengths to socialize privately on Instagram, said Liz O’Sullivan, technology director at the Surveillance Technology Oversight Project. … ‘I love that the younger generation is thinking along these lines, but it bothers me when we have to come up with these strategies to avoid being tracked,’ O’Sullivan said. ‘She shouldn’t have to have these psyop [psychological operations] networks with multiple people working to hide her identity from Instagram.’”
Researchers Find ‘Anonymized’ Data Is Even Less Anonymous Than We Thought
Written by Karl Bode on Motherboard.
“They told Motherboard their tool analyzed thousands of datasets from data scandals ranging from the 2015 hack of Experian, to the hacks and breaches that have plagued services from MyHeritage to porn websites. Despite many of these datasets containing “anonymized” data, the students say that identifying actual users wasn’t all that difficult.
For example, while one company might only store usernames, passwords, email addresses, and other basic account information, another company may have stored information on your browsing or location data. Independently they may not identify you, but collectively they reveal numerous intimate details even your closest friends and family may not know.
The problem is compounded by the fact that the United States still doesn’t have even a basic privacy law for the internet era, thanks in part to relentless lobbying from a cross-industry coalition of corporations eager to keep this profitable status quo intact. As a result, penalties for data breaches and lax security are often too pathetic to drive meaningful change.”