Stop Letting Google Get Away With It
Written by Shoshana Wodinsky on Gizmodo.
“Like the majority of Google’s privacy pushes that we’ve seen until now, the FLoC proposal isn’t as user-friendly as you might think. For one thing, others have already pointed out that this proposal doesn’t necessarily stop people from being tracked across the web, it just ensures that Google’s the only one doing it.”
22 September 2020 14:22 UTC
This is really useful if you want to check your own site or even point out privacy issues with your organisation’s site to your boss.
(Devs: it’s worth reading the writeup on what’s being checked, How we built a real-time privacy inspector.)
Today the Smashing Podcast released its 13th episode, featuring me talking to Drew McLellan about privacy.
A Letter from the President (at The Markup)
Written by Nabiha Syed on The Markup.
“You also deserve to hear these facts from an independent source. We want to investigate the ecosystem of data exploitation, and we don’t think we can do that while shackled to it. And so we make a privacy promise to you, our readers: We will not track you. Unlike many companies, we put your privacy first. We collect the minimum amount of data possible when you visit our site, and we will never monetize this data. We won’t display advertisements on our site, because they too often contain tracking technology. This makes our work more complicated and more expensive—but your privacy is worth it.”
A media organisation that’s leading on privacy. This is SO COOL.
Google’s decision to shift control of UK user data to the US looks like a calculated political bet that Brexit will be a privacy disaster
Written by Isobel Asher Hamilton on Business Insider.
“UK users remain protected by Europe’s strict privacy rules for now, even if their data is legally controlled by a US entity. It does, however, raise the specter of reduced privacy in future if a post-Brexit UK alters its laws to become less privacy-oriented.”
“That political context is critical to understanding Google’s decision. This is not the action of a company which believes the UK will secure an adequacy agreement or intends to continue aligning itself with the European data protection framework and its user rights. They are moving fast on that belief, and it’s safe to say they are not engaging in this work out of a concern for UK citizens’ human rights,” said [Heather] Burns.
“Google mentioning law enforcement at all in the Reuters announcement was a bit of a red herring, in other words, to distract from the everyday user data at stake,” Burns added.
Every tech policy article needs Heather Burns doing bullshit detection.
Chinese Hacking Is Alarming. So Are Data Brokers.
Written by Charlie Warzel on The New York Times.
“Using the personal data of millions of Americans against their will is certainly alarming. But what’s the difference between the Chinese government stealing all that information and a data broker amassing it legally without user consent and selling it on the open market?”
Mental health websites don’t have to sell your data. Most still do.
Written by Privacy International on Privacy International.
“In other words, whenever you visit a number of websites dedicated to mental health to read about depression or take a test, dozens of third-parties may receive this information and bid money to show you a targeted ad. Interestingly, some of these websites seem to include marketing trackers without displaying any ads, meaning they simply allow data collection on their site, which in turn may be used for advanced profiling of their users.
It is highly disturbing that we still have to say this, but websites dealing with such sensitive topics should not track their users for marketing purposes. Your mental health is not and should never be for sale.”
I went to add this article to the lens, then saw it goes on to recommend our tracker blocker, Better Blocker. Kismet!
Teens have figured out how to mess with Instagram’s tracking algorithm
Written by Alfred Ng on CNET.
“These teenagers are relying on a sophisticated network of trusted Instagram users to post content from multiple different devices, from multiple different locations.
Teens shouldn’t have to go to those lengths to socialize privately on Instagram, said Liz O’Sullivan, technology director at the Surveillance Technology Oversight Project. … ‘I love that the younger generation is thinking along these lines, but it bothers me when we have to come up with these strategies to avoid being tracked,’ O’Sullivan said. ‘She shouldn’t have to have these psyop [psychological operations] networks with multiple people working to hide her identity from Instagram.’”
Researchers Find ‘Anonymized’ Data Is Even Less Anonymous Than We Thought
Written by Karl Bode on Motherboard.
“They told Motherboard their tool analyzed thousands of datasets from data scandals ranging from the 2015 hack of Experian, to the hacks and breaches that have plagued services from MyHeritage to porn websites. Despite many of these datasets containing ‘anonymized’ data, the students say that identifying actual users wasn’t all that difficult.
For example, while one company might only store usernames, passwords, email addresses, and other basic account information, another company may have stored information on your browsing or location data. Independently they may not identify you, but collectively they reveal numerous intimate details even your closest friends and family may not know.
The problem is compounded by the fact that the United States still doesn’t have even a basic privacy law for the internet era, thanks in part to relentless lobbying from a cross-industry coalition of corporations eager to keep this profitable status quo intact. As a result, penalties for data breaches and lax security are often too pathetic to drive meaningful change.”
You Are Now Remotely Controlled
Written by Shoshana Zuboff on The New York Times.
“All of these delusions rest on the most treacherous hallucination of them all: the belief that privacy is private. We have imagined that we can choose our degree of privacy with an individual calculation in which a bit of personal information is traded for valued services — a reasonable quid pro quo.
The lesson is that privacy is public — it is a collective good that is logically and morally inseparable from the values of human autonomy and self-determination upon which privacy depends and without which a democratic society is unimaginable.”