Tag: Google
-
Google’s Quest to Kill the Cookie Is Creating a Privacy Shitshow
Written by Shoshana Wodinsky on Gizmodo.
“Digiday reported this week that some major players in the adtech industry have started drawing up plans to turn FLoC into something just as invasive as the cookies it’s supposed to quash. In some cases, this means companies amalgamating any data scraps they can get from Google with their own catalogs of user info, turning FLoC from an ‘anonymous’ identifier into just another piece of personal data for shady companies to compile. Others have begun pitching FLoC as a great tool for fingerprinting—an especially underhanded tracking technique that can keep pinpointing you no matter how many times you go incognito or flush your cache.
…
[W]hat if that guy regularly visits websites centered around queer or trans topics? What if he’s trying to get access to food stamps online? This kind of web browsing—just like all web browsing—gets slurped into FLoC’s algorithm, potentially tipping off countless obscure adtech operators about a person’s sexuality or financial situation. And because the world of data sharing is still a (mostly) lawless wasteland in spite of lawmakers’ best intentions, there’s not much stopping a DSP from passing off that data to the highest bidder.”
Read ‘Google’s Quest to Kill the Cookie Is Creating a Privacy Shitshow’ on the Gizmodo site.
-
What Really Happened When Google Ousted Timnit Gebru
Written by Tom Simonite on Wired.
A very long read, but a fascinating insight into how ethics research works (or rather, doesn’t) inside Google, which I imagine can be extrapolated to other corporations.
“Gebru’s career mirrored the rapid rise of AI fairness research, and also some of its paradoxes. Almost as soon as the field sprang up, it quickly attracted eager support from giants like Google, which sponsored conferences, handed out grants, and hired the domain’s most prominent experts. Now Gebru’s sudden ejection made her and others wonder if this research, in its domesticated form, had always been doomed to a short leash. To researchers, it sent a dangerous message: AI is largely unregulated and only getting more powerful and ubiquitous, and insiders who are forthright in studying its social harms do so at the risk of exile.
…
To some, the drama at Google suggested that researchers on corporate payrolls should be subject to different rules than those from institutions not seeking to profit from AI. In April, some founding editors of a new journal of AI ethics published a paper calling for industry researchers to disclose who vetted their work and how, and for whistle-blowing mechanisms to be set up inside corporate labs. ‘We had been trying to poke on this issue already, but when Timnit got fired it catapulted into a more mainstream conversation,’ says Savannah Thais, a researcher at Princeton on the journal’s board who contributed to the paper. ‘Now a lot more people are questioning: Is it possible to do good ethics research in a corporate AI setting?’
If that mindset takes hold, in-house ethical AI research may forever be held in suspicion—much the way industrial research on pollution is viewed by environmental scientists.
…
Inioluwa Deborah Raji, whom Gebru escorted to Black in AI in 2017, and who now works as a fellow at the Mozilla Foundation, says that Google’s treatment of its own researchers demands a permanent shift in perceptions. ‘There was this hope that some level of self-regulation could have happened at these tech companies,’ Raji says. ‘Everyone’s now aware that the true accountability needs to come from the outside—if you’re on the inside, there’s a limit to how much you can protect people.’
…
[Gebru]’s been thinking back to conversations she’d had with a friend who warned her not to join Google, saying it was harmful to women and impossible to change. Gebru had disagreed, claiming she could nudge things, just a little, toward a more beneficial path. ‘I kept on arguing with her,’ Gebru says. Now, she says, she concedes the point.”
Read ‘What Really Happened When Google Ousted Timnit Gebru’ on the Wired site.
-
Stop Letting Google Get Away With It
Written by Shoshana Wodinsky on Gizmodo.
“Like the majority of Google’s privacy pushes that we’ve seen until now, the FLoC proposal isn’t as user-friendly as you might think. For one thing, others have already pointed out that this proposal doesn’t necessarily stop people from being tracked across the web, it just ensures that Google’s the only one doing it.”
Read ‘Stop Letting Google Get Away With It’ on the Gizmodo site.
-
Google’s Top Search Result? Surprise! It’s Google
Written by Adrianne Jeffries and Leon Yin on The Markup.
“We examined more than 15,000 recent popular queries and found that Google devoted 41 percent of the first page of search results on mobile devices to its own properties and what it calls ‘direct answers,’ which are populated with information copied from other sources, sometimes without their knowledge or consent.”
…
“Cummings, of SpanishDict.com, said something similar. ‘Google delivers the traffic for the whole internet. Unless your name is Facebook, you rely on Google,’ he said. ‘It’s very risky to speak out at Google because you don’t know what type of retaliation you’ll face.’”
Read ‘Google’s Top Search Result? Surprise! It’s Google’ on The Markup site.
-
OK Google, Black History Month Is Over. What Now?
Written by Gabrielle Rejouis on OneZero.
“Despite the benefits Google has received from the Black community, the company has refused to or has been slow to correct the discriminatory algorithmic practices at YouTube, such as its language filter, ads, and its search algorithms. Whether intentional or unconscious, all of these biases have harmed the Black community. For some people, Google is the internet. Civil rights considerations must be central to big data and the platforms they drive. Google should not celebrate the contributions of Black people without also making their platforms welcoming to them.
…
Technology will not be the silver bullet solving the problem of content moderation. Neither will sensitivity training nor diverse hiring. Dismantling these structures will require racial literacy and more multifaceted changes.
…
To say the internet has a huge impact on our society is an understatement. And the data and privacy missteps committed by Big Tech disproportionately affect historically marginalized communities.”
Read ‘OK Google, Black History Month Is Over. What Now?’ on the OneZero site.
Tagged with: Google, discrimination, algorithm.
-
Chrome is ditching third-party cookies because Google wants your data all to itself
Written by Maya Shwayder on Digital Trends.
“They’re not really changing underlying tactics [of how they track us], they’re just channeling it all through Google,” [Elizabeth] Renieris told Digital Trends.
…
“At least we knew how cookies worked. Instead, Google will shore up its surveillance power with even less oversight and accountability, black-boxed behind its proprietary technology. Not good news at all.” [Christopher Chan]
…
This means Google will now have fully functional, filled-out profiles of every single movement and purchase that every one of its billions of users makes across the internet.
-
Google’s decision to shift control of UK user data to the US looks like a calculated political bet that Brexit will be a privacy disaster
Written by Isobel Asher Hamilton on Business Insider.
“UK users remain protected by Europe’s strict privacy rules for now, even if their data is legally controlled by a US entity. It does, however, raise the specter of reduced privacy in future if a post-Brexit UK alters its laws to become less privacy-oriented.”
…
“That political context is critical to understanding Google’s decision. This is not the action of a company which believes the UK will secure an adequacy agreement or intends to continue aligning itself with the European data protection framework and its user rights. They are moving fast on that belief, and it’s safe to say they are not engaging in this work out of a concern for UK citizens’ human rights,” said [Heather] Burns.
…
“Google mentioning law enforcement at all in the Reuters announcement was a bit of a red herring, in other words, to distract from the everyday user data at stake,” Burns added.
Every tech policy article needs Heather Burns doing bullshit detection.
-
Google’s Acquisition of Fitbit Has Implications for Health and Fitness Data
Written by Nicole Lindsey on CPO Magazine.
“Even if the Silicon Valley tech giant doesn’t plan to use that health and fitness data to show you ads, you can rest assured that Google has plenty of other uses for that data.”
-
Facebook and Google’s pervasive surveillance poses an unprecedented danger to human rights
Written by Amnesty International/Kumi Naidoo on Amnesty International.
“Surveillance Giants lays out how the surveillance-based business model of Facebook and Google is inherently incompatible with the right to privacy and poses a systemic threat to a range of other rights including freedom of opinion and expression, freedom of thought, and the right to equality and non-discrimination.
…
The tech giants offer these services to billions without charging users a fee. Instead, individuals pay for the services with their intimate personal data, being constantly tracked across the web and in the physical world as well, for example, through connected devices.
…
The technology behind the internet is not incompatible with our rights, but the business model Facebook and Google have chosen is.”
Tagged with: Facebook, Google, human rights.
-
Tracking Phones, Google Is a Dragnet for the Police
Written by Jennifer Valentino-DeVries on NYTimes.
“Technology companies have for years responded to court orders for specific users’ information. The new warrants go further, suggesting possible suspects and witnesses in the absence of other clues. Often, Google employees said, the company responds to a single warrant with location information on dozens or hundreds of devices.
…
The technique illustrates a phenomenon privacy advocates have long referred to as the ‘if you build it, they will come’ principle — anytime a technology company creates a system that could be used in surveillance, law enforcement inevitably comes knocking. Sensorvault, according to Google employees, includes detailed location records involving at least hundreds of millions of devices worldwide and dating back nearly a decade.”
Read ‘Tracking Phones, Google Is a Dragnet for the Police’ on the NYTimes site.
Tagged with: Google, surveillance, dragnet.