@rysiek It seems that Cloudflare has only 6 data centers in Germany. There is a single data center in all of North Rhine-Westphalia with its 18 million people. Yes, this isn’t exactly impressive position pinpointing.
I guess somebody on the run who doesn’t want to disclose which country they are in would be concerned about this issue. Then again, they probably wouldn’t want to expose their real IP address to the Signal infrastructure in the first place.
A bunch of years ago I recommended against the use of the Session messenger (a Signal fork), but not for technical reasons. I found it concerning what kind of audience that messenger addresses. If the app is geared towards white nationalists, sexists and the like, then nobody else should help improve its image with their presence. Mind you, that was a long time ago and I don’t know whether they’ve improved.
But @soatok took apart their cryptographic approach now and… well, I better just quote him:
“run, screaming, in the other direction from Session.”
This details many of Google’s shortcomings at keeping the Chrome Web Store safe, with the conclusion: “for the end users the result is a huge (and rather dangerous) mess.”
I am explaining how Google handled (or rather, for the most part, didn’t handle) my recent reports. How they make reporting problematic extensions extremely hard and then keep reporters in the dark about the state of these reports. How Google repeatedly chose to ignore their own policies and allowed shady, spammy and sometimes outright malicious extensions to prevail.
There is some text here on the completely meaningless “Featured” badge that is more likely to be awarded to malicious extensions than to legitimate ones. And on how user reviews don’t allow informed decisions either, because Google lets even the most obvious fakes remain.
This post provides more details on BIScience Ltd., another company selling browsing data of extension users. @tuckner and I wrote a bit about that one recently, but this has apparently been going on since at least 2019. Google allows it as long as extension authors claim (not very convincingly) that this data collection is necessary for the extension’s functionality. It’s not that Google lacks policies prohibiting this; it simply chooses not to enforce them.
@thisismissem Difficult. If we spin this analogy further: you gave me your key for a specific purpose (e.g. pizza delivery while you were out), after which I returned it to you. You didn’t allow me to make a copy of this key and use it later to rearrange the furniture for example.
Abusing hardcoded credentials can definitely constitute hacking and lead to perfectly justified criminal charges. But intention and damage caused need to go into the equation, not merely “circumvention of protection mechanisms.”
German law is making security research a risky business.
Current news: A court found a developer guilty of “hacking.” His crime: he was tasked with looking into a piece of software that produced way too many log messages. And he discovered that this software was making a MySQL connection to the vendor’s database server.
When he checked that MySQL connection, he realized that the database contained data belonging to not merely his client but all of the vendor’s customers. So he immediately informed the vendor – and while they fixed this vulnerability they also pressed charges.
There was apparently considerable discussion as to whether hardcoding database credentials in the application (visible as plain text, not even decompiling required) is sufficient protection to justify hacking charges. But the court ruling says: yes, there was a password, so there is a protection mechanism which was circumvented, and that’s hacking.
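To illustrate just how flimsy this “protection mechanism” is, here is a purely hypothetical sketch of what hardcoded credentials tend to look like in practice. None of the names refer to the actual software from this case.

```typescript
// Hypothetical example only – not the application from the court case.
// Credentials baked into the code like this end up as plain strings in the
// shipped application; a simple text search is enough to read them out.
const DB_CONFIG = {
  host: "db.vendor.example",      // vendor's central database server
  user: "support_user",           // same account for every customer installation
  password: "not-actually-secret",// visible to anyone who opens the application
  database: "all_customers",
};
```

Since every installation ships with the same account, anyone who reads that password can see every customer’s data, which is exactly what the developer stumbled upon.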
I very much hope that a higher court will overturn this decision. But it’s exactly as people feared: no matter how flawed the supposed “protection,” its mere existence turns security research into criminal hacking under German law. This has a chilling effect on legitimate research, allowing companies to get away with inadequate security and in the end endangering users.
For a brief moment there I thought that Google had finally decided that event listeners added by extensions’ content scripts should not receive synthetic (untrusted) events by default. I mean, Firefox has had that for at least a decade. Quite a showstopper for exploiting extension vulnerabilities.
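For context, the defense extensions currently have to implement themselves looks roughly like the sketch below: a hypothetical content script listener that ignores events the page fabricated via dispatchEvent(), which arrive with isTrusted set to false.

```typescript
// content-script.ts – hypothetical example, not taken from any real extension.
// Page scripts can fabricate events via dispatchEvent(); those arrive with
// event.isTrusted === false. If the listener triggers privileged extension
// functionality, it has to filter them out itself.
document.addEventListener("click", (event: MouseEvent) => {
  if (!event.isTrusted) {
    // Synthetic event created by the website – don't let it drive the
    // extension's privileged code path.
    return;
  }
  // ... handle the genuine user click here ...
});
```

If the browser simply dropped untrusted events before they ever reached content script listeners, this boilerplate (and the vulnerabilities caused by forgetting it) would disappear.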
The Ubuntu debacle just shows that the quality of localization is an underappreciated problem. For most projects, there are only a handful of languages with enough contributors to catch the most glaring issues. And what do you do about the rest of them?
Even without malicious contributors, translation issues are common. There are overly literal translations, translations missing context and your regular translation mistakes. But I’ve also seen bogus automated translations being submitted way too often.
And that isn’t only an issue with open source projects that rely on volunteer contributors. Some of the worst translations I’ve seen came from translation agencies, even those promising to have translation checks in place. Presumably, they pay employees for quantity, not quality. And bad translations are rarely noticed, so there are no consequences.
Back in the day I was juggling 40+ languages, reviewing changes and attempting to recognize translation issues without speaking the language. It was a time-consuming and complicated job. I didn’t like doing it, but at least I would definitely have recognized malicious submissions like the ones Ubuntu tripped over.
Most projects barely review translations or skip reviews completely. Instead, they rely on end users to report issues, which almost never happens. Worse yet: it is very typical to allow HTML injection via translations, so malicious translations can cause real security trouble.
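To make the last point concrete, here is a minimal sketch of the vulnerable pattern, assuming a hypothetical translate() lookup standing in for whatever i18n mechanism a project actually uses.

```typescript
// Hypothetical i18n lookup – stands in for gettext, chrome.i18n.getMessage(),
// a JSON catalog or whatever mechanism the project actually uses.
declare function translate(key: string): string;

const greeting = translate("welcome_message");

// Vulnerable: a malicious translation such as
//   "Welcome! <img src=x onerror=stealCredentials()>"
// gets parsed as HTML and its script runs in the page or extension context.
document.getElementById("greeting")!.innerHTML = greeting;

// Safe: the translation is inserted as plain text, any markup stays inert.
document.getElementById("greeting")!.textContent = greeting;
```

With the first pattern, a single unreviewed translation is all it takes to turn a localization contribution into code execution.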
We are currently witnessing the fallout from monopolization in the browser space. Back in 2007, Internet Explorer received much criticism for its phishing protection mechanism which transmitted all visited websites to Microsoft servers. Mozilla paired up with Google and designed a different system which performed most checks locally and preserved users’ privacy. That’s what healthy competition looks like.
Fast forward to 2023. Almost all web browsers in use are either Chrome or based on the Chromium browser engine. With the competition pretty much eliminated, Google is now pushing its “Enhanced Safe Browsing” down everyone’s throats – which is a nice sounding name for “every website you visit is sent to our servers.” The Internet Explorer approach from 2007 all over again, only that now it’s Google getting all this data. And they certainly won’t do anything evil with it. Yeah, sure.
Reminder: Firefox and Safari are the only remaining browsers worth noting which are not using Google’s browser engine.
Many people still seem unaware of just how bad Chrome Sync is for your privacy. By default, Chrome will sync all your data – including e.g. your passwords, bookmarks, browsing history and open tabs. And by default, Chrome will not encrypt any of this data. All of it will be accessible to Google, to anyone who subpoenas Google to turn over your data, and to whoever else manages to get access to these servers.
If you want this data encrypted before it is first uploaded, you need to click “Settings” instead of confirming sync, then expand “Encryption options” and set up a sync passphrase. The default option “Encrypt synced passwords with your Google Account” is essentially a disguised “We can access all your data but we promise not to look. Don’t you trust us?”
The only positive aspect here: Chrome Sync used to be a lot worse. It used to enable automatically when you signed into Chrome. It used to encrypt only passwords and none of the other data even if you set up a passphrase. It used to warn you when setting a passphrase because Google’s web services would no longer be able to access your passwords. It used to upload data without encryption first, only allowing you to enable encryption after the fact. And its encryption used to be horribly broken. I wrote about that five years ago: https://palant.info/2018/03/13/can-chrome-sync-or-firefox-sync-be-trusted-with-sensitive-data/#chrome-sync
But even now, Chrome Sync requires you to take action in order to get privacy. Because Google knows that you won’t. Compare that to Firefox Sync, which has always encrypted all data by default. I criticized the implementation here as well, but that was really a minor issue compared to the mess which is Chrome Sync.
Edit: Removed link to a post claiming that Google is censoring synced bookmarks. This claim appears to be incorrect, the message there referring to a different Google service.