harmless design decision + obvious thing to do by default + sensible trade-off that minimizes support burden for dev teams + obvious thing to do = CVSS 10
There's been a lot of "excitement" from business types about using LLMs to find vulnerabilities in source code.
Naturally, the goal is to cheapen labor and extract more value for themselves. But I do suspect they're going to fall on their own sword with this one.
Not every vulnerability is obvious. Something can look fine and be the Achilles' heel that wrecks your shit. Something can look incredibly dangerous but end up being a NOP.
If people think their stochastic parrot can do a better job than humans, point them at OpenSSL or libgcrypt, and then have someone with relevant experience interrogate the "findings" before you waste the developers' time. I can guarantee almost everything they complain about will be a false positive under even the shallowest scrutiny.
Attackers don't always choose the same targets that you think are highest value.
Why attack your hardened authentication gateway when the marketing team has an unpatched WordPress 3.1 blog sitting right there talking to your production MySQL database?
On the topic of "political problems", sometimes you have attack vectors that nobody in the C-Suite considers an attack vector, like the FBI demanding the BitLocker disk encryption keys for Windows users.
What if someone demands their data be stricken from the transparency log, as is their right under the EU's GDPR?
What if someone legitimately loses access to all their secret keys (catastrophic hardware failure) and wants their instance admin to be able to restore their ability to use E2EE?
When I say "annoying problems", I do NOT mean the people that would experience them are annoying.
I mean they are annoying because they are fundamentally incompatible with the simplest possible solution.
Anyway, I'm going to log off fedi and get back to group therapy for abused programming languages work.
If this pops off while I'm afk:
Hi, I'm Soatok, a gay furry cryptography nerd. I blog at https://soatok.blog and once led a charge to fund a library because the local mayor is a bigot and wanted to illegally withhold their funding.
Power users (esp. the kinds of people governments would target, such as journalists, activists, and whistleblowers) can become immune to BurnDown. But if they lose their keys, they're SOL.
Everyone else can recover access by having their instance admin issue a BurnDown and starting over with a fresh keypair.
Passing the Mud Puddle Test was important to me, but being usable by real people who aren't 100% perfectly disciplined all the time is even more important.
With respect to the @dymaxion toot I quote frequently:
There are myriad political problems surrounding the development and adoption of cryptography tooling.
Is this the right balance for everyone?
Probably not. I anticipate someone will write an E2EE client someday that forces users to be Fireproof with PKDs (and refuses to chat with anyone that isn't), even to their own detriment, and some folks on Hacker News will cargo-cult that as the only secure client to use for E2EE on Fedi. And then I will have a headache to deal with.
I also anticipate some governments considering using GDPR-like takedown demands to cover up their own crimes against their citizens. (Mitigating that might require operators having the sensibility to back up the keys they're shredding in their online service and have their lawyers hold onto it.)
But at the end of the day, what I'm building are merely tools, not panaceas.
When I was doing the security standards for a large company subject to thousands upon thousands of laws and regulations, quite literally 84% of my time was spent on writing justifications, explaining mitigations, and sitting in meetings with deeply non-technical people.
5% was actually implementing technical measures.
Less than 30 minutes of it was addressing actual technical security issues.
@soatok > cheapen labor and extract more value for themselves
Which is still why I argue we need liability for software bugs that is prosecuted ex officio; it can't be that in 2026 companies are still hiding behind the "software just does that *shrug*" excuse for bugs and security issues.
@soatok this is why I refuse to use the platform passkeys on Android.
If you don't let me disable sync, the security guarantees are gone. Encrypting the passkey with my screen lock pattern is laughable, since the only reason that pattern is adequate for unlocking the phone itself is that the TPM rate-limits unlock attempts.
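The rate-limiting point can be made concrete with a back-of-envelope calculation (the guess-rate figures below are illustrative assumptions, not measured numbers; the pattern count is the known total of valid 3x3 Android unlock patterns):

```python
# Why a screen-lock pattern is only "adequate" when unlock attempts
# are rate-limited by trusted hardware.
PATTERNS = 389_112  # valid 3x3 Android unlock patterns (lengths 4-9)

def worst_case_seconds(guesses_per_second: float) -> float:
    """Time to exhaust the whole pattern space at a given guess rate."""
    return PATTERNS / guesses_per_second

# Offline attack on an extracted, pattern-encrypted key blob:
# assume a modest 1e9 key derivations per second on a GPU.
offline = worst_case_seconds(1e9)       # well under a millisecond

# Hardware-throttled attack: assume one attempt per 30 seconds
# once the rate limiter kicks in.
throttled = worst_case_seconds(1 / 30)  # on the order of months
```

So the exact same pattern goes from "months of physical access" to "instant" the moment the key material can be attacked outside the rate-limited hardware, which is what syncing risks.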
@lunareclipse @soatok During the development of this, we lobbied Google and Apple to allow the user to choose whether keys should be synced or device-only. Both flat-out denied this and said "why would you want device only?!" despite us listing a lot of possible cases, e.g. domestic violence. And so now here we are 🙃