And the posts, they keep on coming. I 100% agree with @filippo here: the question is not whether we're certain that a quantum computer will exist by 2029, it's whether we're certain that one won't. And things have progressed far enough that non-physicists, or even physicists working in different subfields, can no longer reliably tell what's going on.
Last time I had a 10+ hour flight, Opal nerd sniped me into figuring out how to break ML-DSA keys that had been improperly encrypted with a reused IV. (To be perfectly clear, this is not an issue with ML-DSA, but with reused IVs. Nothing is secure in that case, but some things are insecure in interesting ways.)
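The post doesn't spell out the technique, but the classic IV-reuse failure for stream ciphers is easy to sketch: two messages encrypted under the same key and IV get XORed against the same keystream, so the XOR of the ciphertexts equals the XOR of the plaintexts. The keystream and plaintexts below are made up purely for illustration:

```python
import hashlib

def keystream(key: bytes, iv: bytes, n: int) -> bytes:
    # Toy hash-counter keystream, standing in for a real stream cipher /
    # CTR mode; only the XOR algebra matters for this demonstration.
    out = b""
    ctr = 0
    while len(out) < n:
        out += hashlib.sha256(key + iv + ctr.to_bytes(4, "big")).digest()
        ctr += 1
    return out[:n]

def encrypt(key: bytes, iv: bytes, pt: bytes) -> bytes:
    return bytes(p ^ k for p, k in zip(pt, keystream(key, iv, len(pt))))

# Two messages encrypted under the SAME key and the SAME (reused) IV:
p1 = b"secret ML-DSA seed bytes"   # hypothetical plaintexts, same length
p2 = b"another secret plaintext"
c1 = encrypt(b"key", b"reused-iv", p1)
c2 = encrypt(b"key", b"reused-iv", p2)

# The keystream cancels out: c1 XOR c2 == p1 XOR p2, so knowing either
# plaintext reveals the other without ever touching the key.
xor = bytes(a ^ b for a, b in zip(c1, c2))
assert xor == bytes(a ^ b for a, b in zip(p1, p2))
```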
So of course @filippo, being present when I disclosed that vulnerability, chose to immediately exploit it by nerd sniping me into providing additional test vectors for ML-DSA for this flight.
@soatok @whitequark yeah, this is the rare compiler W for constant-time programming, one that has saved some Kyber implementations. Since multiplication and shifts are so much cheaper than integer division, this is more or less the standard behavior when the compiler knows the divisor. But of course, you can't rely on it. And in theory, the compiler is allowed to take your manual Barrett code and replace it with idiv as well, if it sees fit.
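For the curious, here's a sketch (in Python, for readability) of the multiply-and-shift trick a compiler emits for division by a known constant, using the Kyber/ML-KEM modulus q = 3329. The shift width and magic multiplier below are one standard choice, not the only one:

```python
# Division by the constant q = 3329 done with a multiply and a shift;
# no division instruction would need to execute at runtime, since the
# magic multiplier is precomputed from the known divisor.

Q = 3329
K = 36                   # shift width; 2**K must be large enough for the input range
M = (1 << K) // Q + 1    # ceil(2**K / Q), the precomputed magic multiplier

def div_q(x: int) -> int:
    """floor(x / Q) computed with a multiply and a shift."""
    return (x * M) >> K

def mod_q(x: int) -> int:
    """x mod Q via the same trick (this is the shape of Barrett reduction)."""
    return x - div_q(x) * Q

# Exhaustive check over all 16-bit inputs, the range Kyber coefficients use.
assert all(div_q(x) == x // Q for x in range(1 << 16))
```

The point of the post stands, though: whether your handwritten version of this survives compilation as written is up to the compiler.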
@wordshaper @weekend_editor @Green_Footballs @cstross starting with the fact that being resistant to a specific disease does not necessarily produce any other positive side effects, and in fact is more likely to negatively impact fitness when the disease is not a threat. See for example sickle cell anemia and malaria.
@soatok independent of that logic error, looking at the code it also has a fundamentally flawed design that assumes that signatures can be verified via an equality check. It also trusts the token with algorithm selection and has a timing side channel.
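As an aside, the timing-side-channel half of that is a well-known pattern. A minimal stdlib-only illustration (with an HMAC tag and a hypothetical key, not the code from the post) of why `==` is the wrong comparison for authenticators, and why recompute-and-compare doesn't even apply to public-key signatures:

```python
import hmac, hashlib

KEY = b"demo-key"  # hypothetical key, for illustration only

def tag(msg: bytes) -> bytes:
    return hmac.new(KEY, msg, hashlib.sha256).digest()

# Broken: `==` can short-circuit at the first differing byte, so the
# comparison time leaks how long a matching prefix the attacker guessed.
def verify_broken(msg: bytes, candidate: bytes) -> bool:
    return tag(msg) == candidate

# Constant-time comparison removes that timing side channel.
def verify_ct(msg: bytes, candidate: bytes) -> bool:
    return hmac.compare_digest(tag(msg), candidate)

# Note: even the constant-time version only makes sense for MACs.
# Public-key signatures can't be checked by recompute-and-compare at all
# (many schemes are randomized); you must call the scheme's verify function.
```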
Me: if I was an attacker and had a quantum computer right now, CA root certs would certainly be my first target.
Colleague: come on, no Bitcoin for me?
Me: fine, after I stole a bunch of Bitcoin and distributed them among the people in this video call, CA root certs would be my next target.
@paul_ipv6 @inthehands honestly, I've been wondering that for a while. If there is a masked, unidentified person abducting people in broad daylight, isn't it supposed to be the police's job to stop them? I mean, it could be anyone; without a badge we can't know for sure, after all.
(And I know, expecting the police to actually do their job instead of committing crimes themselves is a tall order, but still)
@neverpanic @soatok tbf, the PQC specs came out about a year ago, and FIPS validation takes about a year. We'll see a lot more FIPS-validated implementations across the board in 2026.
When it comes to crypto agility, have you tried Tink (https://developers.google.com/tink)? It is IMHO the far superior way to solve this issue (full disclosure: it is developed by my team, so I'm extremely biased).
@simo5 @neverpanic @soatok this is one of my pet peeve rants: we have signature formats. (Several of them, even.) Cryptographic standards define functions that map a collection of byte strings to some other byte strings, and files can store byte strings. What we are somewhat lacking is fully specified public key formats, but even there we have some (Tink defines its own and can read/write many of the existing formats). The signature should just be the byte string output by the signing algorithm. It's the public key that needs to carry the information for verifying the signature.
So if I give you a public key (including a definition of the full algorithm used, all the hash functions and security parameters etc), then you can verify a signature.
If you want crypto agility, then the thing you need is support for key sets, i.e. multiple, equally trusted keys. That allows you to add, promote, and delete keys in a distributed environment. You have two options for the signature format in this case (and Tink supports both): either you keep the unmodified signature, try all public keys, and call the signature verified if it verifies under one of them (great for compatibility), or you put a short, otherwise meaningless identifier in front of the signature, which lets you jump directly to the right public key. Better performance, but not compatible with libraries that don't support key sets in the same way.
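A stdlib-only sketch of both options. HMAC stands in for the signature primitive here to keep it self-contained, and the 4-byte key ids are made up; Tink's actual wire format differs:

```python
import hmac, hashlib, os

def sign(key: bytes, msg: bytes) -> bytes:
    # HMAC stands in for a real signature scheme in this sketch.
    return hmac.new(key, msg, hashlib.sha256).digest()

def verify(key: bytes, msg: bytes, sig: bytes) -> bool:
    return hmac.compare_digest(sign(key, msg), sig)

# A key set: several equally trusted keys, each with a short identifier.
keyset = {bytes([0, 0, 0, i]): os.urandom(32) for i in (1, 2, 3)}

# Option 1: unmodified signature; try every key. Compatible with
# verifiers that know nothing about key sets.
def verify_try_all(msg: bytes, sig: bytes) -> bool:
    return any(verify(k, msg, sig) for k in keyset.values())

# Option 2: prefix a short, meaningless key id so verification can jump
# straight to the right key. Faster, but the format is key-set-aware.
def sign_with_id(key_id: bytes, msg: bytes) -> bytes:
    return key_id + sign(keyset[key_id], msg)

def verify_with_id(msg: bytes, tagged_sig: bytes) -> bool:
    key_id, sig = tagged_sig[:4], tagged_sig[4:]
    key = keyset.get(key_id)
    return key is not None and verify(key, msg, sig)
```

Key rotation then becomes: add the new key to the set, start signing with it once it has propagated, and delete the old key when nothing signed under it remains in circulation.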
In both cases, this composite algorithm retains EUF-CMA/SUF-CMA as long as all keys in the key set are trusted and have EUF-CMA/SUF-CMA.
Interestingly, pretty much all other signature formats, such as JWT (and, as far as I know, PGP), violate EUF-CMA and definitely violate SUF-CMA, so I argue (fairly strongly, given all the attacks caused by these violations) that the Tink way of supporting key sets and signatures is the correct approach.