@soviut only ~half of TLDs tho. It rejects 773 valid TLDs.
As the owner of goof@img.horse, I'm outraged.
SIGBOVIK 2025 [PDF, p. 170]: https://sigbovik.org/2025/proceedings.pdf
`ccdoom` is a standards-compliant C23 compiler with a "program-agnostic compilation model" and "advanced whole-program dead-code elimination" that always outputs doom.exe.
> ccdoom adopts a more user-centric approach to safety: the output contains significantly more monsters than the output of most C compilers, but the user is provided sufficient ammunition to defeat them.
@amszmidt Really? I did not expect them to do worse than the C64.
Anyway, my point was that CPU clock speeds used to be close to DRAM speeds. Until the mid-80s, CPUs didn't even need caches.
@artemissian LISP is pointer-heavy. When LISP and LISP machines were hot, RAM access took exactly 1 CPU cycle. Now it takes ~300 cycles. Maybe some purely-functional flavor could be made to work, but that's beyond my imagination.
The current state favors data-centric embarrassingly parallel programs with minimal pointer indirections.
What if C isn't actually portable, and it's just that non-C-compatible architectures went extinct?
I'm half joking, but:
VLIW/EPIC architectures are dead, despite CPUs desperately needing instruction-level parallelism.
Instead of SIMT we have hyperthreading at home, and bug-prone threads with context switching in software.
Instead of hierarchical memory, we waste 8 bytes on all pointers & emulate thread-local memory in software. Larrabee was DOA & SIMD barely exists. MOS6502-style stack+registers are only on GPUs.
In 2003, Apple released a 64-bit dual-core 1.8 GHz system: Power Mac G5.
In 2023, Apple released a 64-bit dual-core 1.8 GHz system: Apple Watch Series 9.
The Watch is faster and has more RAM.
The G5 was too hot to put in a laptop. It'd use up the S9's battery in under 2 minutes.
@mattly @stonebear I'm just talking about 2FA. It's perfectly reasonable to require 2FA on all accounts. It's safer to err on the side of requiring 2FA on unimportant accounts than to risk an important user having their account compromised.
That is entirely orthogonal to the funding structure. The risk and responsibility exist due to code sharing and trust structures, regardless of whether people are paid for it or not.
On Star Trek they'd require you to have 2FA too.
@stonebear @mattly To me, account security in shared environments is like hygiene. When one person's security stinks, it affects others. The real rudeness is doubling down on bad hygiene after being told that your security stinks.
Supply chain security in OSS is already a hot mess, and doesn't need even more worrying about impersonation just because someone *wants* to have poorer security to show a computer who's the boss.
@mattly I know the point – you don't think your account is important & don't want an automated check to tell you what to do.
I just think you're a crybaby about it.
GitHub accounts are used for lots of things, also outside of GH (OAuth). GH has no way of knowing how much damage a takeover of your account could do (including social engineering if you're a trusted person).
It makes sense for the entire OSS ecosystem for GH to be 2FA-only. It's already a house of cards and doesn't need weak links.
@mattly Get a YubiKey (U2F/WebAuthn). It's super convenient to use: it makes 2FA a quick tap. It's worth getting one anyway for all your accounts, as it's automatically phishing-proof. Instead of being contrarian, you can solve the problem well.
For a programming language that is definitely not a religion, this looks suspiciously like a church:
Firefox will reconsider supporting JPEG XL if they get a Rust implementation:
https://github.com/mozilla/standards-positions/pull/1064
This is very good news for web standards:
https://mastodon.social/@kornel/113078862354601952
and will fix a blocker that is hurting adoption of JPEG XL.
The reference implementation was unfortunately written in C++ just as browser vendors started migrating away from C++ for security reasons, so they saw the C++ codec primarily as a big new attack surface.
Who could have guessed that a plastic recycling method promoted by Exxon requires 9 times more new fossil fuels than the amount of plastic it manages to successfully recycle?
And the US allows creative accounting that ends up calling that 100% recycled plastic, by counting byproducts like diesel as "recycled" "plastic".
https://www.propublica.org/article/delusion-advanced-chemical-plastic-recycling-pyrolysis
@lanodan For example, it can't tell you "hey, you need a mutex here".
Rust can, during normal compilation, without even needing a separate analyzer. And not only for obvious function-local patterns, but across many levels of indirection, even through callbacks spanning 3rd-party libraries. And it's not approximating: it guarantees it won't miss a case.
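A minimal, function-local sketch of what I mean (my own example, not from any real project), assuming an ordinary `cargo build`: the compiler rejects sharing a non-thread-safe value across threads, and accepts the same code once the value sits behind a Mutex.

```rust
use std::sync::{Arc, Mutex};
use std::thread;

fn main() {
    // Rejected during normal compilation if you skip the lock, e.g.:
    //   let counter = Arc::new(std::cell::RefCell::new(0));
    //   thread::spawn(move || { *counter.borrow_mut() += 1; });
    // error[E0277]: `RefCell<i32>` cannot be shared between threads safely
    //
    // Accepted once the shared state is behind a Mutex, because
    // `Mutex<i32>` is `Sync` and the lock makes the mutation safe.
    let counter = Arc::new(Mutex::new(0));

    let handles: Vec<_> = (0..4)
        .map(|_| {
            let counter = Arc::clone(&counter);
            thread::spawn(move || {
                *counter.lock().unwrap() += 1;
            })
        })
        .collect();

    for handle in handles {
        handle.join().unwrap();
    }

    println!("{}", *counter.lock().unwrap()); // prints 4
}
```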
@lanodan Well yeah, these are the reasons why you have checkers with false positives and mostly only basic local reasoning.
From what I see, flawfinder is pattern matching well-known footguns by function name, rather than understanding these kinds of bugs semantically.
It can of course still be super useful given how common these footguns are, but it's not analyzing C deeply.
Clang analyzer does a lot of sophisticated analysis, but it is limited by the flexibility/vagueness of C's semantics.
@lanodan C needs new type system extensions or annotations to improve static analysis further.
Current tools hit dead-ends due to problems like pointer aliasing, mutable `const`, and lack of information about thread safety.
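For contrast, a minimal Rust sketch (my own illustration, not from the thread) of the aliasing and mutability facts that Rust references carry in the type system, i.e. exactly the information those C analyzers have to guess at:

```rust
fn main() {
    let mut value = 1;

    // Any number of shared borrows may alias each other, but none of them
    // can mutate, and the owner can't mutate while they are live.
    let shared = &value;
    // value += 1; // error[E0506]: cannot assign to `value` because it is borrowed
    println!("{shared}");

    // A mutable borrow is statically guaranteed to be the only live alias,
    // so an analyzer never has to ask "could something else point here?".
    let exclusive = &mut value;
    *exclusive += 1;
    // println!("{shared}"); // uncommenting this turns the `&mut value` above into
    //                       // error[E0502]: cannot borrow `value` as mutable
    //                       // because it is also borrowed as immutable
    println!("{value}");
}
```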
It's surreal how slowly time moves in the world of C compilers.
Today there are still active projects that are hesitant to move past C89, and C99 is still the "new" standard.
The C99 standard was released before the first public Mac OS X and Windows XP. It's older than Itanium and the x86-64 instruction set. It predates the iPod, the GameCube, the first Xbox, and the Nokia 3310.
Entire platforms lived and died in the meantime, while C programmers still can't be sure if they can rely on the new C99.
@janl has Homebrew for Intel and runs x86 zsh?
Seriously, in retrospect, #autotools itself is a massive supply-chain security risk.
It has normalized shipping and running tens of thousands of lines of arbitrary executable code without any safeguards.
Code so mind-numbingly awful that nobody will review it, written in a language that is full of gotchas that double as sneaky eval gadgets.
People are afraid of running unaudited `curl | sh`, but nobody bats an eye at 24,707 lines of obfuscated garbage in `./configure`.