Same with C's death, of course. God, what I wouldn't give for C to be well and truly dead, but every professor, PL engineer, etc., all high on their own supply, kept trying to sell me on the idea that C (and C++) was gonna die, except it never did lmaoooo.
While Fortran has long been the butt of the joke in IT departments the world over, in a curious twist of fate, it has seen a dramatic resurgence over the last few years. …
News of Fortran's death was greatly exaggerated, etc. etc.
For example, the C & C++ ecosystems never managed to standardise any sort of build & distribution mechanism, and since Python is wrapping around basically any flavour of C & C++ code, the problems in Python packaging are a superset of what's already a hugely pressing issue in C/C++.
God yes. Finally, somebody understands why a lot of us keep going down to the barebones levels to get things done. You can't just Rust/Java/etc. your way out of the fundamental computing problems, buckaroo, you've got to face the music eventually!!!
@mirabilos @dalias @nabijaczleweli Oh, no, we can't change the return value of a function like printf. I forgot about the return value of the printf family of functions.
Well, okay; guess I just gotta throw this paper out and wait until we have a not-crap fmt facility; it's not worth writing a new form of the existing, like, 16+ formatting functions for specifically this. Might as well wait until a better, more type-safe formatting facility comes along.
@tthbaltazar I (occasionally) work on large systems where we handle data that easily overflows a 4-byte integer. I would like to not accidentally have an overflow vulnerability in my printing routines.
Hey, when I publish the paper for %z*.s everyone on the Committee is gonna bother me about not having it implemented anywhere. Do you know any C libraries where they're not stiff-necked about extensions of this nature?
We keep calling ourselves software engineers, but engineers elsewhere advance their industry by analyzing failures, building tools that prevent them, and making those tools standard industry practice!
But we'll just have the same 6 problems, on a regular spin cycle, for like 40 years.
Meanwhile, we'll be writing about how we need to have "high impact libraries that help lots of users" and then give examples like CLI Parsing/JSON Parsing before we sit down and go "we should have some standard library types / functions for integers...?".
I need to write not only a paper, but a full-blown series on the history of C. The historical revisionism, and the ascription of magical properties to C89 or whatever without mentioning the vast landscape of extensions, is staggering.
Like. So many people shit on VLAs for being too underspecified and undefined (correct), but then conflate that with it not seeing "greater uptake" and like. No? That's not true? Just talk to anyone working in HPC, it's the fucking bread and butter.
You Are Not The Only Programmer On Earth. Your Domain Is Not The Only Domain That Exists.
You can't say "C is used literally everywhere!" and then put on your blinders when people from those communities bring their extensions from their domains to the wider C programming language, to try and get better support both inside and outside of their domain! That's LITERALLY how the charter works! VLAs were WIDELY implemented when they were standardized, they were WIDELY used when they were standardized, it's ahistorical bullshit to pretend it isn't or wasn't!
They have CLEAR issues with how underspecified and undefined they can be, how much they rely on your implementer to be a literal Divine Hero to get it right, but you cannot be serious that they're (a) not used and (b) not useful. Even _Generic was introduced because of C, not other languages! It was literally HAND-DESIGNED, BY IMPLEMENTERS (not the Committee!!!) to get the job done! Every single time we have this fucking conversation I lose my mind because the actual history is right there, in the paper trail, it's WAY more transparent than WG21/C++, because meeting minutes are ALSO public record! You don't have to just assume or make shit up, it's RIGHT THERE.
Got Stun Locked Again, so it's time to dump a rant I made elsewhere:
C is not a language for direct control of the hardware. C is a language that is coded to the semantics of an Abstract Machine specified in a document. It's no more capable of hardware control than e.g. Rust or Zig. What it has is a wide variety of pre-existing implementations that allow you to touch that hardware, but most of that control was programmed in an assembly language or worked into the hardware/firmware by somebody.
C is not more suitable for hardware and your computer is not a PDP-11. (This is part of why "Nobody writes ISO C" is a thing. C, the language K&R made, the language that got standardized, at any point in its lifetime, was never good enough for a kernel. It just let people coordinate stuff in the cheapest way possible and accepted extensions.)
The part that's extra nutso is that the last 20 years were people who held this exact belief -- that C was just a thin layer over the hardware -- getting bodied, over and over again. Compiler vendors gave them the big middle finger every time they said "wait, no, UB is for the hardware!", and traded in UB for (sometimes negligible) speedups.
To still believe C is "for the hardware" in this day and age, where GCC will literally run your for loop for eternity because you tried to access an array out of bounds and it found out about it, or Clang will "solve" Fermat's last theorem due to a loop, is magic-shroom thinking. You can access the hardware just as well with Java by using the Pointer class: you can hit the same register that the shitty ISA manual fucking lied about by writing the same integer address into the Pointer class and dumping out a 2-byte integer to the right place.
C is not magic and you're not improving its design by insisting it is, for the love of God start Evaluating Your Tools Properly. Yer an engineer, not a fucking wizard, Harry.