@animeirl No, C allows people to write C code, Rust makes people turn Github issues into 300-reply hellthreads.
Rust isn't a programming language, it's a diversionary tactic. People are talking about programming languages that aren't rust and some Rust whiner shows up and either demands everyone use Rust or complains about Rust persecution. It is the neo-Nazism of programming languages. It's a perfect parallel: neo-Nazis spend nearly no time actually living out their ideals, they just whine that nobody lets them and that they're being persecuted by the Jews (C programmers) and that they are superior.
Know why I learned C? Know why I learned any of the languages I know? Do you know why people learn a programming language? None of these reasons apply to Rust: it's useless.
@p@animeirl Note how many of the supposed "Rust replacements" for common utilities like those from coreutils are shipped with some cuck license instead of the GPL.
@animeirl C is unavoidable not because it has been around so long--Fortran and Lisp have been around longer--but because it is useful for so many tasks.
Rust is a shit language and I will not use it and the more people whine, the less I will be inclined. Rust is like Nazism: even if you assume that they are correct about everything, the main effect it has is turning people into intolerable whiners that cannot help injecting mentions of their single issue into every thread they see, whether it's germane or not.
@animeirl Tell you what: I'll avoid interacting with any software that was written in Rust, you can avoid interacting with any software that was written in C.
@animeirl Rust is a shit language, but that's why we have so many languages: people that do not like how C works can fuck off to Rust, and people that have sophisticated enough taste to prefer C are able to ignore Rust entirely.
@p@animeirl for real. also i am sure they do not know what debuggers are for. "wehhhh segfault wtf did i do weong !?!?!? this is why rust is better n!!!!" i hate these rust evangelists
@animeirl@p@nishi honestly I feel like rust as it currently exists is something of a half measure: it doesn't go far enough on safety measures, and it lulls the programmer into a false sense of security for writing "systems" code, meanwhile gc'd languages are better for 99.9% of stuff anyway. I'm keeping my eye on austral for this reason and one or two others
segfaults aren't really the problem, it's when they don't happen that is. but it's still better for potential issues to come up at compile time than at runtime when possible
@allison@nishi@animeirl Almost all debuggers suck, but they can all be convinced to give you a stack trace when a program crashes with a minimum amount of hassle, which is the main use for any of them.
@animeirl@nishi Lisp has been memory-safe since forty years before Rust existed, and is a much better language. Tcl is memory-safe. APL's memory-safe. OCaml's memory-safe. Lua's memory-safe. awk is memory-safe. Go is memory-safe. Ruby is memory-safe. JavaScript is memory-safe. Erlang is memory-safe. Brainfuck is memory-safe. Haskell is memory-safe. Python is memory-safe. Prolog is memory-safe.
Perl is memory-safe. PHP is memory-safe. Java is memory-safe.
@mint@animeirl That is another good point. I'm sure this has nothing to do with the tech companies hating the GPL and wanting to move as much code as they can into a license that lets them exploit open source harder than they already do.
@p@sysrq@nishi@animeirl I meant biggest as in most eyeballs and widest distributed. I thought the Blue Gene running 9 was pure research and didn't leave IBM?
@allison@animeirl@nishi@sysrq Well, the main producer of set-top boxes, effectively. (Love Brantley, cool guy.) Probably the biggest commercial use was Blue Gene, though.
@animeirl@nishi Yes, that is my point. Rust isn't a language, it's a cult. People that aren't complete assholes don't show up to Do Advocacy at people. It's as bad as assholes that go to parties just to "network" with unsuspecting people that just want to have a fun time with friends. It's as bad as salesmen and panhandlers that exploit the natural tendency to be polite instead of telling people to fuck off.
This thread had nothing to do with Rust, and some asshole showed up to Do Advocacy. "These people are talking about programming. This is a situation I can exploit for selfish ends."
@p@sysrq@nishi@animeirl Oh for sure, I just don't think anyone was running the Plan 9 Blue Gene version besides the implementers. Lack of performant Fortran compilers alone would have made it a non-starter for commercial supercomputing applications
@allison@animeirl@nishi@sysrq Well, it was a DoE/LANL-funded endeavor, I don't expect the intersection of those two companies with Bell Labs and IBM yields much that can be discussed in public, but I recall it carrying workloads. Maybe ask evh or Ron Minnich.
@animeirl@sysrq@nishi In response to the first two words, yes. I'm about as sick of Rust as I am of Nazis and they have a very similar flavor. It'd be a toss-up if I could eliminate one or the other from the internet.
@allison I swear, I'm gonna make a language, it's gonna just be C without this bullshit, I'll call it "G" (a "C" wearing a moustache as a flimsy disguise).
@idiot@p@nishi@animeirl pretty common among unix users I think, the tools are crap bc nobody really cares about them so the solution is judicious use of printf + being a better gorilla
@allison@p@nishi@animeirl To this day, I haven't figured out how to use GDB et al. Like I know how it works and what's going on, I just never know when the best time to use it is, and I bash my skull against the wall doing things the ghetto way instead.
@idiot@allison@animeirl@nishi Yeah, I know how to get a stack trace, disassemble a chunk of RAM, dump the registers.
Rob Pike said that when he was debugging the Blit terminal with Ken Thompson, it would crash and he'd start rifling through the stack trace or flipping through the code, and Ken would just look at the ceiling and think, and Ken usually found the bug faster. A mental model of how the program works is more useful than the debugger: the debugger can only tell you what has happened and show it to you in slow motion; it can't say why.
@allison@idiot@animeirl@nishi I don't think it's like that, I think some tools are not very useful. You can see what a process did or you can understand how the program works. This is the reason "the code doesn't resemble the compiler's output" is horrifying: you can't understand how the program works by reading the code if the compiler editorializes. So you bash stuff in at one end, the compiler does something completely different (e.g., the stuff in the top post in the thread), then at the other end you can use a debugger: this is a worse method of programming than thinking about the problem, writing code to express an algorithm, and then fixing your understanding if you fucked up the first step or fixing the code if you fucked up the second step.
@p@nishi@animeirl@idiot Oh without a doubt, I'm just saying that in addition to that, the tools themselves are needlessly baroque and thus there's no reason to use them unless you're forced to. What human being living on this earth has the time to actually *know* gdb? Much the same for any of the C debuggers shipped by any of the big names. acid is fine more or less because it doesn't do much and what it does it does well. Other languages and ecosystems don't suffer from this problem nearly as much because (a) the code is easier to reason about in the first damn place and (b) they don't skimp on tooling.
@p@sysrq@waltercool@animeirl rust was literally just a personal project gone awry, I don't think even the creator expected it to gain as much traction as it did (in this way, I see him as a tragic figure along the lines of ryan dahl)
> It helps me with code exploration when I'm reading some complex new software and wanna understand how it works.
Yeah, that is real stuff. I don't usually use the debugger to step through code like that, but I do use strace pretty heavily for similar purposes. Works even if you can't find the source!
> (that was a tip from Steve Maguire's book).
It was a really good book! They should have made all their coders read it, 95/98 might not have crashed so much.
It helps me with code exploration when I'm reading some complex new software and wanna understand how it works. Also, stepping through your own code can help uncover bugs before they happen, gives you a different point of view (that was a tip from Steve Maguire's book). Always using some gui frontend btw.
> (a) the code is easier to reason about in the first damn place and
C is easy to reason about: you can basically do the compilation with pencil and paper. There are a handful of warts around things like casting or bitfields, but it's easier to reason about the behavior of something simple than something complex. Languages that surprise you are the ones that are hard to reason about.
> (b) they don't skimp on tooling.
I think there are two approaches to this. I mean, there are languages that are all tooling (Java has a lot of tooling because it's verbose and it's missing a lot of facilities that you expect from a language that lives on a VM), and there are languages that tooling would only get in the way of. The former effectively mean that your interface to the language includes the tools, but in the latter the interface to the language is just the language. It's usually better to make the tooling unnecessary than to add the tooling: some languages manage to be both concise and expressive, or integrated well enough with the environment that generic tools work for diagnostics. You don't really need a source-level debugger for awk; your awk programs don't get long enough for that to be helpful: the problem doesn't exist, so no tool needs to be made to solve it. Most application programming languages can't segfault (or at least not because of a bug in your code): the bug has been avoided, so you don't need a tool to handle it. Or they've integrated a stack trace, so you don't need a separate tool for that.
Like, Nethack. Reading Nethack's source code is a component of playing Nethack. It is a little unique in that sense; most games don't work that way. You push the button, Mario jumps, your interface to a Mario game is all in the controller. Maybe for some games the interface is the controller plus a guide. Not that Java is terrible because of that, just that the interface to Java isn't Java source code files, it's Java source code files and the IDE and whatnot, and Xcode seems to work more or less the same way.
As kind of a personal preference or a matter of taste, I like the languages that are usable without the tooling.
@moesha@p@nishi@animeirl to paraphrase, cproc is an attempt at making an optimizing compiler for contemporary (post-ANSI) C using qbe as the backend that doesn't fall into the UB traps op outlined, lcc is a prototypical tinycc with a book written about it, and chibicc is a pedagogical c23 compiler which does no optimization and also has a book being written about it
Cproc is a "modern" small C compiler. It's based on QBE (https://c9x.me/compile/), which aims to have "70% of the performance of industrial optimizing compilers in 10% of the code." Cproc performs basic optimizations via QBE while maintaining relatively quick compile times, but it is not as feature-complete as LCC and chibicc; notably, the preprocessor doesn't seem to be complete yet.
LCC was the basis of a book about compilers; it's more for learning purposes, but it still serves as a nice compiler. Someone created Pelles C, a compiler based on LCC that works on Windows.
Chibicc is similar to TCC: it's a non-optimizing compiler, but it's more modern and the code is easier to understand. Justine Tunney switched from tinycc to chibicc in the Cosmopolitan libc project.
@animeirl@nishi I use several of them. The narrow-minded Rust programmer cannot comprehend the continuing utility of C and Forth and even Fortran and assembly and all the languages that scare you from your safe home in the Rust Suburbs.
@animeirl@sysrq@nishi I'm saying I use it every day and it's a delight and I came to this conclusion without tedious fucking assholes Doing Advocacy at me.
The opinion of a Rust programmer means as much to me as the opinion of a Nazi or a furry. It's like a Mac user telling me my UI looks ugly. It's like a vegan criticizing my eating habits. Showing up to inject Rust into a non-Rust-related discussion on the internet is the equivalent of the Jehovah's Witness banging on my door and then offering opinions when told to fuck off with their Jehovah's Witness bullshit. It's like a telemarketer offering an opinion on my call quality. hellallyourfamily.jpg
@p@nishi@animeirl@moesha qbe+cproc actually does better than 70%, and in much less than 10% of the code (after all, 10% of LLVM or GCC would still be gargantuan); so far it ends up at around 0.1%. I haven't gone into the cproc code much yet, but the qbe code is quite nice, while I remember tcc code as being messy to read; it has roots in the IOCCC, after all.
> which aims to have "70% of the performance of industrial optimizing compilers in 10% of the code."
Yeah, I think it's completely doable; Dhrystone put tinycc at about half as fast as `gcc -O3`, which surprised me. I know it's an old benchmark, but still. So if no optimization gets you to 50%, I think some low-hanging optimizations could get well past 70%.
> Chibicc is similar to TCC: it's a non-optimizing compiler, but it's more modern and the code is easier to understand. Justine Tunney switched from tinycc to chibicc in the Cosmopolitan libc project.
I think my main use of tinycc has been using libtcc as a JIT/FFI.
Hey, he cited Rob Pike and the Ghuloum paper; I like this compiler already.
(I cannot avoid being an asshole: "Modern" isn't really a meaningful word as used nowadays; it usually just gets used as a stand-in for "newer and maybe or maybe not better but definitely newer, probably something has been colorized, but anyway the old thing is definitely terrible because it was written six months ago.")
@allison@amerika@animeirl@idiot@nishi There's always Forth. The standard way to start a Forth project is to write a Forth interpreter, making it completely impossible to have your experience with Forth ruined by unaccountable strangers.
@amerika@allison@animeirl@idiot@nishi Yeah, but see the top post, the standards body has gone off the rails, they'd rather pretend there is no machine.
> We all know open source doesn’t mean it’s more secure.
No, but "not Microsoft" does mean "more secure". You'd be foolish not to be suspicious of software produced by a company that has such a close relationship with the government of a major power, but most especially this company and this government. I do not know if you are old enough to remember watching what shook out in the 90s in this industry, but you are surely old enough to remember Microsoft on all of the PRISM slides if not but if you cannot build the project yourself from source you have downloaded, and the binaries come from Microsoft, you are owned. DRM, DMCA, TPM, TCI, all of these things were pushed by Microsoft.
@p@animeirl true but memes aside don’t you specialize in security? We all know open source doesn’t mean it’s more secure. I’m saying this as a void user.
> I agree with your point, but the fact of the matter is that free or libre software isn’t accessible to the general public
That's kind of a separate problem, but it's a problem that is possible to blame on Microsoft, who spent years trying to kill Linux specifically, but also Digital Research DOS (Microsoft's first big anti-trust lawsuit), Netscape (their second big anti-trust lawsuit, this time solved with campaign contributions rather than lawyers), the whole TPM debacle, and they are still doing their damnedest to make sure that Linux doesn't boot.
> Until we bridge those gaps we can’t expect the average person to want to maintain their own systems,
I don't expect them to, but my point wasn't how to solve that problem, it was that "Microsoft has adopted $x" is on par with "Google has adopted $x" or "Facebook has adopted $x". (I ignored Go for years because, despite Ken being involved, it was Google.) To make matters worse, Microsoft themselves have a terrible track record on security. Using IE to view a website was like yanking the lid off a needle disposal box and sticking your dick in.
> I mention void because I don’t run a systemd based distro for security reasons and minimalism
Yeah, I'm in the same camp, but talking about a distro isn't relevant, it's security-as-product rather than security-as-process.
> I’m not saying I’m not suspicious of microsoft, but they have people paid to audit and constantly patch known CVEs
If the software is a trojan, and they fix unintentional problems in that trojan, it's still a trojan. This is the company that, in terms of languages adopted, picked C++. They pushed out VB, and VB has been a punchline for as long as I can remember because it's one hole after another, even before VBA; email servers still get offered .xls files on a regular basis. I don't think it is wise to take any cues from a company with that kind of track record, let alone a malware vendor. Their endorsement is a negative thing.
@p@animeirl I agree with your point, but the fact of the matter is that free or libre software isn’t accessible to the general public or people that aren’t autistic like most of fedi. Until we bridge those gaps we can’t expect the average person to want to maintain their own systems, patches, and updates. People will continue to choose convenience over security or privacy.
@p@animeirl I mention void because I don’t run a systemd based distro for security reasons and minimalism
I’m not saying I’m not suspicious of microsoft, but they have people paid to audit and constantly patch known CVEs while with open source projects we rely on each other to catch these things. It creates a bystander effect that’s as misguided as corporations making millions off of open source projects.
The antitrust suit against Microsoft ended up being a disaster.
In my view, what holds FOSS back is trust issues. People trust stuff that makes it into the #M5M or is sold in big box stores, and they trust being customers who can then complain if something goes wrong.
After all, most people still buy Budweiser and McDonald's. Quality is not an issue. Freedom is not an issue. Flexibility is not really an issue. People want safety and convenience.
The best thing we could do for FOSS is get a list of solid packages for everyday use into the Wall Street Journal and not lie, e.g. do not claim that LibreOffice is good.
> The antitrust suit against Microsoft ended up being a disaster.
Which one?
> In my view, what holds FOSS back is trust issues.
This presumes that they're making a decision. You ever go to the store (back when people got in a car and went to an actual store) and overhear some boomer exasperated that they bought their kid an empty NES "and you have to buy the game?! This thing was two hundred dollars and it doesn't already have the games on it?" They all threw away their computers and got new computers because the new ones came with Windows 98. If you want to pick up adoption, you have to give them something they want and it has to come out of the box ready to go. Normies run Linux all the time, Android is the most popular Linux distro ever created. Android comes out of the box running Linux. It's not even a matter of trust, they don't know how to change their own oil, they don't like to think that it exists because it's not a thing they care about. They use the computer to send a spreadsheet to their boss, and at best, Linux lets them send a spreadsheet to their boss. So they don't have a reason to care and they have plenty of reasons to avoid thinking about it.
> People trust stuff that makes it into the #M5M or is sold in big box stores,
Yes; Microsoft incentivizes the manufacturers to ship with Windows and incentivizes the retailers to ship with Windows, and they provide negative incentives to allowing anything else on the shelves. The hardware isn't going to be much cheaper: the companies already paid Microsoft to get "WHQL certified" and you can't really write that on the box.
I don't know how to make people care, that's the issue, so I try to just have a nice environment, little Plan 9 cluster, little CRUX and Slackware boxes.
> do not claim that LibreOffice is good.
It's not just good, it's great: at the cost of waiting for some unholy JVM situation to drag its fat ass off my disk and get into my RAM, I can sort of view most of the words that are in a document without having to get a Windows machine.
That stuff all happens in the browser now, though: people just use Google Dox or maybe their company sprang for Office365. That all works fine on Linux. I suspect that most of them wouldn't know the difference: they sit down, they click on Chrome, Canonical shoves a bunch of ads in their face instead of Microsoft shoving a bunch of ads in their face.
the compiler is entitled and correct to optimize that out. the target language prohibits integer overflow, therefore the compiler may assume it doesn't happen, and given this assumption, the result of the compare is known.
what is funny to me is for people to think they're allowed to exceed the speed of light to test whether they're breaking some law of physics. you can't exceed the speed of light, and traffic guards who know their physics know they don't have to check, they can safely relax, as the actual laws of physics cannot be broken ;-)
> is also really fucking dumb and nobody should do this ever.
I don't know what gives you that idea. In the absence of access to the carry flag, you check for overflow by looking at whether an addition cleared the high bit.
@tiskaan The C compiler Ken Thompson wrote, which is now the C compiler for the Plan 9 system, Inferno uses it, etc. Early versions of Go were bootstrapped using that compiler.
@p@nishi@animeirl isnt k&r outdated as fuck? its the only reason im not reading as its only talking about language c whereas sicp is about timeless concepts. im going through sicp too lmfao.
Anyway, the language hasn't really changed. A little extension for the sake of convenience, but C is C. Even code from v7 (which predates ANSI C, which is what the second edition of K&R uses) still compiles and runs for the most part, so the book is current. Even the stdlib functions described are the same. The stuff that's changed fits on half a page: you wanna use strncmp() instead of strcmp(), the "register" and "inline" keywords are sometimes ignored in $current_year compilers, etc.
> its the only reason im not reading as its only talking about language c whereas sicp is about timeless concepts.
No, there's a reason people still read both K&R and SICP. K&R does describe the language, but the examples are all great stuff, and it's heavy on code. qsort() and binary search and parsing and how memory allocation works.
I'd recommend playing around with GCC. Clang has failed me a lot: where behaviour is undefined in the C spec, clang doesn't do anything reasonable, but GCC does what I'd expect (msvc is good too).
@p what happened? I've used gcc for ages (a couple of decades, at least) and it's fine. well, it got messy with useless warnings and paranoid output instead of just pointing to the file and line where the error is, but it's not so crucial.
@iron_bug I don't know, it's a mess; usually this kind of thing happens after a personnel change. I'm not sure what version it was but a long time back I was debugging something, I put a `printf("About to do the thing before the bug happens\n");` and gcc silently translated it to puts(), so I set a breakpoint on printf and the program never stopped, my breakpoint didn't happen (and mangled the constant in the binary).
trying to second-guess users as to their expectations is part of a compiler's job, and that's what warnings are often for. but the primary mission of a compiler is to translate the source code to object code; an optimizing compiler adds to that making the object code run fast or be compact or somesuch, still according to the meaning of the program per the language specification. second-guessing the user with -fdo-what-I-mean :-) enabled by default is usually a recipe for trouble, because those who understand the language and take advantage of its features to get better code end up unhappy, and most users who don't know the darkest corners of the language in that amount of detail still end up unhappily surprised, because incorrect expectations are just too varied. it's tough, and it's easy to argue for "just this one case" while missing the point of all the other "just this one case"s that would be just as deserving
There's often a hundred different ways you can convert C code to Assembly with exactly the same result. However, we tend to want the most optimized solution.
The hardest part of building compilers is finding the most optimized solution. So it is very understandable for a compiler to simply remove parts of the code that it deems unneeded, considering the specification of the language you're writing a compiler for.
In the case of Clang in this example, they are technically correct that skipping the if statement is valid because according to C integer overflow is considered "undefined behavior", which means anything goes, including ignoring it.
But is it really the correct choice that Clang made here from a user point of view? I don't think so, because they are doing things that the user might not expect. It's probably better if the compiler shows a warning and executes it anyway or something like that.
@p@diresock@SuperDicq compiler should not think for programmer. it compiles your code to binary. that's all. generally speaking, nobody can prevent you from shooting yourself in the foot, if you want. that's fine. nothing is prohibited, you can use overflow if this is your intention. it's not an error for compiler. I don't understand people that think that compiler should look for their bugs, by some weird reason.
@p this usually doesn't happen if you don't use optimization, etc. I write in C for over 30 years and I don't remember such problems in debugging. -g (-ggdb3) -O0 and that's it.
> I write in C for over 30 years and I don't remember such problems in debugging.
Yeah, it's a new problem, the compilers are kind of going off the rails now. I expect optimization to move things around, I don't expect it to decide to call different functions. the_compiler_has_gone_mad.png
> However, we tend to want the most optimized solution.
"I can give you the answer as fast as you want, as long as it doesn't have to be correct."
I mean, everyone wants their code to run fast, but they want it to be their code. You don't want something the compiler hallucinated that may or may not resemble what you wrote. Since the 70s, one of the points stressed about C was that you could tell what it was going to do: no fuzziness, you can basically see the instructions the compiler will emit. So gcc emits some SIMD stuff when it spots your for loop adding two arrays together, that's fine, it's emitting code that behaves the same as what you wrote, but changing printf() to puts() or optimizing away an infinite loop is way beyond the pale.
You end up coming across a lot of "No, goddammit, do what I wrote" directives in code. `___this_is_actually_used___`, people introducing side effects to fool the compiler into emitting code that executes their loops, a sea of #defines and #ifdefs and underscore-rich symbols. "No, actually inline this, I really mean it this time, this function is only called once and there's no reason not to inline it." 4kB of `-fstop-doing-retarded-shit` passed to the compiler. (I made up the option name but try compiling qt4 some time if you want to see a compiler invocation that is longer than the text of the file it's compiling.)
> they are technically correct that skipping the if statement is valid because according to C integer overflow is considered "undefined behavior",
This is not remotely the case. "The spec says that it won't say what to do about this" does not mean "It's safe to ignore" and it's no excuse to write a bad compiler.
> It's probably better if the compiler shows a warning and executes it anyway or something like that.
A warning would be fine. "If someone digs a signed magnitude machine out of a museum or maybe someone writes a C compiler for old-timey analog computers, this code probably won't work the way you expect." I mean, "x+100" even handles one's complement.
@p@diresock@SuperDicq >There's often a hundred different ways you can convert C code to Assembly with exactly the same result. However, we tend to want the most optimized solution.
yes, and I wrote software and microcode in assemblers sometimes, or used assembly parts in C/C++ code for optimization. assembler is fine if you know what you do. I really don't need a "more optimized solution", and I doubt it's needed at all. I started to write in C when I was 12, and it was not a problem for a 12-year-old to cope with plain Borland Turbo C, a compiler that was quite straight and simple. it's the programmer who thinks about what he writes, not the compiler. overflow is not undefined behavior, actually: it's exact on each architecture, and one can check it with assemblers. and clang is corporate BS, imho. I never use it and don't recommend it to anybody who wants to write in C. a compiler should not "expect" anything from code. it should not change code in any way. it should comply with the standards and that's all. and automatic optimizing of code is a very slippery slope. I have seen many errors in compilers that traced back to optimization; since then, I'm very cautious toward optimization options, especially on microcontrollers, etc. btw, sometimes assembly inlines are the way to bypass the bugs of a compiler.
> overflow is not an undefined behavior, actually. it's exact on each architecture and one can check it with assemblers.
That's the problem with the spec. So, the spec started with "This is undefined by the language, it might do different things on different machines", then it progressed to "You can't do this because it isn't defined", and the compilers started silently rewriting it.
> clang is corporate BS, imho.
Yeah, engineered entirely to help Apple get around gcc's GPL requirements.
> don't recommend it to anybody who wants to write in C.
I don't even want to touch gcc any more, just kencc and tinycc.
@p@diresock@SuperDicq I think it's better to avoid writing stupid things than to guess whom to blame afterwards. the only one a programmer can blame for all bugs is himself. the compiler is just a tool. but tools may be buggy and/or clumsy, so I don't recommend clang, and the article above is just another example of why it shouldn't be used. there are many other points against it too.
> In this case wouldn't it be more fair to blame the C standard instead of Clang?
If clang does something stupid because the spec says it can, then clang has still done something stupid. If the rules are stupid, the rules are stupid; if you do something stupid because the rules are stupid, you've still done something stupid.
@p I agree that compilers went a full way out of normality nowadays. and somebody must stop that mad printer of standards, for goodness' sake. we don't need so many standards per second, for sure. this makes things worse and litters the language definitions and standard libraries with absolute nonsense. I was ok with C++ until it was C++93, or at least 98. but then it went unleashed and turned into a ridiculous set of useless junk, just because they can add whatever bullshit to standards. and I began to hate C++ and turned back to writing in pure C, because it was not so fucked up and quirky. well, at least there's no such madness as compiler incompatibility and the diarrhoea of numerous standards.
@p I build my system myself, from sources, and I'm really afraid of updating the compiler because I cannot predict what monkey shit they might add to the next version that could break normally working software and introduce new bugs and quirks, or add tons of new annoying useless output. I seriously begin to think that I need to patch gcc to remove all that litter from its output, because it drives me mad.
@iron_bug Yeah, I run into more portability issues with different compiler versions than I do on different architectures. LP64 versus LLP64, endianness issues, these are fine; they don't cause problems if you pay attention to what you're doing. The difference is that if I make a mistake with something like endianness, I feel foolish, but if the compiler rewrites my code, my program is broken because of someone else's mistake, and that is frustrating.
@pernia@animeirl@mint If it costs more to do it, they're less likely to do it. MINIX has a TCP/IP stack, this is a non-trivial undertaking, it has drivers for hardware, it has a C compiler that targets it, they did these things because they could but if you have to attach dollars and man-hours to features, then some features don't get added.
@p@pernia@mint@animeirl there are enough permissively licensed and proprietary COTS operating systems in the world that not having minix wouldn't have been a significant impediment to intel for what they were trying to do, iirc previous versions of ime were even based on threadx (not to detract from your broader point but as far as I'm concerned the genie is already out of the bottle and you're not going to materially impact the situation in a real way just by licensing your own code permissively)
@pernia@meowski@p@mint@animeirl (A)GPL can be used as a blunt weapon to prevent corporations from expropriating your work but in practice it has limitations and if you aren't aware of those limitations you're going to be completely screwed.
@pernia@meowski@p@mint@animeirl for commodity end user applications, the fact that they're essentially interchangeable and that you have to contend with what already exists. the more specialized the thing you're trying to do, the more you have leverage to set terms of engagement.
the only point i see in gpl is for like, rewriting windows stuff for compatibility. wouldn't want microsoft catching up on their own bugs to keep churning out trash
> there are enough permissively licensed and proprietary COTS operating systems
In this case, sure; there are non-GPL OSs, but if there were not, then they'd have to ship the code to the IME or leave features out of it. It's an example, the point is that open source is exploited by malicious actors and GPL/AGPL help prevent this from occurring.
> you're not going to materially impact the situation in a real way just by licensing your own code permissively
Sure, individual efforts are individual efforts, but a group of any size is individuals. Even if behavior only mattered in aggregate, aggregate behavior is the behavior of a large number of individuals. In the case of software, because the nature of computers and networks is to multiply individual efforts, it's not the same thing: Linux was kicked off by one guy, the GNU project was started by one guy, things would be very different if neither of those guys existed, things would have been very different if Linux had adopted the same license as BSD. But if you consider the case of a boycott, enough individuals have to decide to stick with the boycott for it to matter: if you want a boycott to succeed, "My individual efforts don't matter, just skip the boycott" leads to an ineffectual boycott.
Almost all the fedi code is AGPL, so Facebook had to spend time and money on writing their own ActivityPub server, and their efforts flopped. (In part because it was a shot at Twitter rather than a real effort.) Fedi is, in general, unattractive to that kind of entity because of the openness.
@pernia@p@mint@animeirl depending on what you're trying to do, yeah. pretty much anything I make would probably be permissively licensed but for games agpl is very useful since the anti tivoization and expansive source release clauses mean that I can nurse my grudge against exa arcadia and deny them the ability to use my work without consequences that would be totally unacceptable to them
@pernia@allison@animeirl@mint It's a matter of supply and demand. Linux is Linux: it matters that Linux is GPL'd because it's probably the most widely distributed kernel on earth. If it hadn't been GPL'd back when it was a hobby project in 1991, then this kernel, with all of this development effort in it, would be getting exploited. If Linus had changed licenses, there would be corpo-Linux and free-Linux, a fork from before the licensing change, and we'd all probably be using corpo-Linux; the free one would have died.
What you are trying to do and what ends up happening are not necessarily connected. Ten thousand projects kick off, maybe a hundred of them end up relevant to a niche, maybe one of them ends up relevant to a large number of programmers. Nobody can predict which ones take off, and licensing doesn't seem to affect it much.
If you're the sole author of the work and it starts off as GPL or AGPL, it's easy to say "All right, dual-licensing is now available" or release it into the public domain, but you can't go the other way, from BSD-licensing to dual-licensing under GPL and expect to be able to enforce the GPL: it will get forked. If it turns out not to matter, you can relicense, you can dump it into the public domain, whatever, no need to hang onto anything.
So it's prudent to just license under GPL or AGPL by default. You can always *start* permitting exploitation of something you've created, but you can't *stop* it once there's a release (i.e., a published work) that permits it.
@mia@animeirl@mint@pernia Then the easiest way is to just do what those guys did: develop a rootkit and leave crash reporting enabled. It's a little trickier to get it installed for them, but Azure demonstrated it's entirely possible.
> if you think a license is gonna stop anyone from doing malicious things with your code,
No, you have to read what I wrote and you have to know how big corporations work.
> or even respecting the terms under which they got that code,
Businesses don't wanna get sued, they play it safe. It's not going to stop anyone, but at a big company, big enough where you've got corporate counsel, you're not going to have blatant licensing violations. They don't want the suit because they don't want the liability, but not just that: they don't want to *win* the suit, because it sets a precedent that erodes licensing. So as to prevent that, they will have an internal list of acceptable licenses and most companies will fire anyone they catch violating licenses: "Oh, you got your job done a week faster? We shipped a million units, you have fucked us." They'll scramble to patch it out if they can and they will cover it up but they will go out of their way to avoid fucking it up.
> you're relying on the people enforcing the license to respect it, in the case of IME.
They picked MINIX3 because of its permissive licensing. That is how it works at places like that. Lawyers don't give a shit about Linux versus MINIX3 or BSD or whatever, they don't know the difference.
@p@allison@mint@animeirl if you think a license is gonna stop anyone from doing malicious things with your code, or even respecting the terms under which they got that code, you're wrong. on top of that, you're relying on the people enforcing the license to respect it, in the case of IME.
@p@allison@mint@animeirl "if everyone did things x way, we would win and the world would be better"
i disagree. you're better off not putting up obstacles to people that want to collaborate (try contributing gpl'ed code to MIT'd or ISC'd projects) than chipping in to a pipedream
...which is sort of a nonsense measurement anyway since you already have programmers on staff, and they will not always be engaged in paying projects. Managers like to play this game because it lets them bill clients for having staff.
> you already have programmers on staff, and they will not always be engaged in paying projects.
By that logic, you're always going to be spending your time doing *something*, so you may as well be scrubbing the floors at a dive bar for minimum wage. You've got the staff you've got, and if something is done ahead of time and under budget, then the staff can be put onto something else.
> Managers like to play this game because it lets them bill clients for having staff.
Managers want to enlarge their fiefdoms, not loan their guys out to a project that some other manager gets credit for "spearheading".
> however, licensing is an ineffectual tool outside of your anglohegemonic business realm.
"You can wear a seatbelt if you want. It works. However, seatbelts don't protect you from food poisoning or bears."
> try suing a chinese company weaponizing your code.
I know. You can argue until you are blue in the face that nothing matters because every effort anyone makes to ever do anything is doomed and the CIA will eat everyone's babies and Tencent will rip you off, but that's got nothing to do with anything and is stupid.
> i'm telling you it won't work everywhere
It'll work here if they want to sell their products to the US.
> Ok. lets say the CIA decides to use linux to spy on communists or russian-sympathizers or white-supremacists.
"Pffff. Locking your screen when you leave your desk. You know that locking your screen doesn't protect web applications from SQL injections, don't you? It's pointless to solve a problem unless the solution to the problem solves *every* problem that exists." Sloppy thinking, lazy, fucking retarded.
(The CIA doesn't do that anyway, they fund a cutout.)
> if they can safely get away with stealing your code and wiping their ass with your license,
> contributing to projects with permissive/public domain licensing has a higher value to me than grudging against big corpos.
Then go do it. I'm not "grudging against big corpos", that's stupid. If you leave candy out for kids on Halloween, and some stoned asshole wanders through, carries the bowl off, eats half the candy and then sells the other half, you're going to be annoyed: you had a reason to leave the candy out or you wouldn't have bothered. Then you ask him if you can have a piece and he says "Fuck you, the Business Software Alliance is gonna raid your office now."
I would like not to spend my effort on software that subsequently gets exploited by people that would sue me for piracy. Licensing is a matter of copying one text file or another to your repo, so you can pick one that makes it easier for someone to repackage it and sell it back to you and then sue you for piracy or you can pick one that makes it harder to put into a black box. Then you're telling me "Hey, let's all suck Bill Gates's dick because there is no perfect solution to all of our problems and if you don't suck that dick with me then you are holding some sort of grudge and you copied the wrong text file, chud. If it's not a perfect solution to every problem, it's not even worth doing. After we suck this dick, let's lay down and die in a hole."
So I've sent patches upstream, stuff I use, I've contributed to stuff, all kinds of licenses, I've put code in the public domain. I'd rather more stuff be GPL/AGPL than BSD/MIT/proprietary, because it limits choices for people that want to ship black boxes, and because I prefer other people do that, I will be doing that myself.
@p@allison@mint@animeirl hey, i can respect standing up for your beliefs and in collective effort. it works. however, licensing is an ineffectual tool outside of your anglohegemonic business realm.
> Businesses don't wanna get sued, they play it safe.
try suing a chinese company weaponizing your code. or the russian government. or any individual person in a country or region out of reach of the tentacles of the US/western copyright system
if your scope is just US big business, i won't knock it, but i'm telling you it won't work everywhere
> They picked MINIX3 because of its permissive licensing. That is how it works at places like that.
Ok. lets say the CIA decides to use linux to spy on communists or russian-sympathizers or white-supremacists. if the State respects its own law, then the CIA will publish the source code. you are trusting the state to respect its own law, however, and many times they don't. if they can safely get away with stealing your code and wiping their ass with your license, why wouldn't they?
contributing to projects with permissive/public-domain licensing has a higher value to me than grudging against big corpos.