@crunklord420 @Moon @graf @lain @mint
> Pleroma is I/O bound partially because it was written in a language specifically designed to be incredibly slow and wasteful when it comes to reuse of memory.
This is self-contradictory.
It's I/O-bound because it's supposed to be I/O-bound. It is network software: if you have to have a faster CPU to saturate the pipe, you have fucked up. Data enters the pipe, data goes down another pipe, and if you have to do enough work that the flow is uneven, you are doing too much work. It's I/O-bound because it's architected correctly and written well. This has literally nothing to do with the language runtime. If anything, for the amount of string-mangling it has to do, it's impressively efficient for a program written in a functional language.
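To make the "data enters the pipe, data goes down another pipe" point concrete, here's a rough sketch of the shape of that kind of software. It's in Go rather than Elixir, it has nothing to do with Pleroma's actual code, and the addresses are made up; the only point is that the handler spends its life blocked on sockets, so a faster CPU buys you nothing.

```go
// Toy TCP relay: accept a connection, open one upstream, and shovel
// bytes both ways. The handler does nothing but wait on sockets, so
// the process is I/O-bound by construction, not because the runtime
// is slow.
package main

import (
	"io"
	"log"
	"net"
)

func handle(client net.Conn) {
	defer client.Close()

	// Placeholder upstream address, purely illustrative.
	upstream, err := net.Dial("tcp", "127.0.0.1:8080")
	if err != nil {
		log.Println("dial upstream:", err)
		return
	}
	defer upstream.Close()

	// Data enters one pipe, data goes down another pipe.
	go io.Copy(upstream, client)
	io.Copy(client, upstream)
}

func main() {
	ln, err := net.Listen("tcp", ":9090")
	if err != nil {
		log.Fatal(err)
	}
	for {
		conn, err := ln.Accept()
		if err != nil {
			log.Fatal(err)
		}
		go handle(conn)
	}
}
```

Profile something shaped like that and nearly all the wall-clock time is spent parked in the kernel waiting on file descriptors; swapping the language runtime changes almost nothing about where the time goes.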
> Functional programming languages are totally orthogonal to how computers actually work and you can never take advantage of the properties of a computer if you view how a computer actually operates as a flaw that requires a rube goldberg machine to pretend doesn't exist.
I don't know who you're addressing. SQL doesn't match how a computer works, either, but because Postgres spends most of its time in iowait, SQL is fine. An anime girl doesn't match how a computer works, but somehow JPEG decoding is never the bottleneck. awk is not how computers work, either, but a one-liner takes 30 seconds to write and will usually finish executing in less time than your Rust compiler takes to build a program that runs slower. You're gonna have tradeoffs anywhere, but only a complete HN-style idiot is capable of saying the things you just said. People who are this wrong are usually not this loud. You'd think you'd have looked at K by now: it descends from APL, it's functional and garbage-collected, and you'll have a hard time beating K in its domain.
And which computer, anyway? Forth is way too hard on the memory bus to perform well on an amd64 system, but it screams on an AVR, a PIC, anything with a built-in stack.
Every single environment has tradeoffs in its runtime characteristics. (Make a goddamn compiler and look at how many decisions you have to make.) It's entirely possible to botch it so hard that there's nothing a given design does well, but an entire paradigm? A paradigm with nothing going for it doesn't survive past the first paper. These stupid meme positions mean you're unable to think anything through or evaluate anything on its merits. You end up the equivalent of the 50-year-old Java dude, but for Rust.