love the pettiness of macos showing windows computers on the lan as old-ass crt monitors displaying a blue screen of death. need more of that energy in my linux
@raccoon @pho4cexa I don't really use GUIs when it comes to listing information, but if I were to write a GUI that showed computers on a network, it would show macos computers as fruity toys, windows computers as water closets and of course GNU as GNU.
@raccoon @pho4cexa Devils, to show the proprietary tricks such OSes get up to (i.e. just use the "Open"BSD logo (a spiky pufferfish) and the "Free"BSD logo (a demon)).
Wait, that's a huge amount of proprietary software installed by default in the default "src" repo (how it's uuencoded doesn't make a difference, as it denies users their freedom in that state).
@munir @pho4cexa @raccoon (By default, Anubis primarily targets possibly-free browsers (browsers with "Mozilla" in the useragent), but interestingly it does not by default target the useragents of the proprietary browsers that LLM scrapers would use (e.g. useragents containing chrome for iOS, Android, macos, windows etc)).
@munir @raccoon @Zergling_man @pho4cexa They may be using browser fingerprinting, as I was not able to trigger the malware delivery with "Android" or "AppleWebKit" in the useragent with GNU wget.
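For reference, a rough sketch of that kind of useragent probe in Python rather than wget (the URL and useragent strings here are placeholders, not the site that was actually tested):

# Fetch the same URL with different User-Agent headers and compare the
# responses, to see whether the challenge is keyed off the useragent alone.
import urllib.error
import urllib.request

URL = "https://example.com/"  # placeholder target

USERAGENTS = [
    "Mozilla/5.0 (X11; Linux x86_64; rv:115.0) Gecko/20100101 Firefox/115.0",
    "Mozilla/5.0 (Linux; Android 14) AppleWebKit/537.36 Chrome/120.0 Mobile Safari/537.36",
    "Wget/1.21.4",
]

for ua in USERAGENTS:
    req = urllib.request.Request(URL, headers={"User-Agent": ua})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            print(ua, "->", resp.status, len(resp.read()), "bytes")
    except urllib.error.HTTPError as e:
        print(ua, "->", e.code)

If the response only changes with the useragent, it's plain useragent matching; if the same useragent gets different treatment across clients, something like TLS or header fingerprinting is involved.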
@Zergling_man @munir @raccoon @pho4cexa
>because scrapers will just reroll their UA whenever they notice
Generally, poorly programmed scrapers will just keep retrying with the same useragent, so blocking that useragent or the IP, or serving a GNU zip bomb, works (rough sketch after this post).
You indeed don't have much of a chance against a properly programmed slow scraper without advanced heuristics.
It's fair game to punish chrome, Android, iOS, macos and windows users, but you shouldn't target "Mozilla", as GNU icecat has a useragent that contains that.
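A rough sketch of that useragent-block-plus-zip-bomb idea, assuming a toy Python http.server and made-up blocklist patterns (a real setup would do this in the web server or firewall config instead):

# Toy server: serve a gzip "bomb" (a long run of zeros that compresses to
# almost nothing) to useragents on a blocklist, and a normal page otherwise.
# Blocklist entries and sizes are illustrative only.
import gzip
from http.server import BaseHTTPRequestHandler, HTTPServer

BLOCKED_UA_SUBSTRINGS = ("BadBot", "ExampleScraper")  # placeholder patterns

# 10 MiB of zeros gzips down to roughly 10 KiB; real bombs advertise far more.
BOMB = gzip.compress(b"\0" * (10 * 1024 * 1024), compresslevel=9)

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        ua = self.headers.get("User-Agent", "")
        if any(s in ua for s in BLOCKED_UA_SUBSTRINGS):
            # Claim the body is gzip-encoded; a naive scraper inflates it all.
            self.send_response(200)
            self.send_header("Content-Encoding", "gzip")
            self.send_header("Content-Length", str(len(BOMB)))
            self.end_headers()
            self.wfile.write(BOMB)
        else:
            body = b"hello\n"
            self.send_response(200)
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

HTTPServer(("127.0.0.1", 8080), Handler).serve_forever()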
@Suiseiseki @pho4cexa @raccoon @munir No, it's retarded in any case, because scrapers will just reroll their UA whenever they notice they're blocked (if I were writing a malicious scraper I would have it hold a list of the 5 most common UAs, plus a random generator, to go through to see if it can beat 400-class responses; sketch below), and if it happens to coincide with normal use it's annoying. Like, say, the several sites I have come across that block lynx's UA. What the fuck scraper even uses lynx as a base anyway?
Actually, I'll grant that just blocking UAs of chrome and firefox and such is a good idea, but I don't actually care if people use terrible software to access my site. I'll just pick on them for JS because it's a danger to themselves to allow it to execute on random websites.
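To make the reroll idea above concrete, a hedged sketch in Python of a scraper cycling through common useragents plus randomly generated ones until it stops getting 400-class responses (the UA strings and URL are made up):

# Sketch of the UA-rerolling scraper described above: try a short list of
# common useragents, then random ones, until the server stops replying 4xx.
import random
import string
import urllib.error
import urllib.request

URL = "https://example.com/page"  # placeholder

COMMON_UAS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 Chrome/120.0 Safari/537.36",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 Safari/605.1.15",
    "Mozilla/5.0 (X11; Linux x86_64; rv:115.0) Gecko/20100101 Firefox/115.0",
    "Mozilla/5.0 (iPhone; CPU iPhone OS 17_0 like Mac OS X) AppleWebKit/605.1.15",
    "Mozilla/5.0 (Linux; Android 14) AppleWebKit/537.36 Chrome/120.0 Mobile Safari/537.36",
]

def random_ua():
    # Cheap "random generator" fallback once the common UAs are exhausted.
    token = "".join(random.choices(string.ascii_letters, k=8))
    return "Mozilla/5.0 (compatible; " + token + "/1.0)"

def fetch(url):
    for ua in COMMON_UAS + [random_ua() for _ in range(5)]:
        req = urllib.request.Request(url, headers={"User-Agent": ua})
        try:
            with urllib.request.urlopen(req, timeout=10) as resp:
                return ua, resp.read()
        except urllib.error.HTTPError as e:
            if 400 <= e.code < 500:
                continue  # blocked under this UA, reroll and retry
            raise
    return None, None

ua, body = fetch(URL)
if body is not None:
    print("got through with useragent:", ua)
else:
    print("blocked under every useragent tried")

Which is why UA blocking on its own only filters the lazy scrapers; the rest need rate limiting or heuristics on behaviour, as noted above.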