So self-host your websites rather than relying on some shady 3rd party like Cuckflare or DDoS-FART, consider switching from the Wayback Machine to Archive.ph or Hozon Site (or host your own archiver, as Hozon Site is FOSS), get off centralized soycial media and host your own Fedi instances, get off centralized chat soyvices and host your own XMPP server, get off centralized email soyvices (GayMail, ProtonFail, Outlook, etc.) and host your own mail server, get off proprietary operating soystems and use Linux or BSD instead, get off fiat currencies and use Monero instead, get off the clearnet and use Tor and I2P instead.
@ryo I completely agree, but there are 2 caveats I would point to that currently make this unsustainable for most people: monetary constraints and knowledge constraints. I think people should look at where these holes are and patch them as they can, but it's easy to get downtrodden when you're not capable of doing this yet, and it's not something you can fix with any solution other than time. If you put the work in, you'll get the results out, but a lot of this stuff is so complex that it's not reasonable to think most people can legitimately partake in it. Pick your battles and invest yourself where you can; that will be the most valuable use of your time.
@Soy_Magnus > monetary constraints and knowledge constraints
As for the monetary constraints, this is unfortunately the price we have to pay for freedom nowadays.
As for the knowledge constraints, online communities used to be run by people who knew what they were doing, and it worked better than the current system, where even a useless brainlet can set up and run a community, so I don't see that one as an issue at all.
@applejack Broadband internet always has unlimited bandwidth, and the rest of the DDoS problems can be fixed by switching to lighter-weight software.
And yes, darknet results in less traffic, but the traffic comes from people who actually know how to find your community. An alternative is to have an invite-only community on the darknet and a more milquetoast version on the clearnet, then invite all the chads into the darknet version without ever mentioning it openly on the clearnet version. Then just keep the soyboys on the clearnet version, and ban them if they go too woke.
@ryo Josh does self-host. He owns his own hardware; he even runs a VPS service with it. I forget the details, but he went to crazy lengths to get his network set up.
One man cannot afford enough bandwidth to run his own DDoS filter, and anything other than the clearweb means you get no traffic, though he did play cat and mouse with some attack vectors himself over the years too.
@Soy_Magnus @ryo I just started setting up some home lab boxes. Knowledge is a big issue, and I've written a lot of self-hosting guides, but they are... complex. With my new box, I plan on experimenting a lot with nix and hopefully writing better guides.
I want to host internally-facing things like Invidious, Nitter, Wikiless, Searx... and point my Privacy Redirect browser extension to those local instances. That way I don't get the lag and overloading found on public instances, while still getting some of the benefits (the traffic still comes from my IP, I know, but I could always shunt it to a remote VM or Tor if I wanted).
Hey Ryo, what's Hozon? You said it's open source, but I can't find their site/git/src repo... I must be searching for the wrong thing. I host my own Linkding, and it used to support auto-archiving, but that was disabled due to overloading of public archives. It would be neat to submit a patch so it could use a local self-hosted archiver.
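Something like this is roughly what I have in mind for that patch, assuming the local archiver exposed a simple HTTP submit endpoint (purely hypothetical; I haven't seen Hozon's API, so the endpoint and response format here are made up):

```python
import requests

# Hypothetical endpoint of a self-hosted archiver running on the same box.
# Whatever Hozon (or any other archiver) actually exposes may look nothing
# like this; treat it as a placeholder.
LOCAL_ARCHIVER_URL = "http://127.0.0.1:8080/archive"

def archive_locally(bookmark_url: str, timeout: int = 60) -> str:
    """Submit a bookmarked URL to the local archiver and return the
    snapshot URL it reports back."""
    resp = requests.post(LOCAL_ARCHIVER_URL, data={"url": bookmark_url}, timeout=timeout)
    resp.raise_for_status()
    # Assume the archiver answers with the snapshot location, either as a
    # Location header or as a plain-text body.
    return resp.headers.get("Location") or resp.text.strip()
```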
@djsumdog @ryo that's very valiant of you. @p is nice enough to send me stuff explaining the intricacies when I'm not sure about something, and a lot of other people do too, but it really does HAVE to be in-depth if you want to end up with a fulfilling guide. Coding seems to have so many moving parts that you have to be super in-depth, otherwise it's useless. Leaving out one function or some small part of the process would cripple someone who isn't tech-literate on a grand scale, but for someone who is, it's laughable. You know it's just a couple of button taps, but the learning curve seems super steep at the start and then it just falls straight off a cliff and nosedives. You just have to hit that tipping point, and that's the 3 years minimum I'm assuming one needs.
@p @djsumdog @Soy_Magnus Learning curves can be super steep for anything, depending on your pre-existing knowledge. But once you get over the hump, things start going really quickly, yeah. Except if it's JavaScript, that curve is forever steep.
@udon True, it's not the ideal solution, but we were talking about no censorship. Although it all comes down to who's behind archive.ph and how easy it is to make them bend the knee to bullshit. I'd rather recommend hozon.site instead, because at least I know for sure that loli frog won't bend over at all, and she's vocally anti-censorship too.
However, it's still beta software and doesn't archive everything. It can't archive anything that depends on JS, though that's by design. But I was also unable to archive Cuckflare's websoyte, and after investigation it turned out that Cuckflare outright prevents wget from ever succeeding, which is ultragay.
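A quick way to check whether the block is keyed on the User-Agent string alone is something like this (rough probe only; it could just as well be TLS fingerprinting or some other check, and the requests library has its own TLS fingerprint anyway, so take the result with a grain of salt):

```python
import requests

# Any Cuckflare-fronted page works here; this one is just an example.
URL = "https://www.cloudflare.com/"

# Compare how the server treats a wget-style client vs. a browser-style one.
USER_AGENTS = (
    "Wget/1.21.3",
    "Mozilla/5.0 (Windows NT 10.0; rv:102.0) Gecko/20100101 Firefox/102.0",
)

for ua in USER_AGENTS:
    resp = requests.get(URL, headers={"User-Agent": ua}, timeout=15)
    print(f"{ua} -> HTTP {resp.status_code}")
```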
@ryo I think archive.ph is censorious enough already, blocking Tor users from even reading, and that's a certainty.
hozon.site doesn't archive CSS either (it's not saved as a local resource), even though CSS doesn't depend on JS, and I can still see the annoying double slashes.
Yes, I think they use several fingerprinting methods; the ones I can think of are IP, HTTP headers, and TLS fingerprinting. For example, I can bypass Clownflare's blocking by matching Tor Browser's image headers and UA on a newer Firefox-based browser. For TLS fingerprinting, maybe this can help, but I have trouble building it: https://github.com/lwthiker/curl-impersonate
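The header matching I mean looks roughly like this (sketch only; the exact values change between Tor Browser / Firefox ESR versions, so copy the real headers out of your own browser's network tab instead of trusting the ones below, and note this does nothing about TLS fingerprinting):

```python
import requests

# Approximation of the headers a Firefox-ESR-based Tor Browser sends for a
# top-level page load; treat the exact values as placeholders.
BROWSER_LIKE_HEADERS = {
    "User-Agent": "Mozilla/5.0 (Windows NT 10.0; rv:102.0) Gecko/20100101 Firefox/102.0",
    "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,*/*;q=0.8",
    "Accept-Language": "en-US,en;q=0.5",
    # Advertise only encodings requests can decode out of the box.
    "Accept-Encoding": "gzip, deflate",
    "Upgrade-Insecure-Requests": "1",
}

def fetch(url: str) -> requests.Response:
    """Fetch a page while presenting browser-like headers and UA."""
    resp = requests.get(url, headers=BROWSER_LIKE_HEADERS, timeout=30)
    resp.raise_for_status()
    return resp
```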
Looking at the soyce code, the Bee uses full URLs for ASSets, while mine uses relative links, which I suspect is the reason why. So in theory, if full paths were replaced with relative links during the archiving process, that would fix the issue.
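The rewrite step I'm picturing looks roughly like this (just a sketch with made-up function names, assuming the archiver already has the page HTML and its original URL on hand; it ignores srcset and url(...) references inside CSS, which would need the same treatment):

```python
from urllib.parse import urlparse

from bs4 import BeautifulSoup  # pip install beautifulsoup4

def absolute_to_relative(html: str, page_url: str) -> str:
    """Rewrite same-origin absolute src/href URLs into root-relative
    paths so the archived copy resolves assets locally."""
    page_host = urlparse(page_url).netloc
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup.find_all(src=True) + soup.find_all(href=True):
        attr = "src" if tag.has_attr("src") else "href"
        target = urlparse(tag[attr])
        # Only touch full URLs that point at the same host as the page itself.
        if target.scheme in ("http", "https") and target.netloc == page_host:
            tag[attr] = target.path + (f"?{target.query}" if target.query else "")
    return str(soup)
```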