@wolf480pl@Pixdigit@eskuero@alice@lea I think it's more on the sourcehut side of things, where some endpoints can be horribly expensive. Sure, you should use something like Varnish or whatever to get caching in place, but it becomes quite hard for large *and* popular sites.
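To make the caching point concrete, here's a minimal sketch of caching an expensive endpoint at the application level, assuming a plain Python handler; the function name and the 60-second TTL are made up for illustration, not anything sourcehut actually runs:

```python
import functools
import time

def ttl_cache(ttl_seconds):
    """Cache a function's result for ttl_seconds so a burst of identical
    requests (e.g. link-preview fetches) doesn't recompute it every time."""
    def decorator(fn):
        cache = {}  # args -> (expires_at, value)

        @functools.wraps(fn)
        def wrapper(*args):
            now = time.monotonic()
            hit = cache.get(args)
            if hit and hit[0] > now:
                return hit[1]
            value = fn(*args)
            cache[args] = (now + ttl_seconds, value)
            return value
        return wrapper
    return decorator

@ttl_cache(ttl_seconds=60)
def render_expensive_page(path):
    # Stand-in for whatever makes the endpoint costly
    # (repo log walks, diff rendering, etc.).
    time.sleep(1)
    return f"<html>rendered {path}</html>"
```

A reverse-proxy cache like Varnish does the same thing one layer earlier, without touching the application at all.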
@wolf480pl@eskuero@alice@lea oh it is unreasonable. My puny two-core, 4 gig webserver handles hundreds of static website requests just fine. Of course, if you put megabytes of ads on your website, that is gonna be an issue. Also noteworthy: literally no other website has that issue. Even small hobbyist sites and blogs don't have that problem. In the best case it is just embarrassing, in the worst case gross negligence.
If this was Facebook or Twitter complaining that they're getting too much traffic - which should be a piece of cake for tech giants like them - I'd get it.
But it's just some online newspaper politely asking not to be accidentally DDoSed... that's as reasonable a request as it gets.
@alice@lea wait what? Are you saying this article is wrong and the website deserves to get hammered by even more link preview requests as a punishment?
@lanodan@Pixdigit@wolf480pl Also, being a good web citizen goes both ways. While it's true that they should try to optimize their site, configuring caches and serving static content where possible, the originating side should also take some responsibility and try to minimize its impact.
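A rough sketch of what "minimize impact" could look like on the originating side, assuming a naive preview fetcher; the in-memory cache, the random jitter, and the 24-hour TTL are illustrative choices, not what any fediverse server actually implements:

```python
import random
import time
import urllib.request

_preview_cache = {}  # url -> (expires_at, html)
PREVIEW_TTL = 24 * 3600  # reuse a fetched preview for a day

def fetch_preview(url):
    """Fetch a page once for link previews and reuse it afterwards,
    spreading fetches out so a popular post doesn't stampede the origin."""
    now = time.monotonic()
    hit = _preview_cache.get(url)
    if hit and hit[0] > now:
        return hit[1]

    # Jitter the request instead of firing the instant the post arrives,
    # so thousands of servers don't all hit the site in the same second.
    time.sleep(random.uniform(0, 30))

    req = urllib.request.Request(
        url, headers={"User-Agent": "preview-fetcher-example"}
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        # Only read enough to extract preview metadata, not the whole page.
        html = resp.read(65536).decode("utf-8", errors="replace")

    _preview_cache[url] = (now + PREVIEW_TTL, html)
    return html
```

Even just caching and jittering like this would turn "every server fetches the page the moment a post federates" into something the target barely notices.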