Conversation
Notices
翠星石 (suiseiseki@freesoftwareextremist.com)'s status on Saturday, 16-Aug-2025 02:29:15 JST
@Codeberg @gturri >Calling our usage of anubis an attack on our users is far-fetched.
Subjecting the users to software that the users don't control (remote JavaScript) is always an attack.
The recent refresh-challenge is fine, but you don't need Anubis to do that.
Another user-respecting option is to set temporary cookies for files on the site; poorly programmed scrapers won't include those cookies on subsequent requests (see the sketch after this notice).
Yes, poorly programmed scrapers just do the computation and then continue scraping.
Decently programmed scrapers just change their user agent and continue scraping at a slower rate, unimpeded by Anubis.
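A minimal sketch of that cookie approach, assuming a plain Go net/http handler; the cookie name "gate", the handler names, and the 10-minute lifetime are illustrative choices, not taken from Codeberg's setup. The first response sets a short-lived cookie and asks the client to reload via a plain HTML refresh (no JavaScript); clients that resend the cookie get through, while scrapers that discard cookies keep receiving the retry page.

package main

import (
	"net/http"
	"time"
)

// withCookieGate wraps a handler: requests without the cookie get a retry page
// that sets it, requests that resend the cookie pass through to the real site.
func withCookieGate(next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		// If the cookie came back, treat the client as cooperating and pass through.
		if _, err := r.Cookie("gate"); err == nil {
			next.ServeHTTP(w, r)
			return
		}
		// First contact: set a short-lived cookie and ask the client to reload.
		http.SetCookie(w, &http.Cookie{
			Name:    "gate",
			Value:   "1", // a real deployment would use a signed, expiring token
			Path:    "/",
			Expires: time.Now().Add(10 * time.Minute),
		})
		w.Header().Set("Content-Type", "text/html; charset=utf-8")
		w.WriteHeader(http.StatusServiceUnavailable)
		// Plain HTML refresh, so no JavaScript is required on the client.
		w.Write([]byte(`<!doctype html><meta http-equiv="refresh" content="0">Retrying with cookie...`))
	})
}

func main() {
	mux := http.NewServeMux()
	mux.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		w.Write([]byte("hello\n"))
	})
	http.ListenAndServe(":8080", withCookieGate(mux))
}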
Codeberg (codeberg@social.anoxinon.de)'s status on Saturday, 16-Aug-2025 02:29:17 JST
@gturri Anubis sends a challenge. The browser must do "heavy" work to compute the answer; the server then only does "light" work to verify it.
As far as we can tell, the crawlers actually do the computation and send the correct response. ~f
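For context on that heavy/light asymmetry, here is a generic SHA-256 proof-of-work sketch in Go. It is not Anubis's exact protocol: the 16-bit difficulty, the nonce encoding, and the function names solve/verify are assumptions for illustration. The client loops over roughly 2^16 hashes (heavy), while the server checks a submitted answer with a single hash (light).

package main

import (
	"crypto/sha256"
	"encoding/binary"
	"fmt"
	"math/bits"
)

const difficulty = 16 // require 16 leading zero bits in the hash

// leadingZeroBits counts leading zero bits in the first 8 bytes of a hash.
func leadingZeroBits(sum [32]byte) int {
	return bits.LeadingZeros64(binary.BigEndian.Uint64(sum[:8]))
}

// solve is the client's heavy work: try nonces until the hash clears the bar.
// The expected cost is about 2^difficulty hash evaluations.
func solve(challenge []byte) uint64 {
	for nonce := uint64(0); ; nonce++ {
		var buf [8]byte
		binary.BigEndian.PutUint64(buf[:], nonce)
		if leadingZeroBits(sha256.Sum256(append(challenge, buf[:]...))) >= difficulty {
			return nonce
		}
	}
}

// verify is the server's light work: a single hash per submitted answer.
func verify(challenge []byte, nonce uint64) bool {
	var buf [8]byte
	binary.BigEndian.PutUint64(buf[:], nonce)
	return leadingZeroBits(sha256.Sum256(append(challenge, buf[:]...))) >= difficulty
}

func main() {
	challenge := []byte("random-per-request-token") // server-issued, unpredictable
	nonce := solve(challenge)                       // done in the visitor's browser
	fmt.Println("nonce:", nonce, "valid:", verify(challenge, nonce))
}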