AI companies crawl our websites.
We ask them to stop via the industry-standard robots.txt.
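That request is as simple as it gets. A minimal sketch of such a robots.txt (the user-agent tokens below are examples of documented AI crawlers, not a complete list):

```
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /
```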
AI companies ignore those rules.
We start blocking the companies themselves with conventional tools like IP rules.
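A rough sketch of what that looks like in an nginx config, assuming the CIDR ranges are placeholders (real rules would come from the crawlers' published IP ranges or observed traffic):

```nginx
# inside an http or server block: refuse requests from known crawler ranges
deny 192.0.2.0/24;
deny 198.51.100.0/24;
```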
AI companies start working around those blocks.
We invent measures specifically designed to make life harder for their crawlers (tools like Anubis).
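Anubis is its own project; the sketch below only illustrates the general idea behind that class of tools, a proof-of-work challenge that is cheap for one browser but expensive at crawler scale (the hash scheme and difficulty here are illustrative, not Anubis' actual implementation):

```python
import hashlib
import secrets

DIFFICULTY = 20  # required leading zero bits; illustrative value
TARGET = 1 << (256 - DIFFICULTY)


def issue_challenge() -> str:
    """Server side: hand each visitor a random nonce."""
    return secrets.token_hex(16)


def solve(nonce: str) -> int:
    """Client side: brute-force a counter until the hash clears the target.
    Cheap for a single human visit, costly when repeated millions of times."""
    counter = 0
    while int.from_bytes(hashlib.sha256(f"{nonce}:{counter}".encode()).digest(), "big") >= TARGET:
        counter += 1
    return counter


def verify(nonce: str, counter: int) -> bool:
    """Server side: a single hash confirms the work was done."""
    digest = hashlib.sha256(f"{nonce}:{counter}".encode()).digest()
    return int.from_bytes(digest, "big") < TARGET


# usage: nonce = issue_challenge(); answer = solve(nonce); assert verify(nonce, answer)
```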
AI companies put considerable resources into circumventing that, too.
This industry seriously needs to implode. Fast.