One thing that's not possible with robots.txt is filtering by robot *type* rather than robot *identity* (via User-Agent). So, I can block `Googlebot-Image` to keep that one bot from spidering my site, but I can't specify a class of entities like "all image search indexers" or "all search indexers". I don't know if this was ever considered, but I think it'd be interesting to know why it doesn't exist.
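For illustration, identity-based blocking is all the protocol gives you. A sketch of a robots.txt that blocks Google's image crawler specifically (and, for contrast, the wildcard group that matches every bot):

```
# Blocks exactly one bot, matched by its User-Agent token
User-agent: Googlebot-Image
Disallow: /

# The only "class" the protocol knows: all bots
User-agent: *
Disallow:
```

There's nothing in between those two granularities: no `User-agent` pattern that would match, say, every image indexer regardless of vendor. To cover Bing's and others' image crawlers too, you'd have to enumerate each one by name.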