@airwhale yes, one way to enforce this would be to return an error response, such as 403 Forbidden, when the client sends an unpermitted user agent. I think the point of robots.txt is to inform well-behaved software so it doesn't waste resources requesting something it isn't permitted to read or post to.
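
For illustration, here's a minimal sketch of that kind of server-side enforcement using Python's standard library; the blocked agent names are hypothetical, and a real setup would more likely do this in the web server or proxy config:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical user-agent substrings to block (robots.txt can only
# advise; this actually enforces the policy with a 403).
BLOCKED_AGENTS = ("BadBot", "EvilCrawler")

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        agent = self.headers.get("User-Agent", "")
        if any(bad in agent for bad in BLOCKED_AGENTS):
            # Refuse disallowed crawlers outright.
            self.send_error(403, "Forbidden")
            return
        # Otherwise serve the request normally.
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"Hello, permitted client.\n")

if __name__ == "__main__":
    HTTPServer(("", 8000), Handler).serve_forever()
```

Of course, a misbehaving client can spoof its User-Agent header, so this only stops crawlers that identify themselves honestly, which is the same trust assumption robots.txt makes anyway.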