If I ask a search engine how old the Earth is, and I get a link to Answers in Genesis saying "6,000 years," then I've learned a true and correct fact: namely, that AiG claims to believe that the Earth is 6,000 years old.
That's, in essence, the promise a search engine makes when you ask it a factual question: it will deliver relevant evidence of the form "x claims to believe y." Thus, I always learn something from searching, even if it's of limited utility.
If I ask an LLM how old the Earth is and get back "6,000 years," I've learned jack shit. In that hypothetical, it just spat back nonsense without giving me any information at all. And that means I don't learn anything when an LLM is correct, either: the output looks exactly the same whether it's right or wrong, so there's no distinction I can use to separate the two cases, which makes LLMs useless for any sort of epistemology.