Tim Bray (timbray@cosocial.ca), Wednesday, 29-Nov-2023 10:06:41 JST:
Google is really losing the plot: there are lots of things on my blog that it can't find but DuckDuckGo can. My blog isn't that much of an outlier, so I suspect this pattern exists more widely. If you do a Google search and come up short, don't give up; try elsewhere.
Evan Prodromou (evan@cosocial.ca), Wednesday, 29-Nov-2023 10:06:40 JST:
@timbray ChatGPT does pretty good.
Evan Prodromou (evan@cosocial.ca), Wednesday, 29-Nov-2023 16:27:47 JST:
@tim_lavoie @bplein @timbray most search engines return mostly wrong results. Human discretion is always required.
Tim Lavoie (tim_lavoie@cosocial.ca), Wednesday, 29-Nov-2023 16:27:48 JST:
@bplein @evan @timbray And it won't tell you it doesn't have that information; it will just make something up, bullshitting with utter confidence.
Bill Plein🌶 (bplein@bvp.me), Wednesday, 29-Nov-2023 16:27:49 JST:
Tim Lavoie (tim_lavoie@cosocial.ca), Wednesday, 29-Nov-2023 16:30:35 JST:
@evan @bplein @timbray True, but I can't believe that the solution to insufficient controls on SEO spam is, "add more bullshit."
Evan Prodromou (evan@cosocial.ca), Wednesday, 29-Nov-2023 16:40:17 JST:
@tim_lavoie @bplein @timbray good luck!
Evan Prodromou (evan@cosocial.ca), Thursday, 30-Nov-2023 00:03:52 JST:
@bplein @tim_lavoie @timbray yes, definitely. I often get wrong answers from ChatGPT. With GPT Plus, you can ask for references and links, as well as justification for each answer.
Bill Plein🌶 (bplein@bvp.me), Thursday, 30-Nov-2023 00:03:53 JST:
@evan @tim_lavoie @timbray 100% of today's popular search engines return links to the source material.
ChatGPT returns text of its own creation, which can be true or false or a combination of both. Hence the very publicized issue with ChatGPT insisting that there were no countries in Africa that started with "K". It saw an old meme that was making the rounds, and since it was so widely published, it became the "truth".
Very different from a search engine.
Evan Prodromou (evan@cosocial.ca), Thursday, 30-Nov-2023 01:32:13 JST:
@bplein @tim_lavoie @timbray I had a good conversation with ChatGPT about this particular problem. I hope you find it as interesting as I did.
Tim Lavoie (tim_lavoie@cosocial.ca), Thursday, 30-Nov-2023 02:19:55 JST:
@evan @bplein @timbray Interesting, yes. Reassuring, not so much.