Rusty Crab (rustycrab@clubcyberia.co)'s status on Wednesday, 22-Nov-2023 01:07:26 JST
Rusty Crab: As a research tool, I've discovered that you should view LLMs less like an oracle and more like a metal detector. You're in a vast ocean of garbage that is the internet, and LLMs aren't going to tell you exactly what to do or exactly where to dig to find your perfect artifact. They'll get you really close, though, and that beats the hell out of excavating the entire landscape for weeks.
@fantasia @FrailLeaf ddg is horrific. I've done a lot of cross-referencing tests: google passes most of them, kagi passes at an even higher rate, brave search passes most of them (slightly fewer than google), and duckduckgo just eats shit. If you're looking to get away from google for free, you should check out brave search.
I hear people have good results with bing but its interface makes me want to shoot myself
@FrailLeaf operators don't really work on google anymore (e.g. quoted phrases) unless you're talking about something as basic as filtering by site.
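To be concrete, site filtering plus a quoted phrase is about the only combo that still behaves. A rough sketch of building such a query; the site and phrase here are purely illustrative, not from the thread:

```python
from urllib.parse import quote_plus

# Illustrative only: site: filtering and quoted phrases are the two
# operators google still mostly respects. Anything fancier tends to be
# treated as a loose suggestion these days.
query = 'site:stackoverflow.com "borrow checker"'
url = "https://www.google.com/search?q=" + quote_plus(query)
print(url)  # paste into a browser to run the filtered search
```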
And yes, all LLMs can do is gather consensus from internet idiots. If you have access to a real expert, that's always better.
As for what model I'm using: Kagi offers a few services. There are GPT-3/4 proxies as well as Claude. They also have a service called "quick answer" which is stunningly accurate; I've gotten to where I just click it by default instead of sifting through results. I don't know what model it is, but it blows everything else out of the water.
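If you want to poke at this programmatically, Kagi also exposes an answer engine over HTTP (FastGPT). Whether that's the same thing behind the "quick answer" button is an assumption; the endpoint and response shape below follow Kagi's API docs, and the token is a placeholder. A minimal sketch:

```python
import requests

# Sketch only: assumes Kagi's FastGPT HTTP API as documented at
# https://help.kagi.com/kagi/api/fastgpt.html. Whether this is the same
# model that powers the "quick answer" button is an assumption.
KAGI_API_TOKEN = "YOUR_TOKEN_HERE"  # placeholder for your own key

resp = requests.post(
    "https://kagi.com/api/v0/fastgpt",
    headers={"Authorization": f"Bot {KAGI_API_TOKEN}"},
    json={"query": "best books for learning programming"},
    timeout=30,
)
resp.raise_for_status()
data = resp.json()["data"]
print(data["output"])  # the generated answer
for ref in data.get("references", []):
    # cited sources -- the keywords and links you'd verify by hand
    print(ref["title"], ref["url"])
```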
Thing is, user reviews are only as good as the user writing them. This particular example shows just how bad that can get. I'd always suggest asking people who actually know the tool rather than depending on user reviews alone.
That said, I didn't ask what LLM you were using: is it a local model that keeps updating, or one of the OpenAI models?
Yeah, google results are going to be flooded with unrelated topics, but then you can use operators to filter them.
@FrailLeaf I run into multiple examples per day where queries return results wildly unrelated to what I searched for, on both google and kagi. Consulting an LLM via kagi almost always gives me alternative links that end up being useful to the topic. If you use modern google for anything, you should be well aware that many queries return, as the top results, total nonsense that isn't even related to what you asked.
As for your example: that's why you look up the user ratings first.
@FrailLeaf even when it lies, it can often give you keywords to search for that you had no idea about. You just should never act on its instructions without verifying first.
@RustyCrab do you have an example conversation? I understand this, but imagine that you asked it to "recommend the best books for learning programming" and it gave you the top 5 worst books ever. You wouldn't know until you actually started programming and realised just how mediocre the books were.