"If someone uses an LLM as a replacement for search, and the output they get is correct, this is just by chance. Furthermore, a system that is right 95% of the time is arguably more dangerous tthan one that is right 50% of the time. People will be more likely to trust the output, and likely less able to fact check the 5%."
Conversation
Timnit Gebru (she/her) (timnitgebru@dair-community.social)'s status on Friday, 21-Mar-2025 07:46:14 JST
Timnit Gebru (she/her) (timnitgebru@dair-community.social)'s status on Friday, 21-Mar-2025 07:46:15 JST
Tapping this sign again. By @emilymbender
https://buttondown.com/maiht3k/archive/information-literacy-and-chatbots-as-search/
"As OpenAI and Meta introduce LLM-driven searchbots, I'd like to once again remind people that neither LLMs nor chatbots are good technology for information access."