LianSirenia (liansirenia@kitsunes.club), Wednesday, 07-May-2025 04:28:14 JST:
How it started: "Don't use Wikipedia as a source, anyone can edit it, so you don't know if you can trust it!"
How it's going: "Just ask the hallucinating word-association soupifier, it'll be fine." :kyaru_woozy:
snacks likes this.
Fish of Rage (sun@shitposter.world), Wednesday, 07-May-2025 18:37:53 JST:
@LianSirenia Use neither as a source; use both as a research starting point, I guess.
SuperDicq (superdicq@minidisc.tokyo), Wednesday, 07-May-2025 18:39:48 JST:
@LianSirenia@kitsunes.club I don't like the word "hallucinate" in the context of machine learning. It implies that machine learning models can actually think, make mistakes, and have some kind of mental process similar to that of a living being. This is obviously not true.
I do like "word-association soupifier"; it is a bit more formal than the "bullshit generator" I often like to use.

翠星石 likes this.