Writing LLM prompts and you include the string "do not hallucinate" to fix your issue.
:hackerman:
SuperDicq (superdicq@minidisc.tokyo)'s status on Monday, 27-Jan-2025 21:11:50 JST SuperDicq
SuperDicq (superdicq@minidisc.tokyo)'s status on Monday, 27-Jan-2025 21:31:02 JST SuperDicq
@feliks@chaos.social >Calling it "prompt engineering"
:hackerman:
feliks (feliks@chaos.social)'s status on Monday, 27-Jan-2025 21:31:04 JST feliks
@SuperDicq "please do not hallucinate. it's a matter of life and death. i'll tip you $200". peak prompt engineering
SuperDicq (superdicq@minidisc.tokyo)'s status on Monday, 27-Jan-2025 21:37:11 JST SuperDicq
@feliks@chaos.social Querying
feliks (feliks@chaos.social)'s status on Monday, 27-Jan-2025 21:37:13 JST feliks
@SuperDicq what do you call it?
SuperDicq (superdicq@minidisc.tokyo)'s status on Monday, 27-Jan-2025 22:21:46 JST SuperDicq
@feliks@chaos.social @bitals@fediverse.bitals.xyz Just because it might be useful does not mean it can't also be inefficient at the same time.
feliks (feliks@chaos.social)'s status on Monday, 27-Jan-2025 22:21:47 JST feliks
@bitals @SuperDicq you might want to reevaluate if you don't get any value out of these new methods yet
Bitals (bitals@fediverse.bitals.xyz)'s status on Monday, 27-Jan-2025 22:21:49 JST Bitals
@SuperDicq @feliks
It's "talking to a parrot" actually. Techbros just overclocked it, losing 95% of efficiency on the way. -
Embed this notice
SuperDicq (superdicq@minidisc.tokyo)'s status on Monday, 27-Jan-2025 23:11:34 JST SuperDicq
@bitals@fediverse.bitals.xyz @feliks@chaos.social I'd rather not compare LLMs to parrots.
Parrots are actually really intelligent animals who can learn and understand stuff, unlike LLMs.
Bitals (bitals@fediverse.bitals.xyz)'s status on Monday, 27-Jan-2025 23:11:36 JST Bitals
@feliks @SuperDicq
As a small language model natively speaking a non-roman language, I don't.
I haven't encountered the word "semantics" much, so my answer will be straight from DuckDuckGo.
feliks (feliks@chaos.social)'s status on Monday, 27-Jan-2025 23:11:37 JST feliks
@bitals @SuperDicq if you hold this perspective i have a question: how do you distinguish between syntax and semantics?
Bitals (bitals@fediverse.bitals.xyz)'s status on Monday, 27-Jan-2025 23:11:38 JST Bitals
@feliks @SuperDicq
It's pretty useless to disagree with the way the technology actually works, but you do you.
feliks (feliks@chaos.social)'s status on Monday, 27-Jan-2025 23:11:39 JST feliks
@bitals @SuperDicq i disagree. but i don't think we can resolve this using text. even though much about these modern tools is poorly understood i do think that it's worth looking into the concept of emergence
Bitals (bitals@fediverse.bitals.xyz)'s status on Monday, 27-Jan-2025 23:11:41 JST Bitals
@feliks @SuperDicq
It is factual though. LLMs just tokenize your words and spew out the tokens most often seen together in the training data, with some specific actions like googling strapped to some tokens. It's literally the same thing as a parrot.
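For reference, the mechanism described above can be caricatured in a few lines: count which token most often follows each token in a tiny corpus, then "continue" a prompt from those counts. This is only a toy sketch of the claim as stated in the thread; real LLMs learn a neural next-token predictor over subword tokens rather than looking up raw co-occurrence counts, and the corpus and prompt here are made-up placeholders.

```python
from collections import Counter, defaultdict

# Toy caricature of "spew out the tokens most often seen together":
# a bigram frequency table built from a tiny made-up corpus, then greedy
# continuation. Real LLMs learn a neural next-token predictor; this only
# illustrates the claim as stated in the thread.

corpus = (
    "the parrot repeats the words it hears "
    "the model repeats the tokens it has seen "
    "the parrot can fly and push buttons"
)

tokens = corpus.split()  # crude "tokenizer": whitespace split

# Count which token most often follows each token.
following = defaultdict(Counter)
for current, nxt in zip(tokens, tokens[1:]):
    following[current][nxt] += 1

def continue_prompt(prompt: str, length: int = 5) -> str:
    out = prompt.split()
    for _ in range(length):
        counts = following.get(out[-1])
        if not counts:
            break  # token never seen in the corpus, nothing to parrot back
        out.append(counts.most_common(1)[0][0])
    return " ".join(out)

print(continue_prompt("the parrot"))
```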
feliks (feliks@chaos.social)'s status on Monday, 27-Jan-2025 23:11:42 JST feliks
@SuperDicq @bitals true. i just wanted to address the counterfactual "talking to a parrot"
SuperDicq (superdicq@minidisc.tokyo)'s status on Monday, 27-Jan-2025 23:16:51 JST SuperDicq
@bitals@fediverse.bitals.xyz @feliks@chaos.social The way we, as animals, learn stuff is a lot more complicated than a reinforcement learning algorithm.
Bitals (bitals@fediverse.bitals.xyz)'s status on Monday, 27-Jan-2025 23:16:52 JST Bitals
@SuperDicq @feliks
Stuff, but not actual speech. LLMs have reinforcement learning that kinda crutches that in, similar to methods available with parrots.
The difference is parrots can also fly, push buttons and be cool, all while not burning 1000W continuously.
lainy (lain@lain.com)'s status on Tuesday, 28-Jan-2025 15:03:49 JST lainy
@SuperDicq @bitals @feliks where do you guys get those parrots that can write code, paint pictures, create and play songs and know more about the world than the average person?
lainy (lain@lain.com)'s status on Tuesday, 28-Jan-2025 16:46:02 JST lainy
@bitals @feliks @SuperDicq was this post ai generated?
Bitals (bitals@fediverse.bitals.xyz)'s status on Tuesday, 28-Jan-2025 16:46:14 JST Bitals
@lain @feliks @SuperDicq
Those are all separate models trained on different things.
Parrots just have less memory, so they can't store all of that random text. They also don't burn megawatts of power for storage alone.
LLMs don't *know* anything though. They have a huge library of tokenized text, and they will give you back the tokens most often seen together with the ones you provided in your query.
1/2