#LLMs are based on language. Do we think the basis of truth is in language? #philosophy #neuroscience
Peter Moleman (molemanpeter@neuromatch.social), Wednesday, 05-Jun-2024 20:53:18 JST
Eaton (eaton@phire.place), Wednesday, 05-Jun-2024 20:53:17 JST:
@MolemanPeter Even more specifically: do we believe that truth is a structural quality, something that exists as an identifiable pattern in the words themselves rather than in the meaning they were assembled to communicate? LLMs reproduce the structures but rely entirely on the "echoes" of past meaning.
Eaton (eaton@phire.place), Thursday, 06-Jun-2024 00:13:39 JST:
@MolemanPeter I feel like I'm unqualified to say whether it's the BASIS, but at the very least I think they're closely related. The key for me is that linguistics isn't about truth; even at its most rigid it's about structural correctness, and there are many kinds of structurally correct statements that are untrue (Magritte, etc.).
LLMs by definition work towards statistically probable structure and composition — for "truth" they rely 100% on the meaning imbued by past structure-makers.
Peter Moleman (molemanpeter@neuromatch.social), Thursday, 06-Jun-2024 00:13:40 JST:
@eaton Is the basis of truth in meaning?
Eaton (eaton@phire.place), Thursday, 06-Jun-2024 00:14:40 JST:
@MolemanPeter So, maybe I'd say that truth can be communicated via language, but the true-ness of it is not a carried property of the linguistic artifact itself?
Eaton (eaton@phire.place), Thursday, 06-Jun-2024 01:42:36 JST:
@MolemanPeter I'd lean towards yes. Even LLM-based systems that aim for truth/accuracy (as opposed to inoffensiveness/risk-mitigation) do so by integrating other tools and approaches: maintaining a parallel "repository of truth" in the form of documentation or ontological relationships, building answers with that system, then using the LLM to turn them into conversational text.
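[Editor's note: the pipeline described above, where facts live outside the model and the LLM only supplies the conversational surface, can be sketched as follows. This is a minimal illustration with hypothetical names (`fact_store`, `answer`, `render_reply`), not any real system's API; the final rendering step is a plain template standing in for an LLM call.]

```python
# Toy sketch of "repository of truth" + language-surface pipeline.
# The fact store is the authority; the model is never asked to recall facts.
fact_store = {
    "capital_of_france": "Paris",
    "boiling_point_of_water_c": 100,
}

def render_reply(key, fact):
    """Stand-in for the LLM: turns an already-retrieved fact into
    conversational text. It shapes structure, not truth."""
    return f"According to the knowledge base, the {key.replace('_', ' ')} is {fact}."

def answer(question_key):
    """Build the answer from the repository first; refuse rather than
    let the surface-generation step guess."""
    fact = fact_store.get(question_key)
    if fact is None:
        return "I don't know."
    return render_reply(question_key, fact)

print(answer("capital_of_france"))
```

The design point mirrors the post: truth-maintenance and text-generation are separate responsibilities, and the generation step only ever paraphrases what the repository asserted.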
Peter Moleman (molemanpeter@neuromatch.social), Thursday, 06-Jun-2024 01:42:37 JST:
@eaton So your answer is: no, LLMs cannot be the basis of truth, I suppose?