thinking about LLMs in the long term, the language-evolutionary long term. until now, language evolved alongside people, with words shifting meaning, being coined, or being lost as culture changed. but as more and more of language by volume gets overwhelmed by LLM slop, near-semantic misses like "Crouched Fingers" will accumulate and set new norms.
https://neuromatch.social/@jonny/113525654645929633
that's not to say there's such a thing as "wrong" language per se, nor that LLMs interact with language in any way that resembles how humans do. sort of the opposite: language itself will start shifting along a new set of axes and constraints that are wholly indifferent to human usage, except as filtered through the models' profitability to their owners, and secondhand through increasingly mixed human/LLM training data. capital already imposes, and has always imposed, some forcing direction on language, being uh a salient fact of our culture - advertising is obvi a pretty big deal. i don't quite have words for why it seems different to have language across every domain just completely swamped by tech companies.