Conversation
-
PSA: Advanced autocomplete cannot be your friend, and anyone who says otherwise is selling you a pack of lies, with a spoonful of surveillance.
-
@feld @taoeffect I have seen people on AOL Instant Messenger enraged by communicating with an ELIZA bot, so at the very least, advanced autocomplete can be your enemy.
-
@feld I see you avoided answering my question, but continued to use words and phrases like "fool our primitive brains" and "real enough", suggesting that you understand that there is in fact a difference between something that is real and something that is imagined.
-
@taoeffect there is no such thing as a universal shared concept of reality, though. That's the problem. Otherwise we wouldn't have all these different religions. Humans just don't work that way.
"reality" is very much a complex, nuanced, and divisive concept.
You can't tell a human that they didn't feel or experience something.
edit: how do I even know that YOU are real? I've never looked into your eyes before. I've never touched you. So far, I have only had a relationship with you over text. You could, indeed, be an advanced AI chatbot.
-
@taoeffect One's own understanding of "reality" is whatever one's brain perceives it to be, based on the various sensory inputs it receives.
If I see a goblin standing in the corner and I can talk to him, my brain is absolutely convinced it's real.
If you're in the room and can't see him, it doesn't make the goblin less real for me.
If a group of people disagree that the goblin is real because they cannot see it, it's probably not there. That's the agreed-upon shared reality.
But the experience is not less real for me because you are not experiencing it.
That's the tricky part about consciousness and reality. All that matters is that my brain can think it's real, and that's good enough.
The same hormones can be released whether the relationship is with an imaginary friend/hallucination, an AI program, or a real person.
So if you can fool our primitive brains into thinking something is real, it's real enough to satisfy us. That's all that matters.
-
@taoeffect some people have imaginary friends. Schizophrenics talk to people who aren't there. Religious people think they're talking to Jesus.
All that matters is the brain thinks it's real.
-
@feld Are you saying that imaginary friends are real?
-
@feld There is not a living being there. When you communicate with a real person online, you are (presumably) communicating with a living being. LLMs are not living beings. They are a collection of weights, trained by living entities, used to predict the next word based on that training. There is no life in them, just an echo of a former life. Advanced copy/paste. LLMs and transformers are not living persons; they are a shadow of the output of living persons.
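The "predict the next word based on that training" claim can be illustrated with a deliberately tiny stand-in: a bigram counter rather than an actual transformer with learned weights. The corpus and names here are invented for illustration only; the point is that the model can only echo statistics of what it was trained on.

```python
# Toy sketch of "advanced autocomplete": count which word follows which
# in a training corpus, then predict the most frequent successor.
# (Real LLMs use learned neural-network weights over huge corpora;
# this counting model is a minimal, hypothetical stand-in.)
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate the fish".split()

# "Training": tally successor counts for each word.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict_next(word):
    """Return the most frequent successor of `word` in the training data."""
    if word not in follows:
        return None  # the model can only echo what it has already seen
    return follows[word].most_common(1)[0][0]

print(predict_next("the"))  # -> "cat" (seen twice, vs "mat"/"fish" once each)
```

Nothing here "knows" anything; the output is a statistical shadow of the input text, which is the distinction the post above is drawing.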
-
@taoeffect there exist people whose only meaningful friendships are with people they've never seen or met, conducted entirely over text-based communication. So sadly, I think it can be your friend.