@futurebird The way I feel about this is that I don't know what qualifies as conscious, but LLMs just obviously aren't it, because we *know* how they work and they *specifically* have no fucking clue what they're doing. It's literally just statistically predicting which sequence of numbers is likely to follow an input sequence of numbers. LLMs can't be conscious any more than a desktop calculator can be.
If LLMs were actually able to see letters and words, then we could start to entertain the idea of consciousness. But they don't: the input is chopped into tokens and turned into integer IDs before the model ever sees it.
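For anyone who wants to poke at that last point themselves, here's a minimal sketch using the tiktoken tokenizer library (assuming it's installed; the "strawberry" example is just an illustration). It shows that what reaches the model is a short list of integers standing for chunks of text, not letters:

```python
# What an LLM actually "sees": integer token IDs, not characters.
import tiktoken

# cl100k_base is one of tiktoken's built-in encodings.
enc = tiktoken.get_encoding("cl100k_base")

text = "strawberry"
ids = enc.encode(text)
print(ids)  # a short list of integers, not letters

# Each ID maps back to a chunk of text, usually more than one letter.
# The model is trained to predict the next ID given the previous IDs;
# it never operates on individual characters at all.
for i in ids:
    print(i, repr(enc.decode([i])))
```

Run that and you'll see the word comes through as a couple of multi-letter chunks, which is exactly why these models famously struggle to count the letters in a word: the letters were never there from the model's point of view.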