@thomasfuchs It's utterly bizarre to me that a person who is actually in possession of a human brain could misunderstand it so badly as to think Spicy Autocomplete is anything like it.
Thad (thad@brontosin.space)'s status on Wednesday, 24-Apr-2024 13:30:24 JST Thad
maegul (maegul@hachyderm.io)'s status on Wednesday, 24-Apr-2024 13:30:34 JST maegul As with economics/finance and crypto, AI is driven by the childish hubris of the tech sector to think they're above all of the other fields of expertise.
Otherwise, what strikes me is the urge to rush into an obvious ethics problem. Once the machine is near human sentient/AGI, then ethics dictates you have to be humane to it, which is not what capitalism wants from its machines (see Human History™).
maralorn (maralorn@chaos.social)'s status on Wednesday, 24-Apr-2024 23:12:23 JST maralorn 1. There is a huge difference between sentience and intelligence. If something kills me, I don’t care whether it was sentient, only that it was smart enough.
2. Saying that we don’t understand something because "quantum" makes no sense. Actually, modern computers work on quantum effects.
3. I am not aware of a scientific argument that anything physical (like a brain) cannot be simulated by a computer.
That being said, it might still be super hard and take a long time, so I am also sceptical of the hype.
crabctrl (katp32@mastodon.social)'s status on Wednesday, 24-Apr-2024 23:17:36 JST crabctrl @thomasfuchs and yes, we don't fully understand the human brain, but that doesn't mean physics doesn't apply. that's the same logic people use wrt perpetual motion machines and such nonsense; "oh, we don't fully understand it, that means anything's possible".
no. it does not mean anything is possible. nor is there any reason whatsoever to believe human brains are in any way special. they're bags of chemicals. really complex bags of chemicals, but still bags of chemicals, not magic.
Prema Marsik (prema@hachyderm.io)'s status on Wednesday, 24-Apr-2024 23:17:36 JST Prema Marsik @katp32 @thomasfuchs i generally agree, but simulating complex quantum physics (many-body systems) is damn computationally expensive, and typically not done with machine learning. This is the dissonance.
crabctrl (katp32@mastodon.social)'s status on Wednesday, 24-Apr-2024 23:17:37 JST crabctrl @thomasfuchs come on. you know that's not how this works. "quantum effects" is handwavy pseudoscience but also irrelevant as proved by my dear friend Alan Turing.
is an LLM capable of human intelligence? no. are humans intrinsically special and impossible to simulate without handwavy unscientific bullshit like "souls"? no. that's absurd.
a sufficiently large computer can simulate *any* physical process. humans are, in fact, based in physics, not magic.
crabctrl (katp32@mastodon.social)'s status on Wednesday, 24-Apr-2024 23:17:37 JST crabctrl @thomasfuchs and no, you cannot handwave away physics with "<something something quantum physics>". quantum physics is *physics*, not magic. even if human brains do work using "quantum physics", that doesn't mean it can't be simulated by a Turing-complete system because, again, it's *physics*, not magic.
quantum computers, too, can be emulated with classical computing. actual quantum computers are just faster, they aren't magic, they aren't more powerful.
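The claim above (that classical machines can emulate quantum computation, just less efficiently) can be illustrated with a minimal sketch: a single qubit is just a 2-element vector of complex amplitudes, and a gate is a 2x2 matrix multiplication. This is an illustrative toy, not any poster's actual code, and the function and variable names are invented for the example.

```python
import math

def apply_gate(gate, state):
    """Apply a 2x2 gate matrix to a 2-element complex state vector."""
    return [gate[0][0] * state[0] + gate[0][1] * state[1],
            gate[1][0] * state[0] + gate[1][1] * state[1]]

# Hadamard gate: sends a basis state into an equal superposition.
H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

state = [1 + 0j, 0 + 0j]       # qubit starts in |0>
state = apply_gate(H, state)   # now (|0> + |1>) / sqrt(2)

# Measurement probabilities are the squared amplitude magnitudes.
probs = [abs(a) ** 2 for a in state]
print(probs)  # ~0.5 for each outcome
```

The catch, as noted, is cost rather than possibility: an n-qubit state vector has 2^n amplitudes, so this classical emulation scales exponentially, which is why real quantum hardware can be faster without being "more powerful" in the Turing sense.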
Thad (thad@brontosin.space)'s status on Thursday, 25-Apr-2024 01:02:56 JST Thad @thomasfuchs That's part of it but I've heard it from people who really should know better, people who at least have some technical understanding of how LLMs work.
I've had people tell me with a straight face that all *our* brains do is match patterns, and it kinda baffles me how somebody could fail so badly to understand their own thought processes. (Let alone, y'know, various autonomic processes that are entirely separate from conscious thought.)
Thad (thad@brontosin.space)'s status on Thursday, 25-Apr-2024 01:37:50 JST Thad @thomasfuchs That and these discussions start to feel more like religious debates at a certain point. A lot of folks seem to be taking the inevitability of general AI on faith and working backwards to try to justify it.