Conversation
-
> The human brain has about 20 PFLOPS of compute. I’ve written various blog posts about this. Sadly, 20 PFLOPS of compute is not accessible to most people, costing about $1M to buy or $100/hr to rent.
it does not
i'm not sure you can really compare what a brain does actually. the two models are very different
-
@icedquinn $100/hr to rent is cheaper than many humans, too
-
it is to an extent a system of shitty semiconductors which seem to *mostly* communicate over a binary protocol on top of this, and even the calculations it does are for the most part basic as crap. it doesn't really 'compute' things in a math sense. it learns heuristic patterns and repeats those.
basically the computation is heavily fake
-
@lain i dunno. i think this is not a correct way of thinking about AGI.
-
@icedquinn yeah i think so too, it's not about pure flops
-
@lain i drank way too much caffeine and read a stack of papers on spikenets. those guys seem to be pretty close to correct.
they also seem to be on the cusp of understanding what a hippocampus does, which is pretty much the game-over condition of this whole ordeal, yeah.
i have some theories. i just need to stop being a lazy fuck and fix my computers :neocat_pensive:
-
@lain pulse trains are just fucking weird though. "haha i have to run the whole network several times to get answers from it" cause like, you NEED to do this. all the "oh but this is equivalent to this other thing please let me do this i need to do this bro i NEED backprop bro" actually breaks it. you have to maintain the binary-on-shitty-semiconductor status.
i know this because some navy lab tested it and they found the spike net maintained indefinite stability on reinforcement learning tests, and the google shit did not do this
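a minimal sketch of what "run the whole network several times" looks like, assuming a plain leaky integrate-and-fire model with a spike-count (rate-coded) readout; the layer sizes, threshold, time constant, and timestep count below are made-up illustration values, not taken from any of the papers mentioned:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes only.
N_IN, N_HID, N_OUT = 64, 128, 10
W1 = rng.normal(0, 0.3, (N_HID, N_IN))
W2 = rng.normal(0, 0.3, (N_OUT, N_HID))

TAU, V_TH, T_STEPS = 10.0, 1.0, 100  # membrane time constant, spike threshold, timesteps

def lif_step(v, current, tau=TAU, v_th=V_TH):
    """One leaky integrate-and-fire update: leak toward the input, spike, reset."""
    v = v + (current - v) / tau
    spikes = (v >= v_th).astype(float)
    v = v * (1.0 - spikes)          # reset the neurons that fired
    return v, spikes

def run(x_rates):
    """Run the whole network for T_STEPS and read out spike counts (rate code)."""
    v_hid = np.zeros(N_HID)
    v_out = np.zeros(N_OUT)
    counts = np.zeros(N_OUT)
    for _ in range(T_STEPS):
        in_spikes = (rng.random(N_IN) < x_rates).astype(float)  # Poisson-ish input pulses
        v_hid, s_hid = lif_step(v_hid, W1 @ in_spikes)
        v_out, s_out = lif_step(v_out, W2 @ s_hid)
        counts += s_out
    return counts.argmax(), counts

label, counts = run(rng.random(N_IN) * 0.2)
print(label, counts)
```

the answer only exists as an accumulated spike count, which is why you can't just do a single forward pass and call it a day.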
-
@lain another fuckin weird thing is that I/O is apparently done by just repeatedly flagging spike trains.
people tested this theory (dynamic vision sensors, a funny kind of camera) and it's weirdly effective, in that the camera's reconstituted footage was immune to motion blur and other things. they also tested recognizing visual events that way, where the network is not fed an image but only pulses that indicate whether a pixel went up or down in intensity. audio was also tested this way, where the audio is broken into band passes and then they send tuples of which bands went up or down.
part of the brain's low energy use is based around this. the DVS systems actually reached about 2x the biological power draw, in contrast to sending full sync-locked frames.
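a rough sketch of that encoding, assuming a DVS-style scheme where each pixel emits a +1/-1 event whenever its log intensity moves past a threshold; the threshold and frame sizes are made up, and the audio variant is only described in a comment:

```python
import numpy as np

THRESH = 0.15  # log-intensity change needed to emit an event (illustrative value)

def frames_to_events(frames):
    """Per pixel, emit (t, y, x, +1/-1) whenever the log intensity has moved
    more than THRESH since that pixel's last event. No full frames are sent,
    only the change events."""
    ref = np.log1p(frames[0].astype(float))     # per-pixel reference level
    events = []
    for t, frame in enumerate(frames[1:], start=1):
        logi = np.log1p(frame.astype(float))
        diff = logi - ref
        ys, xs = np.where(np.abs(diff) >= THRESH)
        for y, x in zip(ys, xs):
            events.append((t, int(y), int(x), 1 if diff[y, x] > 0 else -1))
            ref[y, x] = logi[y, x]              # update the reference only where an event fired
    return events

# The audio variant works the same way: split the signal into band-pass
# channels, then emit (t, band, +1/-1) whenever a band's envelope rises or
# falls past a threshold.
frames = np.random.randint(0, 255, size=(5, 32, 32))
print(len(frames_to_events(frames)))
```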
so we have them performing competitively, we just have not had anyone publicly make them self-train in a sandbox.
i think we are actually eerily close to building sophonts but i am somewhat alone in this belief. most of the people trying are doing stupid shit like just buying more computers and talking about pflops or nuclear reactors to power GPUs. they're not actually looking at it as an algorithmic problem that could literally be solved with a little bit of evolutionary coding and a gamer box lmao
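a minimal sketch of the "evolutionary coding on a gamer box" idea, assuming a simple (1+λ) evolution strategy over a flat weight vector; the fitness function here is a toy stand-in for "run the agent in a sandbox and score the reward", and the population size, mutation scale, and dimensions are made up:

```python
import numpy as np

rng = np.random.default_rng(1)

def fitness(weights):
    """Stand-in for running the agent in a sandbox and returning its reward.
    A toy objective so the loop actually runs end to end."""
    return -np.sum((weights - 0.5) ** 2)

dim, pop, sigma = 256, 32, 0.05      # illustrative sizes
best = rng.normal(0, 0.1, dim)
best_fit = fitness(best)

for gen in range(200):
    # (1+lambda) evolution: mutate the current best, keep the best child if it improves.
    children = best + rng.normal(0, sigma, (pop, dim))
    fits = np.array([fitness(c) for c in children])
    i = fits.argmax()
    if fits[i] > best_fit:
        best, best_fit = children[i], fits[i]

print(round(best_fit, 4))
```

no gradients anywhere, just mutate, score, keep the winner, which is the part a gamer box can brute-force.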
-
@s8n @lain i'm only a 120-130 and i'm not even -that- good at math.
it will be an embarrassment for mankind if i'm the one to do it
-
@icedquinn @lain the people you're referring to are incapable of designing novel algorithms, just like the majority of machinists are incapable of engineering a new part. They can only make a better part because a better machine makes better parts that make better machines and they've been iterating on that idea since the invention of the lathe. The number of people capable of invention is tiny
-
@s8n @lain there's a pattern language to inventing shit. that's what TRIZ was all about.
one of the better things to come out of the USSR
-
@icedquinn @lain ability to invent doesn't seem to be tied to intelligence