funny to see one research line talk about the neocortex only having about six layers, while a different, unrelated group stumbled on a similar theory that only a small fraction of the brain participates in learning.
the biologically inspired models share the same issue: they work great at small scale but don't scale vertically. :blobcatovo: i somehow knew that we'd learn more by studying the limitations, and amusingly we're getting more research showing that deep learning is a meme and the algorithms break down in weird ways
not sure how to reconcile the self-organizing maps, though. there seems to be a surprising amount of success in putting a (large) self-organizing map in front and a small learning layer behind it.
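the "big SOM in front, small learner behind" idea can be sketched in a few lines. this is a hypothetical toy, not from any particular paper: a small self-organizing map learns an unsupervised codebook, then a tiny logistic-regression layer is trained on the SOM's one-hot activation pattern. all names and hyperparameters here are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

class SOM:
    def __init__(self, grid=8, dim=2, lr=0.5, sigma=2.0):
        self.n = grid * grid
        self.w = rng.normal(size=(self.n, dim))           # codebook vectors
        coords = np.indices((grid, grid)).reshape(2, -1).T
        # squared grid distance between every pair of units
        self.d2 = ((coords[:, None] - coords[None]) ** 2).sum(-1)
        self.lr, self.sigma = lr, sigma

    def bmu(self, x):
        # best-matching unit: closest codebook vector to x
        return np.argmin(((self.w - x) ** 2).sum(1))

    def train(self, X, epochs=20):
        for t in range(epochs):
            lr = self.lr * (1 - t / epochs)               # decaying learning rate
            sig2 = (self.sigma * (1 - t / epochs) + 0.5) ** 2
            for x in X[rng.permutation(len(X))]:
                h = np.exp(-self.d2[self.bmu(x)] / (2 * sig2))  # neighborhood
                self.w += lr * h[:, None] * (x - self.w)

    def encode(self, x):
        a = np.zeros(self.n)
        a[self.bmu(x)] = 1.0                              # one-hot BMU code
        return a

# toy data: two gaussian blobs
X = np.vstack([rng.normal(-2, 0.5, (100, 2)), rng.normal(2, 0.5, (100, 2))])
y = np.array([0] * 100 + [1] * 100)

som = SOM()
som.train(X)
A = np.array([som.encode(x) for x in X])                  # SOM features

# the "small learning layer behind it": plain logistic regression on the codes
wts, b = np.zeros(som.n), 0.0
for _ in range(200):
    p = 1 / (1 + np.exp(-(A @ wts + b)))
    g = p - y
    wts -= 0.1 * A.T @ g / len(y)
    b -= 0.1 * g.mean()

acc = (((A @ wts + b) > 0) == y).mean()
print(f"train accuracy: {acc:.2f}")
```

the point of the split: the SOM does all the representation work unsupervised, so the supervised part is just a linear readout over the unit activations, which is about as small a learning layer as it gets.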
:neocat_thonk: still somewhat curious whether KANs are worth mixing with a spiking net. i have no real theory for why we should do this; it just seems like something that should be tried.