iced depresso (icedquinn@blob.cat)'s status on Sunday, 18-Feb-2024 16:49:59 JST

@lizzie i think the brain is sparse connections, sparse layers. most networks trained are dense for some reason.
numenta has a paper about 'complementary sparsity' and how making both elements sparse lets them cut the majority of the processing out of the models.
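roughly the shape of the idea, not numenta's actual implementation: a fixed sparse weight mask plus k-winner-take-all activations, so both the weights and the activations are sparse. sizes and sparsity levels below are made up for illustration.

```python
import torch
import torch.nn as nn

class SparseSparseLayer(nn.Module):
    """Toy layer in the spirit of complementary sparsity: sparse weights
    (fixed random mask) combined with sparse activations (k-winner-take-all)."""
    def __init__(self, in_features, out_features, weight_density=0.1, active_k=10):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        # keep roughly `weight_density` of the connections, zero the rest
        mask = (torch.rand(out_features, in_features) < weight_density).float()
        self.register_buffer("mask", mask)
        self.active_k = active_k

    def forward(self, x):
        # sparse weights: masked-out connections contribute nothing
        y = nn.functional.linear(x, self.linear.weight * self.mask, self.linear.bias)
        # sparse activations: only the top-k units per sample stay nonzero
        topk = torch.topk(y, self.active_k, dim=-1)
        out = torch.zeros_like(y)
        out.scatter_(-1, topk.indices, topk.values)
        return out

layer = SparseSparseLayer(128, 64)
print(layer(torch.randn(4, 128)).shape)  # torch.Size([4, 64])
```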
i'm reading the Top-KAST paper right now, which talks about how they were able to shimmy sparse layers into PyTorch
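a minimal sketch of the forward-pass part of that: keep only the largest-magnitude fraction of weights of a standard nn.Linear. the helper name and density value are mine, and the real Top-KAST method also keeps a larger backward set so dropped weights can come back.

```python
import torch
import torch.nn as nn

def topk_weight_mask(weight, density):
    """Mask that keeps the top `density` fraction of weights by magnitude."""
    k = max(1, int(density * weight.numel()))
    # k-th largest magnitude = (numel - k + 1)-th smallest magnitude
    threshold = weight.abs().flatten().kthvalue(weight.numel() - k + 1).values
    return (weight.abs() >= threshold).float()

linear = nn.Linear(128, 64)
mask = topk_weight_mask(linear.weight.data, density=0.1)

x = torch.randn(4, 128)
# forward pass only sees the top-k subset of the dense weight matrix
y = nn.functional.linear(x, linear.weight * mask, linear.bias)
print(mask.mean().item(), y.shape)  # ~0.1, torch.Size([4, 64])
```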