The New York Post: Brainiac scientist proves that man is better than machine, wins million-dollar fortune. Read below [ad break]
American Scientist: 2035 Breakthrough Prize awarded for mathematical bound on the capabilities of neural architectures
Quanta Magazine: Renegade genius proves that neural architectures are arbitrarily well-approximated in polynomial time, declares "human insight is back on the table."
The actual arXiv preprint: In this paper, we approximate the output of a TWNNK (traditionally well-behaved neural network kernel) by an oracle that terminates in O(n^1384275904375081326409813407498674812650982174937509813745098630897019875108374985732049875943207987509183740918732409170) steps
@cine@allison@hakui I strongly disagree! I see the success of AI/ML as marking the end of "math" as a subject, insofar as it is defined to be 'the pursuit of human understanding of form.' The best articulation of this is http://www.incompleteideas.net/IncIdeas/BitterLesson.html , which explains why taking away human insight has been the key to AI/ML progress, and why the people most disappointed by this have been the "mathematicians" with all their fancy ideas. I beg the whole world to read this essay
@allison@cine@vitalis@georgia Wholeheartedly agree. I would add that LMs are even worse than ouija boards and musical dice games because they can be run without randomness (after the training is done). With the older summoning methods, the celebrant at least participates in the creation of new entropy, which has some mystery to it because entropy bleeds across domains. But the AI-user is absorbed into something frozen. Their sacrifice changes nothing. Like sitting down to argue with the guy holding a sign that says "change my mind" who never changes his mind about anything
@gav@karna@genmaicha Once again gav is an untutored math genius and I agree with him
"a variance is the mean of [SQUARED] differences of a set of numbers from their mean" is fine. An expert will understand that 'mean' is defined as an expectation and usually calculated as an integral, 'set' should specify the probability measure, etc. Covariance is just when the data is higher-dimensional, so the 'squared differences' must also live in a higher-dimensional space.
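The definition above, written out in the standard measure-theoretic notation an expert would fill in (the probability measure and the outer-product form for the covariance matrix are the usual conventions, spelled out here for concreteness):

```latex
% Mean as an expectation over a probability measure \mu:
\mathbb{E}[X] = \int_\Omega X \, d\mu
% Variance: the mean of squared differences from the mean
\operatorname{Var}(X) = \mathbb{E}\!\left[(X - \mathbb{E}[X])^2\right]
% Covariance, the higher-dimensional case: "squared differences"
% become outer products of the deviation vectors
\operatorname{Cov}(X) = \mathbb{E}\!\left[(X - \mathbb{E}[X])(X - \mathbb{E}[X])^{\top}\right]
```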
I have not heard of "mean" implying "finite state space," even for the layperson. Most high school calc classes ask students to calculate "averages" or "means" with integrals

Objection: "but the encyclopedia is trying to give the most general definition, useful to professionals, not just one special case known a century ago"
Is there a general definition, though, which is not essentially that? Specifying the method of calculation (taking integrals) or interpretation or purpose or generalizations (covariance) does not belong in a formal definition.
BTW, this is the main problem with the excessively long """definition""" on the page https://en.wikipedia.org/wiki/Coefficient_of_determination#Definitions , where the authors lump in interpretations and relationships with other concepts. These things do not belong in an (ideally short) definition.
Objection: "but there are multiple definitions of R2, and this causes issues in stats literature!"
The multiple definitions come down to how the phrase "variance of the data" is defined (e.g. centered or uncentered, that is, whether you subtract off the mean before squaring). The article does not address this at all.
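A minimal sketch of the ambiguity, with made-up toy data and hypothetical fitted values: the residual sum of squares is the same either way, but the "total variance" you divide by differs depending on whether you center, so the two "R2"s disagree.

```python
import numpy as np

# Toy data with a nonzero mean, plus hypothetical fitted values,
# to show that the centered-vs-uncentered choice changes R^2.
y = np.array([2.0, 3.0, 5.0, 6.0])
y_hat = np.array([2.5, 3.5, 4.5, 5.5])  # hypothetical model predictions

ss_res = np.sum((y - y_hat) ** 2)

# Centered total sum of squares: differences from the mean of y
ss_tot_centered = np.sum((y - y.mean()) ** 2)
# Uncentered: differences from zero (no mean subtracted)
ss_tot_uncentered = np.sum(y ** 2)

r2_centered = 1 - ss_res / ss_tot_centered
r2_uncentered = 1 - ss_res / ss_tot_uncentered
print(r2_centered, r2_uncentered)  # two different "R^2" for the same fit
```

(Software makes this choice silently: some packages switch to the uncentered version when a model has no intercept, which is one way the stats-literature confusion arises.)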
The problem with the spinor article https://en.wikipedia.org/wiki/Spinor is that it is trying too hard to be accessible to *nonexperts*, and ends up being too wordy for experts and too vague for the layperson. How many paragraphs does it take to get to "a spinor is a representation of the spin group [or, if you like, of some Clifford algebra]"? Someone who can't understand these words or look them up will not understand the article anyways. And if you lead with a clear definition, at least the layperson knows what to look up (I think this is gav's point).
If you dig around in wikipedia talk pages, you see that even terms like "unit vector" and "SO(3)" are deleted from articles in an effort to get them approved as 'featured articles.' The idea is that to be a featured article, it must be accessible to a broad audience. But replacing jargon with vague circumlocutions just makes things worse for the layperson *and* the expert. This is the shittiest thing about science wikipedia
The articles which are truly written "primarily for professionals," as you (karna) say, are actually great, I feel, and have none of the issues that gav is complaining about. I just wish there were more of them. Often these articles are more niche, rated "low-importance", and thus unmolested by wikipedia super-editors and their WIKI: policies
1. Scroll past the intro section and look for a definition section.
2. If there is something that syntactically looks like a math definition (reasonably short, symbols are defined and then used in some statement), read it.
3. If there is no definition section, or the """definition""" is not actually a definition, curse the Marxist Left and move on.
@grips@BasedLunatic@Owl My name is Diceynes Lunatic. I'm 43 years old. My apartment is on the southern coast of Spain, where all the villas are, and I am not married. My mother takes care of all my needs, cooks for me, and then leaves. Today I am eating Iberian pig's face cheeks in a delicious sauce. I have gourmet food every day. I don't smoke, but I occasionally drink. If you are thinking of maybe marrying me for my assets, I have to inform you that I am not interested in anything other than my daily swim, coffee in the morning, and a beer midday at the bar. I am not prepared for entering a relationship with you or anyone but, if I did, I would surely win.
@hidden@J This is quite literally why the "Asians are short" stereotype applies to the first-generation immigrants who lived through Mao, but every subsequent generation gets taller and taller. Our child would be 7 ft tall
@cine@hidden@scathach I would rather spend 10 hours reading DFW's made-up Frenchisms than spend even 1 hour memorizing real Chinese phonosemantic compounds that are neither phonically accurate nor semantically specific
@cine@hidden@scathach True, if you insist on learning languages in a ‘logical’ way, you end up knowing a lot of linguistics and very little of language. I fall into this trap too easily