Dark matter/dark energy is the modern equivalent of "the aether" in earlier cosmological models.
We haven't yet figured out what we're missing in the current models, but there's obviously something not right. Plugging in an imaginary substance makes the math work, but anyone whose grant money doesn't depend on it is pretty sure it's just shorthand for "we don't know yet."
@JoshuaSlocum@TrevorGoodchild@Jewpacabra@LukeAlmighty These niggas can't even figure out matter. When the math breaks down they blame "dark matter", a substance they've never seen and shouldn't exist, because their gravity silly string equation says it's there.
@PunishedD@TrevorGoodchild@Jewpacabra@LukeAlmighty dark matter hangs around because "matter we can't see or detect" makes more sense of the observations than the more esoteric explanations. The maths working out at galactic scale means the math is pretty good; Newtonian gravity failed at the solar system scale. It doesn't really have anything to do with string theory per se, but it's hard to explain and it sounds ridiculous, agreed
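(Rough back-of-the-envelope for why "matter we can't see" fits the rotation-curve observations, assuming plain Newtonian gravity; textbook arithmetic, nothing exotic:)

```latex
% Circular orbit at radius r around enclosed mass M(r), Newtonian gravity:
% if the visible mass were all there is, M(r) ~ const at large r and
% v(r) should fall off like 1/sqrt(r). Observed rotation curves stay
% roughly flat instead, which forces M(r) to grow with r -- i.e. mass we don't see.
\[
  \frac{v^2}{r} = \frac{G\,M(r)}{r^2}
  \quad\Longrightarrow\quad
  v(r) = \sqrt{\frac{G\,M(r)}{r}},
  \qquad
  v \approx \text{const} \;\Rightarrow\; M(r) \propto r .
\]
```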
@IlDuWuce@TrevorGoodchild@LukeAlmighty@Jewpacabra (Evolutionary) biology is even worse. Your entire degree hinges on glorifying Darwinism, and once you're finished all your job offers are about expanding the Darwinian fan-fiction. And then when the origin of mankind comes up in debate people will say "all the scientists [who are selected based on accepting Darwinism and whose livelihoods depend on Darwinism] are Darwinist, therefore Darwinism is true :brainlet:"
@TrevorGoodchild@LukeAlmighty@jewpacabra I used to explain people in high school how gravity isn't real as a joke. Little did I know I was so close to the truth back then. Modern-day physics is an immense LARP by people who HAVE to act like they get it because they are prof. dr. phd cumstain laude and regular people wouldn't get it if they just answered "dunno lol". Also the physics dept. is filled with a lot of fags these days I'm afraid.
@JoshuaSlocum@WilhelmIII I seriously wonder if this will be done by an AI that we create someday. It will figure out physics so advanced we barely understand it, if we ever do. Like explaining quantum gravity at a five-year-old's level, only we will always be five years old.
We might still be able to use it and travel to the stars. But we would never truly understand the technology.
I expect that in the end dark matter, dark energy and aether are all going to be neighbors in the midden of bad ideas alongside "X-Rays help you shop for shoes" and "lobotomies cure depression."
@WilhelmIII i try to keep dark matter and dark energy segregated, as they are way different and are dealing with completely different things. The mofos who named them so similarly prob regret it now. Also, you haven't heard? "The aether" is still kicking it on web sites that look like they came from 1997
@Snidely_Whiplash@WilhelmIII@JoshuaSlocum@petra Well, that is kinda wrong. The AI does have a model literally based on the scientific method (observe, create a theory, measure, assess the difference, alter the theory...).
Therefore, it does come up with its own descriptions of what it observes. We didn't teach it what a car is, we only showed it 100 000 000 pictures of cars and not-cars.
So I believe physics might be the best problem for AI to solve, since we are working on it in the exact same way. All it needs is its own interface to observe. We don't need it to create a new physics, but to analyze what physics is.
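(Purely as an illustration of the loop being described; the "theory" here is just a line and the update rule is a crude stand-in for "alter the theory", not a claim about how any real system works.)

```python
# Toy sketch of the observe -> theorize -> measure -> adjust loop described above.
# Everything is made up for illustration: the hidden "law" is y = 3x + 1 and the
# "theory" is a guessed line that gets nudged toward the observations.

import random

def observe():
    """Pretend instrument: noisy measurement of the hidden law y = 3x + 1."""
    x = random.uniform(-5, 5)
    y = 3 * x + 1 + random.gauss(0, 0.1)
    return x, y

a, b = 0.0, 0.0      # current "theory": y = a*x + b
lr = 0.01            # how aggressively the theory gets revised

for step in range(10_000):
    x, y = observe()            # observe
    prediction = a * x + b      # apply the current theory
    error = prediction - y      # measure the difference
    a -= lr * error * x         # alter the theory
    b -= lr * error

print(f"learned theory: y ~= {a:.2f}*x + {b:.2f}")  # should approach y = 3x + 1
```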
AI can, quite literally, only repeat what it's read. It has nothing whatever to do with intelligence. AI-generated science will turn out to be the equivalent of the AI drawing of a woman with big tits and 3 hands. With the deluge of Indians using AI to generate training content for AI, we will shortly reach the point where AI is widely seen as the joke that it is.
The problem of "today's" AI is not a level or implementation problem. It is that what they are producing is not intelligence. AI has nothing to do with intelligence. What they have produced is a very expensive engine that can generate pertinent answers in grammatically correct, though often simple, English. This is not a small achievement, but it is not intelligence. You claim "There is no reason we will not create intelligence someday." This is a very bold assertion, one with no evidence or thought behind it at all. Can you even define what intelligence is? The AI industry gave that pursuit up in the 1980s. I would assert, quite baldly, that computers, being what they are, and given how they work, will never ever ever, on a fundamental level, be capable of intelligence, no matter how much programming you put into the effort. BTW the Turing Test is ontological nonsense.
@Snidely_Whiplash@petra@WilhelmIII >Can you even define what intelligence is? The AI industry gave that pursuit up in the 1980s.
They more or less went with "pattern recognition and memory", which isn't so bad as far as it goes, and works pretty well for what they call AI these days. It maps well enough to what we think of and measure as human intelligence, which is why an AI engine that can produce a JPEG of Hitler riding a dinosaur gets billions in funding. It's intelligent enough that it can nearly replace, i dunno, 20-30% of normies who, let's be honest, aren't all that intelligent; they're just pattern recognition engines with memory. It's a leap from there to the even more nebulously defined "genius", which is what would be required for a jump from chemical/atomic reaction to FTL travel.
There is no reason we will not create genuine intelligence someday. Or something that approximates it closely enough, whether it's self-aware or not. After that it seems to be just a matter of scaling that intelligence. There is no fundamental bound on intelligence that we know of.
And the "set of lines in relations to each other" is one kind of an abstraction for a car.
I get your criticism, but the issue with AI is, that it has no concept of itself, and therefore it cannot has its own goals. Literally all learning is from a human set utility function. And that is a fundamental problem, that will not be solved by just more nodes. But to say, that it cannot abstract ideas is just wrong.
No, it does not know what a car is. It cannot abstract the idea of "car" from its training database. It can't even abstract the idea of "object" from its database. This is why it's so terrible with hands. It has no abstract idea of what a hand is, what it does, what it looks like. It can only concatenate several million images tagged with the word "hand" and calculate an average of what lines, shapes and coloration correspond to that tag. But because hands are so mobile, fluid and expressive, the examples do not do much to constrain the image generator. Likewise, it has no idea of what a "car" is. It has a list of compositional elements, shapes, lines, curves, etc. that correspond to the tag "car". The image generator has no concept of what a car is, or even what a car looks like, because "concept" is itself outside the programming. This is a category error. And AIs can't observe.
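(To make the "average of whatever carries the tag" picture concrete: a deliberately dumb toy, not how any real image generator works, just showing why a naive tag-average stays sharp for rigid things like cars and smears out for variable things like hands.)

```python
# Strawman "model": the per-tag average of every example tagged with that word.
# Fake 8x8 "images": cars sit in roughly the same spot; hands move around.

import numpy as np

rng = np.random.default_rng(0)

def fake_image(tag):
    base = np.zeros((8, 8))
    if tag == "car":
        base[3:5, 1:7] = 1.0                    # same box every time
    else:  # "hand"
        i, j = rng.integers(0, 4, size=2)
        base[i:i + 4, j:j + 2] = 1.0            # pose varies wildly
    return base + rng.normal(0, 0.05, (8, 8))

dataset = [("car", fake_image("car")) for _ in range(500)] + \
          [("hand", fake_image("hand")) for _ in range(500)]

averages = {tag: np.mean([img for t, img in dataset if t == tag], axis=0)
            for tag in ("car", "hand")}

# The car average keeps high contrast; the hand average washes out.
for tag, avg in averages.items():
    print(tag, "contrast of averaged image:", round(avg.max() - avg.min(), 2))
```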
@picandor@TrevorGoodchild@LukeAlmighty@Jewpacabra Learning about the-rape-utic versus non-therapeutic doses is kinda wild. Shit doesn't behave linearly, and there's probably some stochastic explanation somehow lol.
(Also that there is a drug called Lovenox amuses me)
"Bro all it is: below this line nothing happens and they're in pain, above that line and they die. Why? We don't fucking know. We're just glad it happens. How? We don't know, we have absolutely fuck all of an idea. We've stopped asking questions because the answers we find are scary so we're content with our little lines and stop asking questions."