@thomasfuchs And sometimes bullshit is what you need! The bad things happen when people believe they can generate better stuff, or think "real intelligence is just around the corner."
And: It might be! Even two years ago, I didn't think they'd be generating text as passable as they are now. I remain genuinely surprised.
But I also suspect there's much more going on in humans, and that training more parameters on more data is likely to yield diminishing returns.