Notices by Jason Gorman (jasongorman@mastodon.cloud)
-
Jason Gorman (jasongorman@mastodon.cloud)'s status on Tuesday, 14-Jan-2025 19:18:09 JST:
It's basically like if they decided to put a coin-operated toaster oven in every vehicle. You don't actually want toast, but if you don't keep feeding it coins, the car won't start.
-
Jason Gorman (jasongorman@mastodon.cloud)'s status on Tuesday, 14-Jan-2025 18:39:08 JST:
@dalias The thing is, if products are redesigned so they won't work without the LLM (like putting a coin-operated toaster in every car and designing the electrical system to not work without it), then whether or not we actually *want* toast will be largely irrelevant.
-
Jason Gorman (jasongorman@mastodon.cloud)'s status on Tuesday, 14-Jan-2025 18:37:30 JST:
@dalias And then expect productivity to double because they read it in Forbes
-
Jason Gorman (jasongorman@mastodon.cloud)'s status on Tuesday, 14-Jan-2025 18:33:06 JST:
LLMs are a near-perfect rentier technology. The cost of training them to the point where they're even marginally useful is so prohibitive that only people with very deep pockets can do it. The goal then is to make us all reliant on them, by shoehorning them into as many products as possible.
And it doesn't matter whether or not we actually need them if products won't work without them.
-
Jason Gorman (jasongorman@mastodon.cloud)'s status on Sunday, 12-Jan-2025 06:54:21 JST:
There's a small window of opportunity here for software engineers at Meta where they might be able to get another job in this industry.
Engineers at X? That ship's sailed, I'm afraid. But presumably the concentration camps will need IT.
-
Jason Gorman (jasongorman@mastodon.cloud)'s status on Friday, 10-Jan-2025 17:26:09 JST:
The "Here's a list of people I admire who just happen to have lots of followers" LinkedIn post is a classic of the genre.
-
Jason Gorman (jasongorman@mastodon.cloud)'s status on Wednesday, 08-Jan-2025 15:57:30 JST:
Weirdly, when I suggested to the same executives who are currently pouring $gazillions into "A.I." that they could achieve bigger productivity gains by investing a fraction of that in their people and teams, they were like "Meh. We're good, thanks." It's almost as if it has nothing to do with productivity...
-
Jason Gorman (jasongorman@mastodon.cloud)'s status on Saturday, 04-Jan-2025 18:21:34 JST:
I've thought a lot over the years about what makes "work" feel like work for me.
Mindless/reasonless? Yep. Not a fan.
Repetitive? Sure. That's a chore.
Hard? It's better when it's easy, so I'll try to make it easier.
But I think by far the biggest factor in whether something feels like "work" is when someone is telling me what to do. Suddenly, I don't feel so much like doing it.
Weird, right?
-
Jason Gorman (jasongorman@mastodon.cloud)'s status on Saturday, 04-Jan-2025 18:21:33 JST:
That's an actual thing, though. Search for "autonomy and intrinsic motivation"
-
Jason Gorman (jasongorman@mastodon.cloud)'s status on Saturday, 04-Jan-2025 18:21:32 JST:
"But Jason, how do we stop people from doing things that work against the interests of the organisation?" Well, off the top of my head, NOT REWARDING THEM FOR DOING THAT might be a good start. If incentives are aligned with organisational goals, autonomous people will tend to find ways to give you what you want.
-
Jason Gorman (jasongorman@mastodon.cloud)'s status on Monday, 30-Dec-2024 18:10:58 JST:
Playing GPT-x at chess reveals a major limitation of Large Language Models: a complete lack of dynamic reasoning.
So many tasks require the ability to plan ahead and evaluate potential outcomes (e.g., is this a good chess move or a bad chess move?)
It's arguably the whole point of intelligence.
This is coupled with a complete lack of temporal reasoning. LLMs have no sense of history or ordering of events.
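A minimal sketch, in Python and purely illustrative (none of it is from the post), of the kind of look-ahead being described: a minimax search scores a candidate move by simulating the opponent's best replies through to an outcome, which is the sort of evaluation next-token prediction doesn't perform. The game is a toy (Nim: take 1-3 sticks, whoever takes the last stick wins) rather than chess, so the search can run to completion.

    # Toy illustration: "is this a good move or a bad move?" answered by
    # searching the game tree, not by pattern-matching on the prompt.

    def legal_moves(sticks):
        return [n for n in (1, 2, 3) if n <= sticks]

    def minimax(sticks, maximising):
        # Terminal state: the player who has just moved took the last stick and won.
        if sticks == 0:
            return -1 if maximising else +1
        scores = [minimax(sticks - m, not maximising) for m in legal_moves(sticks)]
        return max(scores) if maximising else min(scores)

    def best_move(sticks):
        # Score each candidate move by looking ahead at the opponent's best replies.
        return max(legal_moves(sticks), key=lambda m: minimax(sticks - m, False))

    print(best_move(7))  # -> 3: leaves a pile of 4, a losing position for the opponent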
-
Jason Gorman (jasongorman@mastodon.cloud)'s status on Saturday, 28-Dec-2024 18:31:00 JST:
"Can Europe build a $1 trillion company?"
No. Because market caps in Europe are usually based on actual business performance.
There are a bunch of European businesses with revenues larger than those of some of these US $trillion companies, though.
-
Jason Gorman (jasongorman@mastodon.cloud)'s status on Monday, 23-Dec-2024 07:25:49 JST:
Off to Cornwall tomorrow for a... er... Gorman Xmas. 30 minutes' walk from the cottage hospital where Mum died a decade ago, weeks after being diagnosed.
I've never been back, and with good reason. Took me a year to get back on the horse. I don't know WTF Dad's thinking. But I'm going to find out.
-
Jason Gorman (jasongorman@mastodon.cloud)'s status on Friday, 20-Dec-2024 07:12:21 JST:
A useful statistic would be what % of AI experts were once estate agents. Second one I've seen today.
-
Jason Gorman (jasongorman@mastodon.cloud)'s status on Friday, 20-Dec-2024 02:04:57 JST:
The conversation with A.I. stans typically goes like this:
Me: "I'm not happy with the results I've been getting with the coding assistant. Often needs considerable coaching and then rework."
Them: "Oh, you must be using it wrong."
Me: "What's the right way to use it?"
Them: * describes exactly what we've been doing *
Me: "That's what we've been doing, and the results are poor" * shows them examples *
Them: "Okay, but does the hotel booking *have* to be in the future?"
-
Jason Gorman (jasongorman@mastodon.cloud)'s status on Friday, 20-Dec-2024 02:04:57 JST:
Every time I pull on the "Oh, you must be using them wrong" thread with "A.I." coding assistants, it seems to lead to a lower quality bar. That's the "prompt engineering" secret sauce, apparently. Care less.
I did actually say, over a year and a half ago, that I was afraid this is how Gen A.I. would play out. I wasn't afraid that they'd be able to do my job; they clearly can't.
What I was afraid of was that execs and investors wouldn't care that they can't.
-
Jason Gorman (jasongorman@mastodon.cloud)'s status on Friday, 20-Dec-2024 02:04:56 JST:
If only we hadn't spent the last 70 years training customers to accept software that's hard to change and doesn't really work anyway
-
Jason Gorman (jasongorman@mastodon.cloud)'s status on Thursday, 19-Dec-2024 17:24:34 JST:
Fun fact: "Agile Software Development" is 77% "Software Development"
-
Jason Gorman (jasongorman@mastodon.cloud)'s status on Monday, 16-Dec-2024 10:02:41 JST:
@thirstybear What really irks me is the opportunity cost - economic and social - of the tech industry focusing all their attention on science fiction while real problems that this kind of investment in time, money and resource *could* solve right now go unattended.
-
Jason Gorman (jasongorman@mastodon.cloud)'s status on Sunday, 15-Dec-2024 05:51:55 JST:
@aral Unless you have $44 billion to hand, of course