LLMs are a near-perfect rentier technology. The cost of training one to the point where it's even marginally useful is so prohibitive that only people with very deep pockets can do it. The goal, then, is to make us all reliant on them by shoehorning them into as many products as possible.
And whether we actually need them doesn't matter if the products won't work without them.