When I first started programming in the late 90s, everybody cared (at least a little bit) about performance & resource use.
Today, I have to explain to junior devs that they should use an O(n) (or better) algorithm instead of an O(n^3) one -- on a web app (so, a system that uses a text-based protocol & 3 different languages to remote-edit a rich text document in order to simulate the OS's built-in widgets) -- & why we don't need to spin up a whole VM for their 10-line Python script.
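To be concrete about the kind of gap I mean, here's a toy illustration I made up for this post (not something from a real code review): finding the values common to three lists. The triple loop does cubic work; building sets once does roughly linear work.

    def common_cubic(a, b, c):
        # Triple nested loop: roughly O(n^3) comparisons for lists of length n.
        out = set()
        for x in a:
            for y in b:
                for z in c:
                    if x == y == z:
                        out.add(x)
        return out

    def common_linear(a, b, c):
        # Build each set in one pass & intersect them: roughly O(n) overall.
        return set(a) & set(b) & set(c)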
Weirdly, this change seemed to happen right around the time new machines stopped getting twice as beefy every 18 months & people started moving more and more of their computing to dinky, resource-strapped pocket devices.