The global energy demand for computers doubles every 3 years, while global energy production increases by only 2% each year. We're quickly approaching a situation where people's ability to access Facebook might stop others from being able to heat their homes.
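A rough back-of-envelope sketch in Python of the compound growth behind that claim; the starting share of world energy that goes to computing (10% here) is an assumed placeholder, not a measured figure:

    # Hypothetical starting point: computing consumes 10% of world energy production.
    computing_share = 0.10
    demand_growth = 2 ** (1 / 3)   # demand doubles every 3 years, ~26% per year
    supply_growth = 1.02           # production grows 2% per year

    years = 0
    while computing_share < 1.0:
        computing_share *= demand_growth / supply_growth
        years += 1
    print(f"Demand overtakes total production after ~{years} years")
    # With these assumed numbers, the crossover comes in roughly 11 years.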
-
The way to create more efficient computers is through new computing paradigms like heterogeneous computing and a return to analog computing, as both of these address the major energy problems in current computing: memory access and the digitization of data. However, this requires brand new computing hardware, and from what I've seen the new hardware no longer protects programmers from hardware implementation details. That means a lot of our current code will need to be rewritten. New languages and frameworks will be created, and a lot of current programmers will have to learn about low-level hardware.
Our current computing tech level is probably going to contract in 5-10 years until we can build back up again.
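To put a number on the memory-access point, a hedged back-of-envelope sketch in Python; the energy costs are assumed order-of-magnitude values in the ballpark of commonly cited figures for older process nodes, not measurements of any specific chip:

    # Assumed order-of-magnitude energy costs (ballpark, not measured):
    PJ_PER_FLOP = 1.0         # ~1 pJ for a 32-bit arithmetic op
    PJ_PER_DRAM_BYTE = 100.0  # ~hundreds of pJ to pull a byte from off-chip DRAM

    def energy_pj(flops, dram_bytes):
        # Toy model: total energy = compute energy + data-movement energy.
        return flops * PJ_PER_FLOP + dram_bytes * PJ_PER_DRAM_BYTE

    # Example: dot product of two 1M-element float32 vectors streamed from DRAM.
    n = 1_000_000
    compute = energy_pj(flops=2 * n, dram_bytes=0)       # the multiply-adds alone
    movement = energy_pj(flops=0, dram_bytes=2 * n * 4)  # fetching both vectors
    print(f"compute: {compute / 1e6:.0f} uJ, data movement: {movement / 1e6:.0f} uJ")
    # Under these assumptions, moving the data costs ~400x more than computing on it.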
-
There are a bunch of reasons for this, like IoT and smart devices being always on, and the rise of AI/ML in everything. I've personally witnessed probabilistic ML techniques being more energy efficient than algorithmic techniques, but I guess that doesn't scale. In addition, the energy cost of training AI is insane.
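For a sense of scale on that last point, a tiny back-of-envelope sketch; every figure here (GPU count, power draw, run length, household usage) is a made-up placeholder, not a number for any real training run:

    # Hypothetical training run: all figures are placeholders.
    gpus = 1000            # accelerators used
    watts_per_gpu = 400    # average draw per accelerator, in watts
    days = 30              # length of the training run

    kwh = gpus * watts_per_gpu * 24 * days / 1000
    household_days = kwh / 30  # assuming ~30 kWh per day for an average household
    print(f"~{kwh:,.0f} kWh, enough to power ~{household_days:,.0f} households for a day")
    # With these placeholders: ~288,000 kWh, about 9,600 household-days of electricity.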