Current neural-net "AI" implementations, with their huge computing and data-center power needs, are horribly inefficient compared to the human brain, which runs on about 20 watts.
It is not each operating AI out in the field that needs those gigantic, multi-gigawatt computing centers. That power is for TRAINING the AI. Once the AI is trained, the result of the training can be copied and used in MILLIONS (or BILLIONS, or TRILLIONS) of instances, performing many OOMs (orders of magnitude) more operations than the training run itself. So Tesla's big training center will at some point train a driving AI that finally does what is needed. That trained AI will be copied onto many MILLIONS of cars, with each car spending something like 100 watts to run its self-driving computer, and that is with CURRENT technology. To put 100 W into context, you could run the AI for about 8 cars on the power needed to run one blow-dryer (a sketch of the arithmetic follows below).
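Here is a minimal back-of-envelope sketch of that arithmetic in Python. Every figure is an illustrative assumption, not a measured number: a 100 W in-car computer, an ~800 W blow-dryer on a low setting, a hypothetical ~1 GW training cluster running for 90 days, and a hypothetical 5-million-car fleet.

    # Back-of-envelope check of the training-vs-inference claim.
    # All figures below are illustrative assumptions, not measurements.

    CAR_AI_POWER_W = 100        # assumed draw of one car's self-driving computer
    BLOW_DRYER_POWER_W = 800    # assumed blow-dryer on a low setting

    print(BLOW_DRYER_POWER_W / CAR_AI_POWER_W)  # -> 8.0 cars per blow-dryer

    TRAINING_CENTER_POWER_W = 1e9   # hypothetical ~1 GW training cluster
    TRAINING_DAYS = 90              # hypothetical length of one training run
    FLEET_SIZE = 5_000_000          # hypothetical fleet running the trained model

    training_energy_wh = TRAINING_CENTER_POWER_W * TRAINING_DAYS * 24
    print(training_energy_wh / FLEET_SIZE / 1000)  # -> 432.0 kWh per car

Even under those generous assumptions, the one-time training cost amortizes to a few hundred kWh per car, on the order of a handful of full charges of a typical EV battery, spread over the car's whole life.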
As for the other applications of AI: the ChatGPT you reach from your laptop or phone uses the equivalent of a few minutes at 20 W to answer a query. In my experience, the answers it provides would have taken me 10 to 15 minutes to produce. So I think ChatGPT is already more efficient than human intelligence.
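As a minimal sketch, taking the post's own rough figures (a query equivalent to ~3 minutes at 20 W, versus a human working 10 to 15 minutes at the brain's 20 W):

    BRAIN_POWER_W = 20  # rough power draw of a human brain, per the post

    chatgpt_wh = BRAIN_POWER_W * (3 / 60)      # "a few minutes at 20 W" -> 1.0 Wh
    human_wh_low = BRAIN_POWER_W * (10 / 60)   # 10-minute human answer -> ~3.3 Wh
    human_wh_high = BRAIN_POWER_W * (15 / 60)  # 15-minute human answer -> 5.0 Wh

    print(chatgpt_wh, human_wh_low, human_wh_high)

Under these assumptions, the query comes out roughly 3x to 5x more energy-efficient than a human producing the same answer.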
And we are just barely at the beginning.