AI will use a lot less energy than previously forecast. A lot less hardware than previously forecast. And it will create a lot less revenue than previously forecast.
NVDA is losing sales not because DeepSeek itself will make a dent, but because other developers will find similar ways to make one.
The number of cost-saving maneuvers out there is unknown. There are plenty of them in all likelihood.
The ultimate AI operating system grabs up all the apps, as Microsoft did in days of old, and sells them as a package.
But in the days of "as a server," that might be impossible.
The computational load that was projected to keep growing is now going to see far greater efficiencies than expected.
Will AI advance faster? Yes.
But no killer app so far.
To be honest about the killer-app comment: I do not know that for sure. It is a grey area, and I am repeating, parroting, another person who claims it. If it is wrong, name a killer app or two. But show me the money.
Cheaper-to-train AI models mean lower costs, so more companies will offer AI this and AI that, and more people will use AI to do frivolous things. More energy used.
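The rebound argument above can be sketched with arithmetic. The numbers below are made up purely for illustration: efficiency improves 10x per query, but cheaper access drives 20x more usage, so total energy still grows.

```python
# Hypothetical rebound (Jevons) arithmetic. All figures are invented
# for illustration and do not come from any real measurement.

def total_energy(per_query_kj: float, queries: int) -> float:
    """Total energy consumed = energy per query * number of queries."""
    return per_query_kj * queries

# Baseline: 10 kJ per query, 1 million queries.
baseline = total_energy(per_query_kj=10.0, queries=1_000_000)

# After: 10x more efficient per query, but 20x more queries.
after = total_energy(per_query_kj=1.0, queries=20_000_000)

print(baseline)  # 10000000.0
print(after)     # 20000000.0 -- total energy doubles despite 10x efficiency
```

Whether usage actually grows enough to offset the efficiency gain is exactly the point disputed in the next paragraph.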
We were going that way regardless. You cannot pretend we will use any more than we would have anyway. The theory that we were otherwise holding back does not hold any water.
I have long suspected that IRR planners would quietly shelve the mini nukes. Now that is more than likely in many cases. They do not make economic sense.
Google AI
It appeared to have similar functionality to OpenAI's ChatGPT chatbot, which can do things like write poetry when queried. DeepSeek says its model uses roughly 10 to 40 times less energy than similar U.S. AI technology, a reduction that seemingly would sharply cut the need for energy-gobbling data centers.