AI investments vs. end-user consumer spending

Earlier today, I posted an article from the Wall Street Journal that laid out the circular deals AI companies are making with each other.

I followed up to see how actual end-user consumer spending compares with the AI build-out spending.

The AI companies are buying and selling actual goods and services to each other as well as investing in each other.

Nvidia and AMD sell chips. Oracle sells computers. CoreWeave is an AI-cloud infrastructure company that rents out data-center capacity. CoreWeave’s biggest customer is Microsoft, which is an investor in OpenAI, shares revenue with OpenAI, buys chips from Nvidia and has partnerships with AMD.

When Morgan Stanley analysts in an Oct. 8 report mapped out the AI ecosystem to show the capital flows between six companies — OpenAI, Nvidia, Microsoft, Oracle, Advanced Micro Devices and CoreWeave — the arrows connecting them resembled a plate of spaghetti.

The problem is that they are selling goods and services and making capital deals with each other. Investments into AI infrastructure, development and business-to-business spending are expected to be orders of magnitude higher than end-user consumer spending on AI services.

According to Google Gemini (which is based on an article I read last week so I know it’s true):

One analysis estimates that new AI data centers built in 2025 will incur $40 billion in annual depreciation while generating only $15 to $20 billion in total revenue (combining both enterprise and consumer) for the owners. The revenue would have to grow by tenfold or more just to cover the depreciation and achieve a target return on capital…

The AI build-out is so massive that the AI-related capital expenditures (i.e., the cost to build the capacity) have been reported to have contributed more to U.S. GDP growth in the first half of 2025 than all traditional consumer spending combined.

This illustrates a fundamental economic dynamic: the current growth is fueled by companies spending huge sums on infrastructure (CapEx), not by the economic activity generated by end-user purchases (consumption)… [end quote]
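
To make the quoted arithmetic concrete, here is a minimal back-of-envelope sketch in Python. The depreciation and revenue figures come from the quote; the margin, invested capital, and target return are my own illustrative assumptions:

```python
# Back-of-envelope on the quoted analysis. Depreciation and revenue
# come from the quote; margin, capital, and return are assumptions.
depreciation = 40e9        # $40B annual depreciation (from the quote)
revenue = 17.5e9           # midpoint of the $15-20B revenue estimate

print(f"Revenue covers {revenue / depreciation:.0%} of depreciation alone")

margin = 0.5               # assumed share of revenue left after operating costs
invested_capital = 300e9   # illustrative build-out capital, not from the quote
target_return = 0.10       # assumed target return on capital

required = (depreciation + target_return * invested_capital) / margin
print(f"Required revenue: ${required / 1e9:.0f}B, or {required / revenue:.0f}x the estimate")
```

Under these assumptions, the required multiple lands in the same ballpark as the quote’s “tenfold or more.”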

It’s worth reading my questions and Gemini’s answers in full.
https://gemini.google.com/app/5b59f7597eea1988

The stocks of the tech companies in question are all grossly overvalued by traditional standards, even if all their revenues reflected actual end-user spending. The investments in AI resemble the investments made in internet infrastructure before the dot-com bubble burst in 2000.

This is very scary to me. If (when) the bubble collapses, it will drag down GDP in addition to the stock market.
Wendy

9 Likes

Don’t worry. We’re being told constantly that “this time is different” …

6 Likes

Maybe you are on to something.

GDPNow has Q3 growth at 3.9% and consumer spending (PCE) at 3.3%, which seems really high: the labor market is weak, and the consumer doesn’t look strong enough to support that growth.

I’m skeptical of that 3.3% number, but I haven’t researched it; maybe it really does reflect the consumer.
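
For what it’s worth, the contribution arithmetic is easy to check. Here is a minimal sketch; the PCE share of GDP is an approximate assumption on my part, not a GDPNow figure:

```python
# Rough contribution-to-growth arithmetic. The PCE share of GDP
# is an approximate assumption, not taken from GDPNow.
gdp_growth = 0.039    # GDPNow Q3 real GDP growth estimate
pce_growth = 0.033    # GDPNow PCE growth estimate
pce_share = 0.68      # PCE is roughly two-thirds of U.S. GDP (assumed)

pce_contribution = pce_growth * pce_share
print(f"PCE contributes ~{pce_contribution * 100:.1f} pp of {gdp_growth * 100:.1f}% growth")
print(f"Everything else (incl. AI capex): ~{(gdp_growth - pce_contribution) * 100:.1f} pp")
```

Under that assumption, PCE accounts for roughly 2.2 points of the 3.9%, so the rest of the growth has to come from somewhere else, such as the AI capex discussed above.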

1 Like

Agreed. And… with margin debt pumping up AI stocks, the pop will be wide-ranging and potentially devastating in the short term.

1 Like

The Federal Reserve’s data (linked below) is a bit old, but it shows a trend of household debt levels continuing to rise. It’s quite possible for spending to temporarily outpace income — at least until the loan payments start coming due.

That said, I personally have been spending well above my income level this summer and fall:

  • Two kids’ college costs — paid for from the 529 accounts that I had been investing for them for their entire lives.
  • A new-to-me (but still used) plug-in-hybrid SUV — paid for from savings / the “car fund” bucket that I have been contributing to since I paid off my last car loan many years ago.
  • Rooftop solar panels — paid for from a combination of the “excess returns” on my options investing that would otherwise be invested in bonds and a deferred compensation style bonus I earned 3 years ago and that matured earlier this year.

Of those costs, none are resulting in me taking on debt. One set (the kids’ educations) is a key life goal. The other two sets (plug-in hybrid, solar panels) are actually geared towards lowering my family’s ongoing costs of living.

Of course, I am now officially tapped out… I fully expect my household’s spending will contract next year vs. this year.

Regards,

-Chuck

6 Likes

It would be interesting to see that graph plotted as a percent of household income.

4 Likes

This may or may not be relevant, but here’s a short story of how I got my hand slapped while at Westinghouse for doing something similar. In Pittsburgh I ran a radio station that had a co-owned TV station. (Separately corporately chartered, however.)

That TV station was so dominant that when it came time to do ads for ourselves on TV, I wanted to purchase time on our own TV station. Which I did. It wasn’t a lot, maybe $50,000 or so, but after the flight had run I (and the TV GM) got calls from Westinghouse attorneys upbraiding us for doing so, because we could be accused of “self-dealing” (not sure that was the word, but it describes the accusation).

Now there was going to have to be a notation in the Westinghouse Annual Report to the effect that the company had purchased inventory from itself, so to speak, and that was annoying because, you know, big corporation.

[How big? Westinghouse Broadcasting was the 4th largest broadcaster in the country, behind the three 3-letter networks. We had radio, TV, a production company, satellite backhaul for most cable companies, and an agglomeration of other odd businesses (Muzak!) and the like.

And we, the 4th largest broadcaster in America, were 3% of Westinghouse’s total sales. We were nothing, a fly speck in the corporate ledger, and now there would have to be a special notation in the AR because I bought time on the only TV station in town worth buying time on.]

That, I suppose, was “circularity”, obviously paling in comparison with what’s going on now with all this AI self-dealing. Ah well, it’s not like the geniuses running these places could do something wrong, is it?

I’m asking. Is it?

5 Likes

They tell us people are paying fees for ChatGPT. ChatGPT tells us service is limited by data capacity. Google and others are supported by advertising. Cash is rolling in. How big can it get?

Overbought? Overpromised? We shall see. ChatGPT plans to dominate. How will they do vs. the competition? And how many specialized AI services will succeed? Legal, scientific, and probably many more are likely. I see much potential for growth. Will they be profitable?
Who can afford to do this stuff by hand when fast, efficient services are available?

1 Like

AI chips quickly wear out with use, and so the depreciation (and cost) is real. Revenue will increase for some companies as better use cases emerge. For example, internet search is changing. Google search, once the clear leader, now has many AI competitors, and even Google is pushing people to switch to AI search. (Google oddly keeps asking me to complete a CAPTCHA for simple searches, and so I have switched to DuckDuckGo for simple searches.)

I am being encouraged to freely use AI, and so it seems companies are searching for the killer AI app. Companies in general are strongly encouraging their employees to use AI. There will be winners and losers in this technology race, but the AI revolution is real. Advertising is paying for some of this data center build-out.

Projections for 2025:
Global GDP $110 trillion
Global advertising $1.2 trillion
Global Data Center Spending $0.5 trillion (expected to double in 5 years)

Companies with over $10B in Net Digital Ad Revenues Worldwide 2025:
Alphabet (Google/YouTube): $209 billion
Meta (Facebook, Instagram, WhatsApp): $184 billion
Amazon: $69 billion
ByteDance (TikTok, Douyin): $61 billion
Alibaba: $42 billion
PDD Holdings (Temu, Pinduoduo): $30 billion
Microsoft (Bing, LinkedIn): $17 billion
Tencent: $17 billion
Kuaishou: $12 billion
Apple: $12 billion
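
Summing that list against the 2025 projections above gives a rough sense of scale. A quick sketch (all figures as listed; the comparison is just arithmetic):

```python
# Scale check using the projections and ad-revenue figures listed above.
ad_revenues_b = {                 # net digital ad revenue, $B, 2025
    "Alphabet": 209, "Meta": 184, "Amazon": 69, "ByteDance": 61,
    "Alibaba": 42, "PDD": 30, "Microsoft": 17, "Tencent": 17,
    "Kuaishou": 12, "Apple": 12,
}
total = sum(ad_revenues_b.values())   # $653B across the ten companies

print(f"Top-10 digital ad revenue: ${total}B")
print(f"Share of the $1.2T global ad market: {total / 1200:.0%}")
print(f"vs. $0.5T global data-center spending: {total / 500:.1f}x")
```

Roughly half of all global advertising already flows to these ten companies, and that pool is only about 1.3x this year’s data-center spending, which supports the point that advertising can fund part of the build-out, but not a doubling of it on its own.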

=== links ===

“18%: The share of the total U.S. GDP represented by the digital economy (vs. 11% in 2020).”

“Google’s dominance in general information searches dropped from 73% to 66.9% in six months, indicating erosion of its market share.
AI tool adoption saw a significant increase, with daily usage more than doubling from 14% to 29.2%.”

5 Likes

A table of household debt to GDP (in percent) by country. The US is #14.

Switzerland    125%
Australia      113
Canada         100
Netherlands     94
New Zealand     90
South Korea     89
Thailand        88
Hong Kong       88
Norway          87
Denmark         85
Sweden          83
UK              76
Malaysia        70
United States   68
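
To connect this with the earlier question about plotting debt against household income: debt-to-income is just debt-to-GDP divided by income’s share of GDP. A minimal sketch, where the income share is my illustrative assumption:

```python
# Convert debt-to-GDP into debt-to-income. The ~75% household income
# share of GDP is an illustrative assumption, not from the table.
income_share_of_gdp = 0.75

for country, debt_to_gdp in [("Switzerland", 1.25), ("United States", 0.68)]:
    print(f"{country}: ~{debt_to_gdp / income_share_of_gdp:.0%} of household income")
```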

DB2

4 Likes

Bob,

I’m not sure what point you were making. (Perhaps this was simply a reference with no point of view?)

When I looked at it, I immediately thought: Is it the denominator or the numerator that matters?

This led me to look deeper into the tables, where I found that Switzerland has very low government debt to GDP numbers:

Switzerland Government Gross Debt to GDP

The information linked above suggests that the table may be answering the wrong question.

Regardless, I’m not sure this content is DIRECTLY on topic for this thread.

3 Likes

What???

The chips don’t wear out; rather, when more powerful next-generation chips arrive, people use them instead of the older generation. So companies that lease these chips cannot use a standard 5–7 year depreciation cycle.

The post is riddled with inaccuracies and fallacies.

2 Likes

How many times in the past have you seen a company launch a product and, within 2 years, achieve an over-$12B run rate and 800M active users?

Those who are blindly naysaying don’t get it… a bubble phase requires much higher valuations, and current valuations are within normal historical levels.
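
A quick sanity check on those two figures, which together imply a fairly modest revenue per user:

```python
# Implied annual revenue per active user from the figures above.
run_rate = 12e9    # $12B annualized revenue run rate
users = 800e6      # 800M active users

print(f"Implied revenue per active user: ${run_rate / users:.0f}/yr")
```

That works out to about $15 per user per year, so the growth story rests on pushing monetization up from a low base.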

2 Likes

This is how most folks react to data that is not aligned with their internal feelings. It takes a lot of effort to just follow the data, especially when our “what it should be” is very different from “what it is”.

At the least, entertain the idea that you are early and need to be patient for “what it is” to get to your “what it should be”.

2 Likes

Absolutely they do. From perplexity.ai:

Yes — large-scale AI users can wear out GPU chips prematurely due to sustained heavy use, especially when GPUs operate near their thermal and electrical limits for long periods. In modern AI data centers, this happens quite often because generative AI workloads run continuously at very high utilization rates.

Why AI Workloads Stress GPUs

Datacenter GPUs like NVIDIA’s H100 and Blackwell series regularly operate at 700–1,000 watts each. Sustained maximum utilization imposes stress in several ways:

• Thermal cycling: Constant heat (often 80–90 °C on-die) accelerates material fatigue and causes microscopic cracking of solder joints and interconnects.
• Electromigration: High current densities over time cause gradual atom migration in the chip’s metal layers, degrading conductivity.
• Memory degradation: High-bandwidth memory (HBM3/4) and GDDR6X modules experience wear from continuous access and thermal exposure.
• Power delivery wear: Voltage regulation and capacitors near GPUs are strained by fluctuating loads typical of AI training bursts.

These mechanisms collectively shorten GPU service life, especially under continuous 24/7 operation typical of AI clusters.

Reported Lifespans in AI Data Centers

A principal AI architect at Alphabet reportedly stated that data center GPUs operated at 60–70% utilization typically last 1–3 years, depending on cooling and environment. This is far shorter than the 5–7 years expected under normal workstation use. The shortened life is not because GPUs “burn out” instantly but because mean time between failures (MTBF) drops significantly when thermal and electrical stress remain constant.
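
To see what a shorter service life means at cluster scale, here is a rough failure-rate sketch; both MTBF figures below are illustrative assumptions, not measured data:

```python
# Expected GPU failures per year in a large cluster, for two assumed
# MTBF values. Both MTBF figures are illustrative, not measured.
HOURS_PER_YEAR = 24 * 365
cluster_size = 100_000                 # GPUs (illustrative)

for mtbf_hours in (50_000, 20_000):    # lighter duty vs. hard-driven (assumed)
    failures = cluster_size * HOURS_PER_YEAR / mtbf_hours
    print(f"MTBF {mtbf_hours:,} h -> ~{failures:,.0f} GPU failures/yr")
```

Even a modest drop in MTBF turns into tens of thousands of swapped cards a year at that scale.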

9 Likes

The entire point of a depreciation cycle is not that the chips wear out. Even assuming you are using the chips 24/7, a typical H100 will last at least 5 years, and if you are using it exclusively for pre-training it will last 7 years. So when CoreWeave uses 7 years, which is aggressive, it is technically correct; the real depreciation issue arises because of newer chips, not because the chips have lost their useful life.

Let me give you an example. AWS depreciates Intel chips (not GPUs, but CPUs, which are not necessarily heavily used) over 5 years. They used to do that over 6 years and switched to 5. Why? Because they were actually replacing the old servers at that rate, not because the chips had started wearing out.

Wearing out is very different from economic life. An even better example: companies typically use 5 years for depreciating cars. Is that the economic life of a car? Absolutely not.
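
Either way, the assumed service life is what drives the annual charge. A minimal straight-line sketch (the $30,000 per-GPU price is an illustrative assumption):

```python
# Straight-line depreciation under the lifespans debated in this thread.
# The $30,000 per-GPU price is an illustrative assumption.
gpu_cost = 30_000

for years in (3, 5, 7):
    print(f"{years}-yr life: ${gpu_cost / years:,.0f}/yr per GPU")

# The gap between a 3-year and 7-year assumption across a 100,000-GPU cluster:
cluster_size = 100_000
delta = (gpu_cost / 3 - gpu_cost / 7) * cluster_size
print(f"3-yr vs. 7-yr assumption: ${delta / 1e9:.1f}B/yr difference")
```

Whether the driver is wear or obsolescence, moving from a 7-year to a 3-year life more than doubles the annual charge on the same hardware.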

2 Likes

From my previous life, I am intimately familiar with the depreciation math; the hardware refresh cycle is one of the best use cases for migrating workloads to the cloud.

No. It won’t. You aren’t listening. These chips DO WEAR OUT when used in these use cases. Read the above again. Slower this time if you need to.

3 Likes

In my current life I work for (drumroll) Nvidia…

3 Likes

You are relying on some internet search result… I lived that math… I know what I am talking about. It is a part of every TCO conversation…