Anticipating a reversal in AI/semiconductors

Why do you think that?

Nvidia revenue for the financial year 2021: 26.9 B.
Gross profit: 17.5B.
EPS: 3.91

Nvidia revenue for the financial year 2022: 26.9 B.
Gross profit: 15.3B.
EPS: 1.76

I can’t address your question, because your premise that growth is ‘doing well’ is contrary to reality, as shown in the accounts.

NVIDIA had a spectacular year during covid because of lockdown spending on gaming PCs, and a massive crypto bubble (which is now in reverse and causing used GPUs to be dumped cheaply on the market).

That’s not well-managed growth; that’s a company finding a lottery ticket on the ground. They responded to it by raising prices and margins to the maximum, screwing their customers, and killing the golden goose of gaming.

Now Intel is competing with them hard for market share in PC gaming and shortly in AI accelerators too (I have no position in Intel). A huge number of PC gamers have migrated to consoles and handhelds instead (almost all of which use AMD GPUs). (I have no position in AMD).

Having said that, I think NVIDIA will have a bumper year in 2023 from datacenter sales, offset by weak GPU sales, unless they come out with something radical* or recession arrives earlier than expected. I think they will once again be left with a glut of overpriced, rapidly aging inventory they cannot shift (currently they are placing rush orders with TSMC). And I think Intel, AMD, and various cloud offerings will benefit the following year, having newer and hopefully better inventory to offer, based on more modern production.

In any event it will be hard to guess NVIDIA’s earnings going forwards, because when hype/mania surrounds a company to this degree, it becomes in the C-level’s interest to find ways to ensure the figures in the books look as good as possible in the current year, regardless of what’s happening on the ground, or what happens to the long term.

  • ‘radical’ would be e.g. an equivalent to the 1080 Ti, 2070 Super, or 3080, each of which was regarded as an excellent ‘great value / high-end’ card of its era. A well-priced 16GB 4070/4080-class card, supplied in great quantities, could achieve this: e.g. taking unsold/poorly binned AD103 chips and giving them 16GB of RAM for $1000, or repricing the 4080 down to that level, would make a quite popular premium product. At the low end they really need something to replace the 3060 12GB, which is the most popular gaming card registered on Steam (the biggest gaming platform). The freshly announced 4060 Ti 16GB has the potential to do it, but the price is too high. It needs to be at least $100 less, maybe even $150.

However NVIDIA may be sniffing their own AI/datacenter farts to such an extent they refuse to ‘waste’ good chips or concede margin in gaming by offering fair valued consumer cards again. In fact I’m getting the feeling NVIDIA, having been nursed to greatness by gaming consumers for over 20 years, suddenly doesn’t give much of a damn about them at all.

In which case, AMD/Intel will be given further opportunity to catch up and gain market share. If NVIDIA concede gaming sales to Intel/AMD in order to chase present business AI spend, they will see a short-term boost to profits, being one of the few games in town for business AI, but their sales/profits will become more cyclical, aligned with the business IT cycle. Currently they benefit from not being specifically aligned with either consumer or business spending cycles.

There’s also an issue of brand. Gamers, datacenter techies, and AI researchers are the same people! That’s a big part of why so many gaming GPUs ended up in datacenters so easily, and why AI researchers had easy access to powerful accelerators: they are gamers. NVIDIA is currently annoying gamers such as myself with their recent greed across product lines. I’ll consciously try to put money toward Intel/AMD as a result, in both my personal and professional purchasing; good value, and concern for the customer base, are things I pay attention to. And I know most other techies feel the same way. It’s not like buying a car.

In short:

2023: good year

2024-2026: no idea; it could very easily be a re-run of the bubble/crash seen across the 2020/2021/2022 financial-year reports, or it could be steady growth.

long-term: every company in the world with any capability to make tensor chips is currently rushing to make them, and the hype that is boosting NVIDIA today may be its doom in 5-10 years’ time as something new and hotter comes along.

As NVIDIA replaced Cisco/Intel as the key datacenter spend, another, even more AI/tensor-calculation-specialised company may come along to replace NVIDIA in turn. Or we may see a ‘back to the cloud’ shift, with better GPU cloud offerings.

3 Likes

https://www.cnet.com/tech/gaming/best-graphics-card/

GT 1030?
RTX 3050?
RTX 4070?

beyond baffling.

The only thing the RTX 4070 is good value for, is a space-limited ITX build.

I see there is no shortage of affiliate links though on that ‘article’.

Well, I had never heard of anyone using CNET to make a decision for a gaming system. Now I have. Thank you for sharing it, but I don’t see any reason to care for their awards; their reviews look pitiable compared to anything I’ve read on any other technology site in the last 5 years.

No mention of Intel GPUs anywhere, even though they’ve almost caught up with AMD after just one year. (AMD took 25 years and has achieved 9% of the market today; Intel are at 6% of the market, after 1 year).

I would recommend browsing /r/hardware, a technology subreddit for computer hardware news, reviews and discussion, which links out to about 10 interesting, detailed and reputable gaming-hardware review sites, as well as another 10 or so YouTube-based reviewers.

Instead of 3-4 sentences of text on each card, at any reputable site you’ll find about 20-30 pages of detailed commentary and testing per card, covering real-world and simulated workloads, as well as issues such as heat, fan design, dimensions/shaping (as it affects suitability for builds), quality/reliability of warranty and aftersales, power consumption, power spiking, driver issues in various top games, total cost of ownership including energy efficiency (power isn’t free), etc.

Crucially, the right card to choose depends on what games you tend to play as much as anything else.

2 Likes

Also, re: ‘nvidia’s growth’ that everyone on the thread is talking about vaguely without reference to any actual data (inexplicably).

Not only do they look terrible comparing full-year 2022 vs full-year 2021.

They also look terrible comparing the most recent quarter to the same quarter a year ago, DESPITE all the AI hype of the last 6 months and rushed spending by customers.

https://finance.yahoo.com/quote/NVDA/financials?p=NVDA

total revenue 4/29/2022: 8.3bn
total revenue 4/29/2023: 7.2bn

growth?

gross profit 4/29/2022: 5.4 bn
gross profit 4/29/2023: 4.6 bn

growth?

hmm, yes, I can totally see why a price to sales of 35x (!!!) on Friday would be justified, or a PER of 250x; that really is quite an amazing ‘growth’ story, whether you look at the full year or quarter vs quarter.

even if they were growing - and they’ve not been growing - there is no amount of proposed growth that would justify a P/S ratio of 35x for a mature global company that has saturated its markets.

still, maybe there’s a case for dividend investors? at their current level of dividend payment, today’s share price would be repaid to the faithful dividend investor in about 800 years, after taxes.
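For anyone who wants to run that payback arithmetic with their own numbers, here is a minimal Python sketch; the share price, dividend, and tax-rate figures below are placeholders for illustration, not NVIDIA’s actual ones:

```python
def payback_years(share_price: float, annual_dividend: float, tax_rate: float) -> float:
    """Years for after-tax dividends to repay the purchase price,
    assuming the dividend never changes (a big assumption)."""
    after_tax = annual_dividend * (1 - tax_rate)
    return share_price / after_tax

# Placeholder figures for illustration only:
print(round(payback_years(share_price=400.0, annual_dividend=0.16, tax_rate=0.15)))
```

Whatever the exact inputs, the point stands: once the after-tax dividend is tiny relative to the share price, the payback horizon runs to centuries.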

4 Likes

Why do I think NVDA growth has been doing well? Because of my personal experience. I took a position in the stock some 5 years ago. Five years ago the revenues for the trailing four quarters were $9.6 billion. Now the last four quarters totaled $25.8 billion, somewhere between a double and a triple. That is a growth rate north of 20% – maybe not Saul territory, but certainly “doing well”.
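The growth rate quoted above can be sanity-checked with a one-line compound-annual-growth-rate calculation (the figures are the trailing-four-quarter revenues from the post, in $bn):

```python
def cagr(start: float, end: float, years: float) -> float:
    """Compound annual growth rate between two figures."""
    return (end / start) ** (1 / years) - 1

growth = cagr(9.6, 25.8, 5)  # $9.6bn -> $25.8bn over 5 years
print(f"{growth:.1%}")  # ~21.9%, i.e. 'north of 20%'
```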

DB2

3 Likes

The company’s Mcap is far too big to use Saul’s approach for comparison. That is part of the Mcap problem: there can be a downside.

Like when Msft and then Cisco hit the $1 tr Mcap size. At that time there could only be a downside.

2 Likes

Probably true. At the same time, if the growth rate slows from >20% to 15% I’ll still be a happy long-term camper. One has to live with semis being cyclical.

DB2

Do NOT take this as direct advice.

Just a thought to think over here and there in life.

No one ever went broke taking a profit.

Except that Cisco never hit a trillion-dollar Mcap, and Msft is well over $2 trillion now.

Andy

1 Like

You are right, it was literally 23 years ago and the headlines were different. The actual issue was that CSCO took a turn at being the most valuable company on earth, as brief as that was.

At its peak in 2000, Cisco stock traded above $79 a share, for a market cap of $546 billion – surpassing Microsoft as the world’s most valuable company and inspiring estimates that it could surpass a $1 trillion valuation. (Sep 23, 2016)

1 Like

The thing about ‘growth’ companies is they usually ‘keep growing’, and NVIDIA clearly hasn’t in the last year (2021-2022 FY, or 2022 Q1 vs 2023 Q1).

But if you want to pick totally arbitrary points in time and use them to totally redefine the concept of growth… well, on the same sort of timescale, fertiliser companies (2018->2022) look like growth companies too. Why not?

But in fact, NVIDIA has been around since the 90s, there’s nothing new about their tech or how it’s being used compared to years ago, there is just a huge hype bubble right now. They are quite cyclical, with only a couple of competitors. Agricultural fertiliser companies are a good point of reference, in that sense…

NVIDIA lucked out to a ridiculous degree in 2020/2021 from crypto madness and from covid lockdowns causing many to build new gaming PCs. They may luck out for another few quarters from this AI hype too.

After that? The datacenter budget for the next few years will be spent, and there will be a downturn in both consumer GPU and datacenter GPU/TPU spend - plus new TPU competitors will be along, promptly, and perhaps also a recession.

2 Likes

Curious did NVDA luck out in your opinion from Crypto mining?

That cycle will possibly start up by the end of this year, into another bull run.

I do get from what you are saying that this time AMD and Intel would eat much of that lunch, since crypto is not gaming.

1 Like

That is why I asked upthread about your opinion for the next 3-5 years. Thank you.

I may trim some as Leap suggested, but as I get older I find myself more like my late father-in-law who basically never sold anything.

DB2

2 Likes

Hey DrBob, well done. You did a great job, and while cyclical, I do think NVDA will have a great run for the next few years. But look at this chart.

That purple line is the 30 MA; see how the price has been bouncing off of it since the first of the year? Some people would expect it to come down and hit it again: a reversion to the mean. But this is investing, and anything could happen tomorrow, right?
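For readers unfamiliar with the indicator: a 30 MA is simply the average of the last 30 closing prices, recomputed each period. A minimal sketch, using made-up prices rather than NVDA data:

```python
def moving_average(closes: list[float], window: int = 30) -> list[float]:
    """Simple moving average: mean of the last `window` closes,
    starting from the first day a full window exists."""
    return [
        sum(closes[i - window + 1 : i + 1]) / window
        for i in range(window - 1, len(closes))
    ]

# Toy data: 40 days of a steadily rising price.
closes = [100.0 + day for day in range(40)]
ma30 = moving_average(closes, window=30)
# In an uptrend the MA lags below the price; a 'bounce off the 30 MA'
# means the price dips toward that line and then recovers.
print(closes[-1], ma30[-1])  # 139.0 124.5
```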

Andy

1 Like

Andy,

Poor bob’s BP. You are adding to the complications here with TA. The 30 MA? Really?

Seriously, I’d look at the Mcap and judge whether it is worthwhile. Is it an entry point? Probably not, though there’s no certainty.

DrBob isn’t talking about an entry, Leap; he is talking about selling. My problem is that when I trim, I always trim too soon. As far as Mcap goes, that is meaningless for a company that is projecting 100 percent datacenter growth next quarter. But DrBob has a first-class problem, one that might just be solved by sitting on his hands.

Andy

2 Likes

Andy Andy Andy,

There is no such thing as too soon. There are only opportunity costs.

Meaning if you have cash at the right time for your next investment that matters.

Like going hunting for turkey. You have to track ’em.

Would the CEO attempting a Crysis joke and AI karaoke be considered a sell signal?

1 Like

That is why I asked upthread about your opinion for the next 3-5 years. Thank you. I may trim some as Leap suggested, but as I get older I find myself more like my late father-in-law who basically never sold anything.

You may want to ask investors who trimmed $NVDA last time it reached the mid $300s, not long ago, how they felt in 2022 when the price dropped to $110.

Even at $110, $NVDA’s price/earnings ratio, price/sales ratio etc stood out as abnormal among other SP500 companies.

$NVDA @ $100ish would be an expensive but interesting play.

$NVDA @ $300-400ish may be inviting considerable suffering upon oneself.

6 Likes

Curious about your opinion on this. I am strongly considering using UE to make my game. I hate coding in C++.

The other opinion I need from you, is there a tutorial on making a first game in UE that you recommend?

I would still use Playfab. I do not know if I would still use Microsoft Developer. I doubt it.

I am guessing UE is all-in-one compared to Unity. That is, until I get to working with Playfab.

TIA

1 Like

Best thing to do is to ask on: