NVDA Q2 Earnings

Revenue of $13.51B, up 101% YoY and 88% from Q1. The guide given in Q1 was $11B, so they beat that by 23%.
Data center revenue of $10.32B, up 171% YoY and 141% QoQ.

Outlook for Q3 is $16B, which is an ~18% QoQ guide.
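A quick sanity check on those percentages (a sketch in Python; the quarterly figures are from Nvidia's reports, and any small rounding differences are mine):

```python
# Q2 FY2024 actuals vs. guidance and prior periods (figures in $B)
q2_revenue = 13.507
q1_revenue = 7.192       # prior quarter (Q1 FY2024)
q2_prior_year = 6.704    # Q2 FY2023
q1_guide_for_q2 = 11.0   # guidance given on the Q1 call

beat_vs_guide = q2_revenue / q1_guide_for_q2 - 1   # ~0.23 -> ~23% beat
qoq_growth = q2_revenue / q1_revenue - 1           # ~0.88 -> ~88% QoQ
yoy_growth = q2_revenue / q2_prior_year - 1        # ~1.01 -> ~101% YoY

q3_guide = 16.0
q3_qoq_guide = q3_guide / q2_revenue - 1           # ~0.18 -> ~18% QoQ guide

print(f"beat {beat_vs_guide:.0%}, QoQ {qoq_growth:.0%}, "
      f"YoY {yoy_growth:.0%}, Q3 guide {q3_qoq_guide:.0%}")
```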

All of the chip stocks are up AH: NVDA +9%, SMCI +7%, AMD and TSM +3%, and even AEHR, which is unrelated to AI, up 3%.

Will be interesting to see what they say about supply constraints on the call

I have no position in NVDA, but a substantial one in SMCI

Edit: No idea how long the datacenter wave will last; Bear has pointed this out many times in the SMCI discussions. But it seems like currently it’s still just the beginning.


Just noting that AEHR was up almost 6% at the close, before the additional padding AH. I’m sure NVDA’s results didn’t hurt, since they sit adjacent to the chip market, but AEHR also had an investor day yesterday. Nothing public, just one-on-one meetings with investors. I took the spike today as reflecting success in their meetings yesterday as well as a bit of NVDA joy after hours.



I thought supply was the greatest risk heading into earnings, having seen SMCI get hit by the same issue. I was glad they addressed it to some extent on the call.

"Our supply partners have been exceptional in ramping capacity to support our needs. Our data center supply chain, including HGX, with 35,000 parts and highly complex networking, has been built up over the past decade. We have also developed and qualified additional capacity and suppliers for key steps in the manufacturing process such as CoWoS packaging.

We expect supply to increase each quarter through next year."

However, when pressed to give specific numbers on supply growth, they were unwilling to commit.

I think the current data center wave will last a long-ish time of 4-5 years (FWIW, Jensen Huang says 10 years). As we know, this is a cyclical industry. It seems we are right at the start of the S-curve acceleration. I don’t expect hypergrowth all through the cycle, but it would not be unreasonable to expect hypergrowth for the next 18-24 months.

Moreover, Nvidia has shown over decades that it is more than a simple widget co or a one-trick pony. Many investors view it as a LTBH company and I would agree with that.

Back to the current Data Center wave. If Data Center rev grows 20% QoQ for the next 2 quarters, their Data Center revenue for this year would be roughly $41b.

Jensen Huang has said data center spend is ~$250b annually. Not all of it goes to Nvidia but I’m sure Nvidia will take a larger share of that $250b going forward. At $41b, Nvidia will have 16% market share by the end of this year.

Assuming Data Center rev grows 50% annually in FY2025 and 2026, it would be $92b in 2.5 years’ time, or 37% market share. I don’t think that’s an unreasonable set of numbers to work with.
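The projection above is simple compounding; here's a sketch of the arithmetic (the 20% QoQ and 50% annual growth rates, and the ~$250b market size, are the post's assumptions, not guidance):

```python
# Back-of-envelope Data Center projection (figures in $B, all assumptions from the post)
dc_q1, dc_q2 = 4.28, 10.32      # reported FY2024 Data Center revenue
qoq = 0.20                      # assumed QoQ growth for Q3 and Q4

dc_q3 = dc_q2 * (1 + qoq)       # ~12.4
dc_q4 = dc_q3 * (1 + qoq)       # ~14.9
fy2024_dc = dc_q1 + dc_q2 + dc_q3 + dc_q4          # ~41.8 -> "roughly $41b"

annual_growth = 0.50            # assumed for FY2025 and FY2026
fy2026_dc = fy2024_dc * (1 + annual_growth) ** 2   # ~94 -> "~$92b" on rounded inputs

market = 250.0                  # Jensen Huang's ~$250b annual data center spend
share_now = fy2024_dc / market      # ~16-17% share this year
share_later = fy2026_dc / market    # ~37-38% share in 2.5 years
```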


“Moreover, Nvidia has shown over decades that it is more than a simple widget co or a one-trick pony. Many investors view it as a LTBH company and I would agree with that.”

Nvidia is at the leading edge of compute accelerators. But also, they have established a programming paradigm (CUDA) for their accelerators that has been adopted by large blocks of the industry. That is a moat. Articles say others are chipping away, but for now, they are the leader in that branch of compute acceleration for AI.


Not to get overexcited, but it might be even better than that. The guide for next quarter, with any kind of beat even close to this quarter’s, could mean close to $15b in datacenter. Even if it’s a little under that, you’re looking at an almost $60b run rate (~$70b for NVDA total revenue, since gaming contributes a bit too). Assuming $15b isn’t topped out, but still ramping up…well, maybe we’re talking about close to $18-20b in Q4 this year? Then even if that’s pretty close to topped out, datacenter in f2025 could look like roughly $20b a quarter, or about $80b for the year.

Add in gaming and NVDA could flirt with $100b in revenue next year. And I’m not the only one who thinks so.

Kind of amazing that the low estimate is 47b and the high is 108b of revenue…maybe some analysts haven’t updated their numbers yet. But it seems to me that the average estimate of ~71b will likely be beaten, and as I said, maybe they can leap near 100b. That would mean ~50b of profit next year, so a $1.5 - 2.0 trillion dollar valuation would be pretty reasonable (a PE of 30 or 40).
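The valuation range quoted above is just profit times a PE multiple; a minimal sketch, assuming the post's $100b revenue and ~50% net margin scenario:

```python
# Rough valuation math from the post (figures in $B, all assumptions)
fy2025_revenue = 100.0   # the "flirt with $100b" scenario
net_margin = 0.50        # implied by "~50b of profit"
profit = fy2025_revenue * net_margin   # 50

for pe in (30, 40):
    market_cap = profit * pe           # $1,500B at PE 30, $2,000B at PE 40
    print(f"PE {pe}: ${market_cap / 1000:.1f}T")
```

Note that a 50% net margin is unusually high for a hardware company, which is part of why the multiple the market will pay is the real question.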

But then I worry they’ll have a Zoom-like problem. Maybe after f2025 they level off around 100b in revenue, 50b in profit. I would imagine at this point a PE of 30-40 would seem kinda high. Look at ZM with its PE currently around 15. But that’s what happens when growth all but stops.

You might reasonably say, but Bear, why would NVDA stop growing? Well, because $100b is a ton of revenue and a ton of product. Is it possible demand goes even higher in future years? Of course. But it’s also possible it doesn’t…and at some point even drops back to a more historically normal level. Data (and datacenters) will grow year after year, driving NVDA’s revenue higher…except in extreme times when it will spike and then fall back – and I think we’re clearly in a spike now.

How high the spike will take NVDA’s business and how far back it will settle…that’s what the market is trying to figure out now. NVDA’s market cap is now ~$1.1 trillion; no one is sleeping on this stock. Therefore it also seems unlikely we’ll see a much lower price any time soon.

So I decided to jump in with a small position. It’s amazing to me that NVDA has just more than doubled their datacenter revenue in a quarter, and might be able to double it again and more. Maybe they won’t and the stock will go sideways. But if they do, the market will likely believe their growth has some durability. At some point the market will even overshoot it…and this is just my gut, but I don’t think that has happened yet.

The overall market yesterday and today has kept the share price at bay, but I don’t expect NVDA to fall much after a quarter like they just put up. My small position will spur me to watch carefully, and if for some reason we do see more of a bargain, I’ll be ready to add.

The quarter definitely proved that the story is playing out. Now we just have to see how long the story can last.

If either of you is close (I like the chances that you are), then this company will put on a real show.



@PaulWBryant I’m bullish also. NVDA is supply constrained and they say that they have great visibility into demand. I believe them. I think the peak in this revenue spike will not come within the next 4 quarters. I think at a minimum they see 20% QoQ revenue growth in datacenter for the next few quarters. They might even do 25% or more QoQ growth in some quarters if supply can be ramped more quickly. I started buying NVDA at the end of July and I continued buying through this week…now a 6% position and I may take it as high as 10%. So yes, I agree that to me the risks to the business growth for the next few quarters seem low.



There was a pretty good Motley Fool article today that discusses some of the important points for Nvidia’s future.

  1. How big is the AI data center market?

“While Bloomberg Intelligence believes the market will grow at a 20% average growth rate, reaching $133 billion by 2032, others believe the growth will be much faster and bigger. Recently, the CEO of Foxconn parent Hon Hai Precision Industry (HNHPF) claimed the AI server market would grow to $150 billion by 2027, up from $30 billion this year.”

Jensen Huang told The Wall Street Journal after earnings, “We’re not shipping close to demand.”

  2. How long can Nvidia maintain its moat?

AMD may be able to gain market share with its MI300 accelerator. Also, “all of the major cloud providers are currently designing their own AI-focused accelerators, with each cloud giant claiming their chip has lower computing costs than Nvidia’s.”

" . . . it’s also possible Nvidia can maintain its moat for quite a while. Nvidia has been developing its CUDA programming platform for nearly 20 years, which allows developers to program graphics chips for the purposes of accelerated computing. That extensive software tied to its hardware will make it difficult for others to nudge their way in, because once a critical mass of developers learns a programming language, it’s much easier for enterprises to adopt that platform rather than retrain their developers and engineers on a new system. Of note, AMD just announced an AI software acquisition yesterday, as it looks to quickly make headway on its own software efforts. Meanwhile, Nvidia’s chips have an advantage over cloud giants’ accelerators, as they can be used across different cloud platforms."

Full Article:

Why Nvidia Retreated on Friday | The Motley Fool


Great post, Bear. And while Nvidia may eventually have a TAM problem, I see the situation as way different than what Zoom was/is. Zoom just expanded super quickly into their TAM, as ramping software (even with server-side requirements) is way easier than hardware. But in Zoom’s case there were already strong incumbents in the web meeting space, including Cisco and Microsoft, both of whom have bundling power and whose products didn’t suck.

For Nvidia, however, there are practically no incumbents other than previous sales of Nvidia chips. And the TAM is greatly expanding as the use cases for AI are greatly expanding. While the number of people on video conferencing peaked during Covid, we don’t yet know how far into things AI will go. For instance, it’s not hard to see AI going into CAN bus readers for automotive debugging, then extending into home appliances that could even be connected, so that when the repairperson comes to your home, they know exactly what’s wrong and what needs to be done. Phones have already shown that built-in intelligence beats megapixels and even optics in many cases, producing better photos than professional gear without that intelligence.

I actually feel a bit silly writing some potential use cases for AI, because anything I write today will be looked at in the future as just some surface scratching. Legacy datacenters are going to be replaced with AI datacenters as well as new AI datacenters that will continue to be created. And I feel that as fast as things are moving today, as the software around AI gets better and is applied to more use cases, the need for more AI hardware can only increase for the foreseeable future. Heck, Tesla has developed its own “AI chip” and yet they’re buying as many Nvidia GPUs as they can get.

A few weeks before earnings, Nvidia announced the GH200, whose performance is 3X-3.5X the existing GH100 chip that they can’t make enough of. The GH200 won’t be available in systems until Q2 next year.

With Nvidia what we have is the dominant player in an expanding TAM that continues to innovate, and even has a moat with its CUDA software. AI isn’t some fad but is a sea change in how previously “solved” problems will be better solved in the future.


Nvidia is also active on the partnerships front. The CEO, Jensen Huang, was on the stage with VMware’s CEO at VMware Explore last week. They announced VMware AI Foundation, with Nvidia being the compute foundation for VMware multi-cloud (VMware is at every datacenter). Apparently both companies were working on this for a couple of years, it was a big announcement.


People on this board give a lot of weight to whether a company meets or beats its numbers for the quarter, so it should give a lot of comfort to know that Nvidia has excellent visibility, simply because their chip and system shipments are so heavily oversubscribed that they are on allocation.

Allocation in the semiconductor industry means that every buyer who has clout puts in their order for how many units they want, and then the manufacturer tells them what portion of their order will be filled. As in, “You can’t have as many as that, but here’s what we can manage to give you.”

And if you’re a buyer who has no clout, you’re going to hope to be able to buy on the grey market. “Psst, wanna buy some GH100s? I’ve got a couple in the back of my truck for the right price…”

Of course, if every unit you build is allocated and you can see how much your best customers want above and beyond that, you can get a pretty good sense of how strong your market demand is at this instant. Naturally, it’s possible that buyers will order more than they need, due to the scarcity, but management has seen this movie before so they know how to account for that, especially at this stage of the demand curve.

It will get trickier at some point, as they get closer to actually satisfying the real demand – but we’re not there yet.



Just to put an exclamation point on Nvidia:

  • In the Q1 call, they reported quarterly revenue of $7.19B, which, while up 19% from the previous (Q4) quarter, was down 13% from last year’s Q1.

  • And so analysts were expecting about the same $7.2B revenue for Q2, but the guidance given then was for $11B. That’s over 50% higher!

  • Turned out that humongously raised guidance was sandbagged, as Q2’s revenue actually came in at $13.5B! That’s 23% higher than guidance and 88% higher than previous expectations!

  • Guidance for Q3 is now at $16B. Is that also sandbagged? As posted above, management knows their order book and production capabilities. Demand continues to outstrip supply, but supply is still growing.

  • Nvidia isn’t sitting still on the product development side, either, also just announcing the GH200 chip, which will ship next year. And both Amazon and Microsoft just opened up cloud instances of Nvidia H100 servers. So now companies can try out or ramp up AI work on Nvidia without having to buy hardware themselves. From VMware to HuggingFace to Snowflake to Accenture to ServiceNow, Nvidia has partnerships everywhere. And yeah, CUDA remains a moat even if an AMD could come out with a competing chip that somehow was faster than Nvidia’s new announcements.

  • And all that’s before considering Gaming and Visualization (up 22% and 24% YoY). Interestingly, the previously high expectations on the automotive segment aren’t panning out, probably because autonomous vehicles aren’t hitting the market (Tesla uses its own AI chips in its vehicles).

Nvidia has gone from gaming (growing, but not great) to crypto (boom then bust) and now to AI. I believe AI has legs, and no one does AI without Nvidia in the loop.