Why 200-300% growth isn't always a good thing

One of (IMHO) @SaulR80683’s most prescient moments was predicting with remarkable accuracy when Zoom was about to run into a growth slowdown buzzsaw. When did he do it? December 2020. Right after they reported a quarter where revenue was up 367% YoY.

At the end of Nov 2020 Saul had a ~22% Zoom position. He reduced it to 3.6% in December while it was still well over $300/share. Other than brief spikes in Feb and Jun 2021, it never saw $400/share again, and now sits at about $63.

I see a lot of posts insinuating that Nvidia is a no-brainer and that some are ignoring its 200%+ growth. I disagree. No one is ignoring it! We’re all watching very closely. It’s not obvious to me that Nvidia is at the point Zoom was after Q3 results came out back in late 2020. But it’s not obvious to me that we aren’t, either.

I’d love responses, but I suggest we focus on the facts we know. How much are Nvidia’s major customers spending per year? Have they said anything about plans to sustain that spend for multiple years? Increase it? Are those plans contingent on the customers’ AI projects and how much revenue they see from them?

We can all speculate, but maybe it’s worth trying to get our collective heads around some numbers?



In order for Nvidia to keep growing like this, they will need to keep innovating so that customers keep upgrading. They seem to be doing that right now; in fact, they are on a twice-per-year release cycle, which is impressive and something their competition can't match yet. However, I think it's inevitable that at some point their chips get better but at smaller margins, customers will upgrade less often, and growth will slow. I don't know when that will happen, but I think it's something to watch for. Apple is dealing with the same issue with the iPhone.


Saul gave a reasonable rationale for his Zoom sell: they had conquered nearly the entire enterprise market for video calling and their TAM was limited. Additionally, he mentioned both back then and recently that sequential growth had slowed significantly.

Where are you seeing similar evidence of Nvidia slowing or what is the rationale for a slow down? There is still strong sequential growth in their results, and if anything it sounds like their TAM is expanding.

At their last several conferences they have listed over 100 significant partners who are also demoing incredible use cases for Nvidia's technology. I'm not really looking to play a guessing game on when the market for AI will slow; I'd like to wait until there is something wrong with the results before selling. Jensen says he sees at least a four-to-five-year build-out of accelerated computing, and he's probably the best person to make a prediction like this given all the contacts he has.

I’d love responses, but I suggest we focus on the facts we know. How much are Nvidia’s major customers spending per year? Have they said anything about plans to sustain that spend for multiple years? Increase it? Are those plans contingent on the customers’ AI projects and how much revenue they see from them?

I’ve pointed out multiple times that Meta is one of the customers increasing spending, and they have put a $100B figure on it for the next several years. Their last conference call was almost entirely focused on AI and the build-out. They are seeing tremendous ROI from increased ad spend and reduced content-moderation costs. They are in the business of building competing models, having released Llama 3, now training Llama 4, and planning for Llama 5. There are tons of other companies getting a huge ROI from their AI build-out as well; Meta is just one example.

We can all speculate, but maybe it’s worth trying to get our collective heads around some numbers?

Why no mention that net income grew 650% last quarter? The last four quarters have seen net income grow 853% → 1274% → 761% → 650%. This seems like the most important metric to look at: what the company is making for its shareholders.

Update: Zoom was also growing its bottom line, after taking another look. The quarter in question saw net income go from $2M → $199M, technically an 8,697% increase. I would like to point out that the scale of these companies is completely different: Nvidia's last quarter saw net income go from $2.04B → $14.88B year over year. I'm still trying to understand how a mid-cap SaaS benefiting from Covid/remote work is the same situation as a megacap semiconductor company rebuilding data centers. This doesn't seem like an apples-to-apples comparison to begin with.


That is completely the opposite of the truth. Zoom’s net income growth was actually more impressive than Nvidia’s. Starting in the quarter ending Jul 2020, Zoom grew net income 3,264% → 8,891% → 1,724% → 741% → 70.6%.
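For anyone double-checking these figures, the YoY percentage is just (current / prior - 1) * 100, and a near-zero prior-year base is what makes the percentages look astronomical. A minimal sketch with purely illustrative numbers (not either company's actual reported figures):

```python
def yoy_growth(prior: float, current: float) -> float:
    """Year-over-year percentage change between two quarterly figures."""
    return (current / prior - 1) * 100

# An ordinary year: 100 -> 150 is 50% growth.
print(yoy_growth(100.0, 150.0))  # 50.0

# The base effect: the same +50 absolute gain off a base of 1
# shows up as a 5000% increase.
print(yoy_growth(1.0, 51.0))     # 5000.0
```

This base effect is why four-digit percentages off near-break-even quarters and triple-digit percentages off billions in net income aren't directly comparable; both the base and the dollar scale matter.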

It’s likely more helpful to look at sequential increases. But I respectfully suggest you’re probably looking at the wrong thing, anyway. What might actually be a leading indicator?



The bottom line as I see it, which is also what I brought up last night in another thread, is the risk-reward profile of NVDA from here relative to that of its known major customers.

Investing now in NVDA's main customers like TSLA, META, AMZN, GOOG, and MSFT is investing in those businesses' optionality: investing in something in which there is "big" risk because it is not there yet. But unlike the many small software and biomed companies trying to revolutionize one area or another through AI, these massive NVDA customers are also the world's top owners of data, human capital, and financial capital. I would throw AAPL in the mix, for they are surely not sitting idle. The risk of investing in these companies is minimal to begin with. And then there is reward potential similar to what AWS was for AMZN shareholders.

Investing now in NVDA itself is investing in a booming and well-known revenue stream to which everyone and their cats already have exposure, including myself and my kids. As NVDA's own history has shown on 2 occasions already, these things always resolve through a massive drawdown, even if the company remains world-class and its stock roars again years later.

The exception would be quickly found optionality, something for which we are not yet paying when exposed to NVDA. Can a specific chip like Jetson offer enough? Can a platform like Isaac offer enough? Can NVDA reasonably be a robot-builder in addition to a robot-enabler?

Zoom Phone was supposed to be the next big leg of Zoom expansion. Saul wondered how exactly that would move the needle at the 2020 scale of Zoom revenue. That just did not work out. In my case, while my employer is a massive Zoom customer, we went Cisco for phones and then dumped phones for most employees altogether.

At the end of the day, the rate of growth of a specific revenue stream has to slow, and that's all it takes to question the investment's intermediate-term prospects, absent new revenue streams massive enough to move the needle for a $3T company.

In short, what may NVDA have up its sleeve that is not already accounted for? That’s what I need to know to potentially like NVDA here better than its Mag 7 peers.


Fair enough, it's been a while since I looked at Zoom's prior financials, and it did have those huge percentage increases in the bottom line. I would caveat that the year-over-year comparisons were against quarters that were near break-even, so the percentage increases look huge.

Bear, respectfully, I've seen you mention Supermicro as a gamble, not a serious company, and speculative, and then say in my April write-up thread that you could not understand how it could be a top-confidence position. You mentioned that you looked at 2026 analyst estimates and then said it looked like it could be overvalued. When the P/E of this company was at 10, you were calling it a gamble, but it's been roughly a 7-bagger in a year.

My approach is to follow the numbers as Saul does, and both Nvidia and Supermicro have proven they are huge growers and it’s why their stocks have rallied so much.

I'm not finding much value in trying to forecast valuations into 2026. When I recently went back and looked at that thread someone linked, of people saying how undervalued SaaS was in January 2022, many were giving forecasts about what the cash flows of the SaaS companies they owned would look like in 2029. The only issue is that you cannot predict technological disruption like what has happened in 2023 and 2024 with AI.


Here are last year’s numbers:

Of course, this is out of date by half a year.

And then we have this “topping indication,” which is literally the CEO signing a woman’s top:

Not numbers, but not a good look.

Back to the first slide, we see Meta at 150K H100s. Zuckerberg said he expects to have 350K H100s by the end of this year. That's 200K more purchased this year, roughly 33% growth over the 150K bought previously.

AMD’s CEO, Lisa Su, has put a $400B TAM on AI in 2027 - just 3 years from now.

CEO Huang said in the last ER call that the build-out of AI data centers is only about 5% complete. That leaves a lot of room for growth.

All these indications are far different from Zoom hitting its TAM of Covid-locked-down people using its product.

And then when we look at how AI works, we see that today training and re-training are required for most development iterations. That is, developers tweak the code and configuration, then have to run all the data through the new system to get a result and test it to see if (and by how much) it is better. The longer this training cycle takes, the slower the development. Elon Musk has attributed Tesla's recent advances in autonomy to its build-out of AI training compute enabling shorter cycles. I suspect OpenAI feels the same training-cycle delay pain on ChatGPT.

As for the discussion of small versus large companies, this is where AIaaS comes into play: instead of installing your own on-premise AI servers, use MS Azure or Amazon's AWS AI compute. So the big cloud names will keep on buying to be able to handle the demand.

As I've said before, I expect the YoY percentages to decline rapidly starting this next quarter, as Nvidia laps the response to the "ChatGPT moment." But growth is still there and will still be double digits QoQ.

On another note, the US government is looking into antitrust for Nvidia, among other companies. What's funny is that the real issue is that no one except Nvidia recognized what was coming for AI soon enough.


It hasn't been much discussed here, possibly because it rises to the level of state secret, but more than a few countries have indicated that they consider AI-capable data centers a matter of national security. The resources of Japan, or India, or France, or Germany, etc. far outstrip any company, even the largest cloud providers. I suspect the data center build-out is going to last longer than we think. Also, data centers require constant refreshing, and by all accounts Nvidia has a 3-5 year lead (possibly growing) in tech, which turns those data center refreshes into a form of recurring revenue stream. At least that's how it looks to this amateur.


I would say you are most likely right, but not because of demand going down; rather, because the production of the chips will flatten out. Unlike Zoom, NVDA could sell everything they make for years and still not catch up to demand. Every new chip they create becomes another must-have chip, and the companies buying them can't stop because they would fall behind.

They thought that training would slow down because nobody was going to go beyond an LLM training set, since the cost-benefit was so low, but now the training models are getting better precisely because of the competition. How do you measure all of this? It's impossible; there are too many variables. The only thing we can do is watch for when the large cloud companies start cutting their capex and NVDA's revenue starts coming down. It's going to be very hard to time.



Wow, a ton here to unpick, but in the spirit of crowdsourcing ideas around numbers to focus on, here are my thoughts on the points shared:

1) Zoom and its relevance as an Analog (plus CISCO)
IIRC the canary in the coal mine for Saul was the sequential growth story more than the TAM story. The problem with video calling was that we didn't know what the TAM was. What made it completely irrelevant in the end was that the eventual market leader was Microsoft Teams, which effectively stole the market without touching the TAM: it turned out Teams was bundled into all enterprise Office 365 all along (something Microsoft may now have to unbundle due to regulators). Back then, calculations of TAM were built mostly around Webex, BlueJeans, and all the other video calling players.

On the other hand, we do know pretty clearly what the current CPU/GPU TAM is worth; we just don't know what it could be worth. But we should be able to see the growth in orders being announced, and we will definitely see the sequential growth as it comes through (or not). In this respect I see the Nvidia situation a little more like Cisco building the internet in 2000: we didn't know how many routers it would take until, bang, it was complete and no more orders came through.

2) Order Numbers
Either AMD and Nvidia or their end customers do announce orders, so that might be worth tracking (the growth in orders specifically). That should be an indicator. What we will have to pay very close attention to, as suggested in a couple of points in this thread, is to what degree the linearity of orders continues:

  • As compute processing demand moves from training to inference
  • As processor upgrade releases happen

3) Fabrication, Manufacturing and Supply Chain Capacity
One other source of numbers to watch (it isn't perfect, otherwise the semiconductor business wouldn't be cyclical) is to what degree orders are being pre-booked, and for how long, with the chip manufacturers (TSMC, GlobalFoundries, etc.), and what capacity-driven increases in capex are being made to satisfy demand.

Right now I believe TSMC is booked out 2 years in advance, which is something and has to be worth monitoring.

4) End Use Case Demand
We should keep a look out for high use case applications of AI compute. Clearly FSD is one, Metaverse Augmented Reality type situations another, Financial Services and Supply Chain Management are also high compute opportunities as well as Healthcare (BioPharma drug discovery and research as well as patient data analytics). Potentially cybersecurity could be another.

Tesla has helped quantify their compute capacity required for global FSD requirements. We should look out for other quantifications as they become clear.

5) Sovereign Data Centers
Ok this one is interesting. This is a market inefficiency that is distorting (and elevating) the TAM. I don't know how much inefficiency is being added by the country-by-country build-out of data centers on a sovereign basis, but it has to be inefficient vs. building total capacity out of simply a few centers (US and China) or the lowest-cost-to-operate locations. This is being driven by 2 factors: i) data sovereignty policy and regulations where data is not allowed to cross borders (China is the leader in this one, but many countries are catching up), and ii) national industrial policy incentivising inward foreign direct investment. If either of these factors relaxes, I would think that would hit the opportunity. In addition, once we have exhausted all countries with data sovereignty regulations and industrial policy incentives, that's the end of that TAM-inflation driver.

6) Competition and Profitability
Lastly, and this is less a top-line TAM issue than a competitive-forces and structural-profitability issue: to what degree i) competition comes in, whether a shift to ARM processors or other options good enough on cost/performance to enter the market, and ii) saturation in manufacturing and supply can handle the demand at lower and lower price points and profit margins (e.g. what AMD and ARM have done to Intel on the CPU side).

In short I don’t have data sources/numbers to share but some thoughts on what could be meaningful indicators.

BTW - I’m not from the semicon industry and there are folks on this board who have spent whole careers in the chip industry so this is all IMHO.



Huang keeps emphasizing that when talking about Nvidia we should be talking about HPC, not just AI, and that AI is just a SINGLE application of HPC.

HPC applications include:

  1. AI
  2. graphics rendering
  3. blockchain
  4. simulations of complex systems (weather, power grid, social networks, molecule interactions etc.)
  5. validating quantum algorithms
  6. cryptography
  7. TBD

#5 is interesting because advances in HPC keep pushing the definition of “quantum superiority” further and further off into the distance; the better HPC gets, the better quantum has to be before it’s “superior”.

Remember that blockchain was an UNANTICIPATED needle-moving category of HPC that added a huge revenue stream for $NVDA just as the demand for graphics processing had leveled out. And that ChatGPT was an UNANTICIPATED needle-moving sub-category of AI that also led to new applications such as improved marketing algorithms (…e.g. for $META and $APP) and improved automation of social media policy enforcement.

So possibilities for future unanticipated needle-movers for $NVDA include #7, above and new subcategories of #1…6 (…and/or new breakout successes in #1…6).

Also: regarding HPC, we will see a dynamic of “there is no going back.”
For instance, bad actors will find ways to thwart AI-driven social media monitoring. The solution? MORE AI. $META will be buying GPUs for the rest of its existence, and for more-and-more reasons.


Hi Bear, I have to agree with the others that comparing Zoom with Nvidia in this case is not at all an apples-to-apples comparison.

Zoom was selling a simple consumer product, and they were selling consumer seats. Everyone who wanted it went out and bought a subscription at once because they were stuck at home because of Covid. After three quarters there was no one new left to buy it. And there wasn't any next-generation product that you had to upgrade to or you'd be left behind.

Note that this wasn’t as bad as Peloton, whose stock price has gone from $118 to $3 something, because people kept their Zoom subscriptions even after Covid, but they went outside to exercise instead of exercising at home.

While Zoom was selling a simple consumer product, seat by seat, with no next-generation product you had to upgrade to or be left behind, Nvidia is selling the future of the internet and computing. Instead of selling to the public, they are selling to the most advanced tech companies, who really feel the need to buy what Nvidia is selling, and Nvidia ships a new, improved version every six months. And Nvidia can't make chips fast enough to keep up with demand. Long waiting lists. No real comparison with Zoom.

Granted, Nvidia's YoY growth will slow starting next quarter when they lap the first quarter of massive growth, but as far as anyone knows, they will keep growing fairly rapidly for the foreseeable future. How much, and until when, remains to be seen.

For full disclosure, as Nvidia keeps going up faster than anything else, or while others are going down, it has now grown into a 12% position for me.



From Mizzmonika’s rather brilliant post on the Nvidia thread:


"I think what you are missing is several points:

"1. We are at the BEGINNING of the AI revolution. Year 1.5. Whatever you worry about happening, is still far away. Everything we hear is that these chips are consumed as fast as they come out of the factory, and the early ones will be obsolete in a few years.

"2. Huang is WICKED SMART. Like IMHO light years ahead of every other CEO on the planet. He saw this coming 10 years ago. He sees the need not to be JUST in chips. CUDA is not just to keep people locked into his chips. His little “inference modules” or whatever he calls them - snippets that enable you to do AI functionality that you can embed into your offering - is brilliant. Of course open source may compete but he will always be one step ahead. He is already in markets yet to rise. He is that kind of CEO…

…Bottom line: I count among my biggest mistakes not investing in ZOOM when it was obvious it was rising because I saw past its adoption and then thought, what’s next? Where are the adjacencies? I was right long term but I left tons and tons of money off the table short term.

"To worry about what will happen N years out and leave this massive shift off the table, well, that’s your business. But don’t rule out Huang continuing to skate well ahead of the puck for several years to come, and that’s all I need to stay in for now."


I agree, Zoom is different for a number of reasons, mainly because the supply was readily available to meet the demand, amazingly, even though the demand went parabolic. (Although the question of "when" is similar, and as @mizzmonika says, she was eventually right on Zoom. I will eventually be right on Nvidia…but it might double again before I am!)

I wish I had more data on Cisco as @anthonyms mentions. I do think that’s a better analog. My common sense thought is that when you become the 1st or 2nd biggest company in the world, demand must be pretty close to peaking, because how much more can all the other companies afford to spend on your products? But that’s just my gut.

It would be great to know more about TSM's ability to provide Nvidia with more chips, but I think it's probably more useful to continue to focus on the demand side. There's an article on the Fool's own site (admittedly by an Nvidia bear, AFAICT) that suggests demand might peak this year. Here's a snippet:

The problem for the infrastructure backbone of the AI movement is that its four largest customers by revenue are actively developing AI chips of their own to complement the Nvidia GPUs they’ve been purchasing for their high-compute data centers…

The other issue is that even if Microsoft, Meta, Amazon, and Alphabet continue to purchase Nvidia’s GPUs, we’re more than likely witnessing a peak in orders in 2024.

For example, Meta spent almost $27.3 billion on property and equipment last year. The 350,000 H100 GPUs the company is purchasing from Nvidia will come at a cost of up to $10.5 billion. That’s a pretty sizable percentage of Meta’s annual capital expenditures coming from a single purchase, which almost certainly isn’t going to be duplicated in future years. CEO Mark Zuckerberg has noted that his company will have “around 600,000 H100 equivalents of compute if you include other GPUs” by the end of 2024. Presumably, Meta’s chief is intimating the use of Meta’s in-house-developed AI chips along with Nvidia’s H100 GPUs.

The point being that Nvidia’s top customers are either moving away from its GPU technology, or are highly unlikely to sustain their existing order activity beyond the current year.

I have no idea if those numbers have changed in recent weeks, but they definitely seem worth watching.

I guess that’s what I’ve tried to do. I have noticed that not only revenue growth rate, but revenue growth in actual dollars, has actually been declining sequentially.

I’m not sure where all this leaves us…I just find it interesting to ask if sequential double digit revenue growth going forward will be enough. It wasn’t for Zoom, and it only stayed in double digits for a couple quarters after it started declining, but as I’ve said, I concede that Zoom isn’t the best analog.



Bear, I think you might want to check your numbers again. Those revenue numbers appear to be last year's numbers, maybe? Because they are nowhere close to where we are now. Revenue and income are both climbing sequentially, although the rate of revenue growth is starting to come down a hair.




As I said in the tweet, these are the sequential increases.

Your chart shows the YoY change…let’s examine it sequentially.

Q2 f2024: +$6.3b was +88% sequentially!
Q3 f2024: $4.6b was +34% seq.
Q4 f2024: $4.0b was +22% seq.
Q1 f2025: $3.9b was +18% seq.

If the trend continues and the increase is less than $3.9b next quarter, the sequential increase will be under 15%. Still good, but good enough? We'll see.
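Those sequential figures can be recomputed from Nvidia's reported total revenue. A quick sketch using the reported quarterly revenue (in $B, rounded to two decimals) for fiscal Q1 2024 through Q1 2025:

```python
# Nvidia total revenue by fiscal quarter, $B, Q1 FY2024 .. Q1 FY2025,
# per the company's quarterly reports (rounded to two decimals).
revenue = [7.19, 13.51, 18.12, 22.10, 26.04]
quarters = ["Q2 f2024", "Q3 f2024", "Q4 f2024", "Q1 f2025"]

rows = []
for q, prev, curr in zip(quarters, revenue, revenue[1:]):
    delta = curr - prev               # sequential increase, $B
    pct = (curr / prev - 1) * 100     # sequential growth rate, %
    rows.append((q, round(delta, 1), round(pct)))
    print(f"{q}: +${delta:.1f}b was +{pct:.0f}% seq.")
```

The printed lines reproduce the +$6.3b/+88% through +$3.9b/+18% sequence: the absolute dollar increase has been shrinking every quarter even as the totals climb.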



What happens though when a combination of GPU-driven cost savings and GPU-driven profit flips math like “percentage of annual capital expenditures” on its head?

Huang argues that GPUs can deliver 100x compute at the expense of only 3x power costs; he calls this “CEO math” and says (somewhat tongue-in-cheek) “The more you buy, the more you save.”

And as $APP CEO Adam Foroughi has pointed out, if a company is getting two dollars in profit for every one dollar of marketing spend because of AI, there is NO LIMIT to how much will be spent on the AI (…until you have run out of customers, I imagine).

Like $ZM, $NVDA has previously “run out of customers”: sales of PC graphics cards and sales of chips used for bitcoin mining eventually leveled out. When will they run out of customers for LLM-driven applications? Who can say.

Per my list above, we may see future cycles of demand driven by new categories, new sub-categories, and new applications of HPC. But at what point in the future will ALL requirements for ALL HPC be satisfied? My guess: not this year, the next year or the one after that.

And how many companies correctly predicted the importance of building out an entire HPC ecosystem (networking, software etc. not just hardware) and have delivered the entire ecosystem?

Only one.


A number of people have attempted to use Cisco as the predictor for Nvidia's eventual decline, but as I pointed out in the Nvidia thread, there are significant differences between them. Cisco CEO John Chambers kept his company in its lane and didn't branch out to new things like SDN (Software Defined Networking), which led to engineers leaving to form Arista, as one example. OTOH, Nvidia CEO Jensen Huang is already pushing software and cloud initiatives, including everything from chip rentals on the cloud to AI application support. It seems unlikely to me that Nvidia is going to get stuck in the picks-and-shovels business like Cisco did.

And even if that were the case, let’s look at the timing of Nvidia’s eventual demise. Even if Nvidia is a one-trick shovel, how far along is the world-wide build out for AI data centers? As I said up-thread, Huang thinks that number is about 5% of the eventual build-out. That implies years of growth still to come.

I love reading bear cases like this, because they're so short-sighted. To think that the current big GPU purchasers are going to remain the only big purchasers seems very unimaginative to me. Huang has been talking about "sovereign" use cases: countries setting up their own AI data centers for national security and other purposes. And I suspect we don't yet know all the companies ramping up into AI in a big way. OpenAI and Tesla somehow didn't make that list.

And the Tesla story really puts the "develop their own AI chips" bear case in perspective. Tesla hired Jim Keller and put a ton of resources into developing its own chip (named Dojo). Despite this, even after several years, Tesla is still buying Nvidia chips hand over fist (as is Musk's other company, xAI). The reasons are complicated, but the bottom line is twofold:

  1. It’s really hard to design a better chip than Nvidia, especially when Nvidia is itself shipping better chips every year.
  2. The chips Tesla, Google, et al. are working on are being designed for their specific use cases, whereas Nvidia is designing for general-purpose AI usage. It's certainly still possible that a future version of Tesla's Dojo will surpass a current Nvidia chip for Tesla's specific autonomous-driving use case, with its large visual data sets and configuration matrices, but Dojo isn't and won't be good for the LLMs that OpenAI and other companies are developing.

Sean Williams (who wrote the bearish article Bear linked), penned another bearish article back in March:

Those 10 reasons are all nonsense. I won't waste time here outlining the flaws in each one: the usual competition (which is so lame the US government is looking into Nvidia's effective monopoly), self-designed chips (see above), early winners always lose (but not just yet), and then valuation, which looks at trailing P/E instead of forward P/E. As we on this board know well, trailing P/E on growth companies is always high because you're trying to associate future prices with past earnings instead of future prices with future earnings.

Not surprisingly, NVDA is up 38% since that article was published, lol.

That said, I don’t expect to be holding as much Nvidia 5 years from now. The law of large numbers makes growing at its current rate much harder. Microsoft, the largest US public company today, hit that problem before its successful migration to cloud and now AI itself.

Yet anyone (like myself) who said not to invest in Microsoft because everyone who would use Windows was using Windows, and everyone who would use Office 365 was using Office 365, missed out on the company's ability to compete with Amazon in the cloud (MS Azure, thanks to its cloud/desktop integration), and now on its ability to bring AI features into all its products. Sure, they had missteps along the way (media players, mobile phones), but overall I have to admit they did a more than decent job of expanding their TAM with new technologies. Does anyone here think Jensen Huang isn't smart enough to do that for Nvidia, maybe even better than Microsoft?


Ok, thanks, I see what you are trying to say. I still think the limiting factor is the foundries. With NVDA competing with all the other fabless chip makers, and given how long it takes to build out more factories, NVDA will hit the limit of how many chips it can produce long before it runs out of customers.



I’ve been following NVDA for a long time. Owned it many years ago but only reentered last August when I saw the data center revenue take off. Note, I bought my position after the stock had run from a low of $108 in October 2022 to more than 4x that. It can be psychologically difficult to buy a stock after a 300% gain. But I looked at what was happening with the numbers and thought about what was going to happen with the business going forward. So now we’re up more than 10x above that October 2022 low. But that low is completely irrelevant as is that price where the stock was a few weeks ago. So now we’re at around $1200 ($120 after the split), and the question remains how is the business going to do. I still have NVDA as my largest allocation position, but I’ve trimmed off about 30% of the shares that I had bought less than a year ago. I will probably continue to trim more shares on the way up, but those trimming decisions have little to do with where I think this business and the share price will go from here. I think it’s going to go up. Call me crazy, but I think the shares could double two more times, maybe three times in the next 5-6 years.

First, people are predominantly in this stock for the data center business that's fueled by growing demand for AI and high performance computing. The data center business is the part of the business that's growing like gangbusters, so why not look at growth of the data center business separately rather than lumping in the other slower-growing segments? This, I think, gives us a better view of where future growth might trend. The data center revenue, with fiscal quarter, data center revenue, sequential growth (DC rev), and YoY growth (DC rev), is shown in the table below:

Quarter    DC Rev ($M)   Seq      YoY
Q1 2023      3750
Q2 2023      3806      1.5%
Q3 2023      3833      0.7%
Q4 2023      3616     -5.7%
Q1 2024      4284     18.5%    14.2%
Q2 2024     10323    141.0%   171.2%
Q3 2024     14514     40.6%   278.7%
Q4 2024     18404     26.8%   409.0%
Q1 2025     22563     22.6%   426.7%
Q2 2025E    27000     19.7%   161.6%
Q3 2025E    31500     16.7%   117.0%
Q4 2025E    36000     14.3%    95.6%
Q1 2026E    40500     12.5%    79.5%
Q2 2026E    45000     11.1%    66.7%
Q3 2026E    49500     10.0%    57.1%
Q4 2026E    54000      9.1%    50.0%

Just putting in estimates for data center revenue, if it grows sequentially by $4.5B every quarter through Q4 FY2026 (which ends in January 2026, seven quarters from now), shows YoY growth maintained at a high rate through this fiscal year and the following one. I think there is upside to these numbers if supply can be increased by more. I'm not worried about demand not being there, for several reasons:

  1. We have information that all the hyperscalers and others are increasing capex spend on NVDA. TSLA has indicated $3-4B this year. xAI has indicated $3B this year and $10B in 2025. Listen to the earnings calls of META, MSFT, GOOGL, and AMZN.

  2. If data center buildout is only 5% complete as Jensen has indicated, then after three more doublings we’re only at 40% complete. If NVDA maintains 80%+ market share then a 40% buildout is only 50% of NVDA’s potential. And data center demand won’t stand still but will keep growing so the TAM will exceed 100% of today’s number.

  3. There is strong evidence that demand exceeds supply. The H200 will be available in a couple of months, and the B100 in about 7 months. So why are H100s still selling out when much better chips will be available so soon? I think it's because customers need compute and they need it yesterday (see #4 below for why); so they say "give me all the H100s you can, and I'll also take the H200s and the B100s." I think this will continue through 2027 with the B200s, R100s, and R200s, all of which are already in either full production (H200 and B100) or full development (B200, R100, and R200), per Jensen's keynote at Computex last week.

  4. Why are customers so desperate to get all those GPUs? "The more you buy, the more you save" is true. Also, the hyperscalers are investing $1 today (for GPUs) and will get a $4-5 return on those GPUs over the next several years. The ROI is there. But I think the most important reason for this arms race, or mad scramble, is the following: the first company or country to get to general artificial intelligence will have an advantage over others such as the world hasn't seen before. For companies, the revenue-generating opportunities related to AI will dwarf any advantage the world of business has ever seen. Just compare humans against the next most intelligent creature (I think it's the orangutan): humans have accomplished many billions of times more than orangutans. AI will be to humans as humans are to orangutans. Yes, I think AI is a way bigger deal than electricity, the transistor, the internet, and mobile. But each of those previous innovations is a prerequisite (maybe not mobile so much) to making AI possible and useful. I think AI is happening faster than anything before, and since people have a hard time thinking in exponential scales, it's hard for most to imagine how big AI will get and how fast that can happen.
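The estimate rows in the table above (a flat +$4.5B sequential increase off a base rounded to $22.5B) are easy to reproduce. A minimal sketch using the reported data center revenue figures already listed:

```python
# Reported data center revenue, $M, for fiscal Q2 2024 through Q1 2025.
reported = [10323, 14514, 18404, 22563]

# Extend by a flat +$4,500M per quarter from a base rounded to $22,500M,
# as in the estimate rows of the table above.
projected = [22500 + 4500 * (i + 1) for i in range(7)]

series = reported + projected
results = []
for i, rev in enumerate(projected):
    idx = len(reported) + i
    seq = (rev / series[idx - 1] - 1) * 100   # vs. prior quarter
    yoy = (rev / series[idx - 4] - 1) * 100   # vs. same quarter a year ago
    results.append((rev, round(seq, 1), round(yoy, 1)))
    print(f"{rev:>6}  seq +{seq:.1f}%  yoy +{yoy:.1f}%")
```

The computed sequential and YoY percentages match the estimate rows in the table (e.g. 19.7%/161.6% for Q2 2025E and 9.1%/50.0% for Q4 2026E), which is the point: even a linear dollar increase keeps YoY growth high for quite a while.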

So, NVDA growth is slowing, as Q2 FY2024 was the inflection point and that quarter is now being lapped (next NVDA earnings report). But even if NVDA's data center business grows only linearly on a sequential basis (as shown in the table above), we still get amazing growth this fiscal year and next. Am I selling my position? No way, José!