Anticipating a reversal in AI/semiconductors

Yesterday, NVIDIA closed up 24% (and was up almost 30% at one point during the day).

AMD was up 11.2%

TSMC was up 12%

ASML (the Dutch company that makes the lithography machines used by TSMC etc) was up 6.25%

IMHO, this is nuts. I expect this semiconductor/AI mania-driven jump to reverse with some profit-taking, though whether that takes hours, days or weeks, who knows?

I note banks are rushing to revise their price forecasts for NVIDIA up to levels that are absurdly high relative to earnings, assets etc.

Keep in mind, we are still in ‘the worst market for PC sales since the 90s’.

6 Likes

Nvidia is not your father’s PC company. Here are some data for NVDA (one quarter old, from before the most recent AI mania):

“The chip company serves five primary markets—gaming, data center, professional visualization, automotive, original equipment manufacturer (OEM) and other—and provides a revenue breakdown for each of those markets:
– Gaming revenue, comprising 45% of total revenue, rose 41.8% YOY in the third quarter;
– Data center revenue (41% of total) grew 54.5% YOY;
– Professional visualization revenue (8%) was up 144.5% YOY;
– Automotive revenue (2%) increased 8% YOY;
– OEM and other revenue (3%) expanded 20.6% YOY.”
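
Putting the quoted numbers together (a rough back-of-envelope sketch, using only the segment shares and YoY growth rates above; the shares sum to about 99% because of rounding):

```python
# Blended YoY revenue growth implied by the quoted segment shares and growth rates.
segments = {
    # name: (share of current-quarter revenue, YoY growth)
    "gaming":      (0.45, 0.418),
    "data center": (0.41, 0.545),
    "pro viz":     (0.08, 1.445),
    "automotive":  (0.02, 0.08),
    "oem & other": (0.03, 0.206),
}

# A segment with current share s and YoY growth g contributed s / (1 + g)
# a year ago (measured in units of today's total revenue).
year_ago_total = sum(s / (1 + g) for s, g in segments.values())
current_total = sum(s for s, _ in segments.values())  # ~0.99 due to rounding

print(f"implied blended YoY growth: {current_total / year_ago_total - 1:.1%}")  # ~ +50%
```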

DB2

4 Likes

Thanks, though I’m not entirely sure what point you’re making?

I’ve been following NVIDIA closely for 25 years now, since the days of the TNT2. I’m aware of what they do and how they do it, but I appreciate you sharing the information you have. I’ve even had one of their engineering managers reach out to try to recruit me for one of their AI teams. I am quite ‘deep’ into this stuff.

Since you’re replying to my comment ‘worst market for PC sales’, did you know that NVIDIA have been struggling to shift their glut of 2020 inventory for almost 3 years now?

Or that their latest products (e.g. the 4060 ti) perform worse than the equivalent card from 3 years ago (the 3060 ti), for the same price? (I believe that’s almost unheard of in the history of GPUs.)

How are they going to sell this junk? Only an idiot would buy their current products.

Here’s a particular problem that is well known and a major topic of discussion in the gaming community right now:

Current-generation consoles provide 16GB of memory, shared across the whole system and largely usable as graphics memory, for around $500. And major games are often designed for consoles first, then adapted for PC.

This matters because games sometimes don’t work quite right (or need their quality settings turned well down) if they were designed around 16GB of graphics memory but only 8GB or 12GB is available.

However, NVIDIA doesn’t have a single graphics card on the market for under $1100 with 16GB of RAM. And that’s not even a full system: compared with the $500 all-in-one consoles, that’s just the graphics card! You need another $800 of PC on top of it just to make use of it.

EDIT: and to make things worse, NVIDIA’s ‘awesome new features’ like ray tracing and DLSS, meant to improve image quality or framerate, require extra graphics memory, compounding the ‘not enough graphics memory’ problem. So people can’t actually use those supposedly wonderful features on the current generation of cards with the current generation of games. It’s a joke or a nightmare, depending on whether you just bought a card.

Both of NVIDIA’s competitors (AMD, Intel) do have 16GB+ cards for under $1000 with competitive performance on the non-memory side of things.

Intel has one at $349 (the Arc A770) and is currently cutting prices further to grow their market share. AMD is currently selling a range of 16GB+ cards under $1000 (7900 XT, 6900 XT, 6800 XT).

To the gaming community, NVIDIA’s current offerings are a complete joke! Many are boycotting ‘ngreedia’.

As an enthusiastic gamer with money in my pocket to spend, and an actual use for a high-end GPU (virtual reality for fun, AI research), there is nothing that would make me buy any card NVIDIA offer right now unless I had some specific use case justifying $1200 for the 4080. But for AI stuff I can just rent in the cloud cheaply by the hour anyway, or run it on a CPU if I have to.

Take a look at /r/hardware (a technology subreddit for computer hardware news, reviews and discussion) and count the NVIDIA product reviews with phrases like ‘don’t buy!’ or ‘a waste of sand!’. The company is ‘troubled’, and it’s all down to pushing margins to the absolute limit.

Either the margins collapse or the market share does, and neither will do the company’s accounts any favours in the next year or two. Intel are about to launch their next generation of cards (Battlemage), which will hopefully drive prices down significantly.

As for AI spend, a lot of business people are going to figure out that 10,000 GPUs don’t come cheap and aren’t actually that useful for a lot of things, unless you really need to train a whole new model just for you and can’t fine-tune an existing one. And even then, why not just use the cloud?

14 Likes

I don’t know the details of the company or how many idiots are out there, but sales have more than doubled compared with 2020 (and earlier). Could they not just write off three-year-old inventory if it’s not moving?

As for income, their net income for the last four quarters was $4.4 billion. Pre-pandemic it was $2.8 billion. Something seems to be going right.
https://www.macrotrends.net/stocks/charts/NVDA/nvidia/net-income
DB2

3 Likes

What would the impact of ‘writing off a huge amount of their primary product as unsaleable’ be for a company on a PER well over 200x, do you think?

Adjust for inflation, and for a temporary AI mania this year in a bunch of high-margin products that don’t get bought very often (plus the latter stages of the crypto bubble, which was still boosting sales in Q1/Q2 last year).

The business is cyclical, and even as cyclicals go, that’s not a very impressive % change.

1 Like

I have no reason to question your expertise, but I can’t say that I have seen this boycott in action.

I am also a gaming enthusiast, but I am mostly a tech neophyte, so I visit a lot of sites before I decide to make a purchase or upgrade. My last two systems were custom built to order. I bought a new system last year and used CNET and a few other sites to help in my decision-making process. When I checked their site today, Nvidia is still receiving many of their Editor’s Choice Awards and other recommendations (4 out of 5 awards, with AMD getting 1 out of 5).

7 Likes

The revision of price forecasts by banks or financial institutions can occur due to various factors, including changes in market sentiment, company performance, industry trends, or new information that impacts the outlook for a particular stock. These revised forecasts may sometimes appear optimistic or high relative to certain fundamental metrics like earnings or assets.

2 Likes

You did a great job on the PC end of the business, Luxmain. Now could you give us the data center side of the business? They are guiding for a 50 percent sequential increase in data center revenue.

Andy

4 Likes

There seems to be a mania for spending on GPUs for the datacenter in the current quarter, as every company in the world apparently announces at once that they are now an AI company (to give one example of the hype: Elon Musk announcing he was buying 10,000 GPUs for an AI project).

If I were the CTO at any big corporation, I’d be using this as an opportunity to fill my datacenter with cool widgets and add extra staff while the CEO was in the mood for it.

But that will mean less spending on similar tech in future quarters, for the next 3-5 years.

It’s no different from the burst of GPU spending in 2020, when people bought cards to game on during covid lockdowns: a huge spike in prices, followed by three years of literally not being able to clear out the stock produced for the 30x0 series. (Meanwhile, competitors AMD and Intel were able to move on to newer fab processes that offered better performance.)

If Nvidia gets this wrong they will be left holding a ton of rapidly ageing inventory, again, which will take years to clear, again. (And it will act as an impediment to pushing forward with new lines, giving other companies time to catch up or take the lead.)

Also, I can tell you for sure that any company capable of designing or fabbing a tensor accelerator chip will be doing so (and it’s not hard: these chips implement a limited set of operations and run at fairly tame clock speeds).

And since Apple, Amazon, Google, etc all have their own designed and working tensor chips, I expect they will scale those up and offer very competitive cloud prices.

In fact it could turn out even worse for nvidia on a 2-3 year timeframe than the GPU overbuild of 2020.

Consider: competing in GPUs is hard because you not only have to build the card, you have to build drivers that can support the last 10-15 years’ worth of games and gaming APIs, because people still play those games and will be furious if they don’t work exactly right.

Games are fairly complex beasts, very heterogeneous in implementation and full of performance bugs, so there are a million edge cases you have to cover, as Intel found out in 2022 with their new line. Nvidia had an edge here because their GPU drivers have matured alongside games for the last 20 years or so. That is a very real technology moat that is hard for competitors to catch up with.

Competing in tensor compute is laughably easy in comparison. No one uses old APIs; there’s hardly any code in use much older than a year or so. You have relatively few functions to implement, and you can probably just target the current versions of e.g. pytorch, CUDA and not worry too much about performance regressions on older API versions or platforms. After all, who’ll be using those now?
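
To make ‘relatively few functions to implement’ concrete, here’s a toy sketch of my own (not anything a vendor actually ships): torch.compile will hand any backend you give it the captured graph of tensor ops plus example inputs, and the backend decides how to run them. This one just prints the ops it would have to support and falls back to the stock implementation.

```python
import torch

# Toy "vendor" backend for torch.compile: it receives the captured graph of ops.
# A real accelerator vendor would lower these ops to their own hardware;
# here we just list them and run the graph unmodified.
def toy_vendor_backend(gm, example_inputs):
    ops = sorted({str(n.target) for n in gm.graph.nodes
                  if n.op in ("call_function", "call_module", "call_method")})
    print("ops this backend would need to support:", ops)
    return gm.forward  # no actual acceleration - just hand back the default path

model = torch.nn.Sequential(torch.nn.Linear(16, 16), torch.nn.ReLU())
compiled = torch.compile(model, backend=toy_vendor_backend)
compiled(torch.randn(4, 16))  # first call triggers graph capture + "compilation"
```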

In short, they should have an amazing Q2/Q3, but if we hit recession in Q4 they will be in deep trouble; otherwise I suspect they will probably be in trouble by Q2/Q3 next year. It’s also possible that we see a surprisingly strong economy and no recession, in which case good sales will probably hold up for 1-2 years without too much of an inventory glut. But in that scenario there will be more long-term competition too, particularly from Intel and the cloud operators.

6 Likes

Hi @insurtech
R u a machine?

asking for a friend

9 Likes

So NVDA sells CUDA to other companies, correct? So they should be getting a lot more money for their software. Are they selling this as a subscription, or is it a licensing agreement? Thanks for your help.

Andy

1 Like

No, we don’t, as far as I know. But I believe others have tried to do so independently.

CUDA - Wikipedia.

3 Likes

Sorry, slip of the brain - I meant OpenCL. You wouldn’t literally implement CUDA, since a) NVIDIA own it and b) you don’t need to.

OpenCL - runs on NVIDIA, AMD and Intel (and various others), and on a CPU as well as a GPU.

CUDA - NVIDIA GPU only.

Various custom stuff - Google, Amazon, Apple, AMD etc.

AMD, for example, use their own ‘ROCm’ stack to provide GPU compute, then provide a layer so that e.g. PyTorch runs on ROCm instead. You just pick out the popular applications and put your own tech underneath them instead of CUDA.
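
To illustrate (a minimal sketch, nothing AMD-specific invented here): the ROCm builds of PyTorch expose the GPU through the same ‘cuda’ device API, so the exact same user code runs whether CUDA or ROCm/HIP is actually underneath - which is the whole point of slotting your own tech under the popular framework.

```python
import torch

# Identical code on an NVIDIA (CUDA) build and an AMD (ROCm) build of PyTorch:
# ROCm builds expose the GPU through the same "cuda" device name.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

x = torch.randn(1024, 1024, device=device)
y = x @ x.T  # the matmul runs on whatever backend happens to be underneath
print(y.device)

# Only if you go looking can you tell which vendor stack you're on:
print("ROCm/HIP build" if torch.version.hip else "CUDA build")
```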

There are some applications, though - biotech software, for example - that are written specifically against CUDA, but generally there are so few users that it would be crazy for AMD/Intel etc. to implement a translation layer just to support them.

It would be quite interesting if NVIDIA were forced, or chose, to license CUDA more broadly. I don’t think that will happen.

The biggest problems with e.g. ROCm were that a) it was late to market and b) it’s mostly Linux-only, afaik.

Honestly though, most people are agnostic about what’s happening under the hood anyway. You could offer AI transformers/training as a service and have the whole software stack underneath running custom code on some Google cloud tensor chip. It’s not like anyone will ask ‘is it NVIDIA? Is it CUDA or OpenCL?’.

I mean, when you type posts on this site you don’t worry too much about whether it’s AMD or Intel on the server, or whether it’s Windows, Linux, Mac or FreeBSD, right? DL training and inference are already very much at the stage where you can be that ‘blind’ to what’s going on underneath.

e.g. Do you know what software stack / drivers / hardware chatGPT runs on? Would you care?

4 Likes

No, I get your point, Luxmain, and I agree. But when Cisco first came out with their routers I did care whether it was Cisco or some other tech running - everyone wanted Cisco because it was easier to work on. Now, not so much. So is CUDA easier for the engineers using it to work with than the competing software, or does it not matter to them either?

Andy

1 Like

As far as AI goes, very few ordinary people are writing anything for AI directly in CUDA. Everyone is using a higher-level tool like PyTorch - or higher still, frameworks, web services or whatever that do the PyTorch-y stuff for you - and then PyTorch does the CUDA/ROCm/whatever underneath.

All that work is already done; you wouldn’t bother doing it again unless you were studying it for fun, or doing something very novel, e.g. research into AI techniques themselves.

From the end user’s or ordinary programmer’s point of view, it makes no difference at all what is running underneath PyTorch.
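
For example (a minimal, generic sketch - none of the names below are vendor-specific): a typical training step is written entirely at the PyTorch level, and the only place the backend appears at all is the one-line device choice.

```python
import torch
import torch.nn as nn

# A generic training step: nothing here is CUDA-, ROCm- or TPU-specific.
device = "cuda" if torch.cuda.is_available() else "cpu"

model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 2)).to(device)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

inputs = torch.randn(8, 32, device=device)          # dummy batch, stands in for real data
targets = torch.randint(0, 2, (8,), device=device)

optimizer.zero_grad()
loss = loss_fn(model(inputs), targets)
loss.backward()
optimizer.step()
print(f"loss: {loss.item():.4f}")
```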

2 Likes

I don’t know if you read my other post about NVIDIA, but the real issue isn’t the growth. The issue is that the stock is so wildly overvalued that almost no amount of growth would make up for it.

Consider: the price-to-sales ratio is 35x at today’s price

Suppose their (very expensive) workers work for free, their (extremely expensive) suppliers charge nothing, and they never pay taxes again

they’re still on a P/E of 35x in that magical dreamland

the tech is irrelevant

the story is irrelevant

the hype is irrelevant

if the price was $50-$100 we could have a meaningful discussion about whether the tech and hype would justify a position despite it being a bit pricey

around $400? pure insanity

imho, of course
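
To spell the arithmetic out (a rough sketch: the 35x price-to-sales figure is the one quoted above, and the lower margin values are purely illustrative):

```python
# Implied P/E from a price-to-sales ratio: P/E = (P/S) / net margin.
price_to_sales = 35.0  # the multiple quoted above

for net_margin in (1.00, 0.50, 0.30):  # 100% = the "magical dreamland" case
    print(f"net margin {net_margin:.0%} -> implied P/E {price_to_sales / net_margin:.0f}x")

# net margin 100% -> implied P/E 35x
# net margin 50% -> implied P/E 70x
# net margin 30% -> implied P/E 117x
```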

4 Likes

I did, but I was more curious about your take on the tech than the valuation, since you were being recruited for AI. I realize that people are getting excited by AI and NVDA, but they are showing that the data center part of their business is starting to take off. If you are correct and it’s only 3 quarters or so of growth, that could be a problem; but if it’s 10 years of growth, that makes it more interesting. Maybe not at this price, but everything fluctuates.

Andy

2 Likes

Looking at growth, NVDA has been doing well. Do you expect it to significantly slow over the next 3-5 years?

DB2

3 Likes

But let’s not entirely forget crypto. It is on a four-year cycle, which is shifting backward over time.

Very curious: how is that playing out with PlayFab, UE5 and Unity? Is there more streamlining?

The last important thing to look at: our global economy is not doing so well. Valuations are forward-looking. After this poor economy we will go into better times. If this is now… what are the next ten years?

ditto

2 Likes

Unity is useful but can also be a nightmare for game devs, because it notoriously requires a whole separate team just to keep up with Unity’s arbitrary changes of direction in how things work. “Unity” isn’t a single thing; it’s a series of releases of a game development platform that are sometimes significantly different from one another.

Although I suspect Unity’s popularity, particularly among indie devs, has probably reduced the number of ‘random graphics glitch’ problems in games that use its framework, the overall number of games being produced in recent years has increased, and so have the number of frameworks for building them and the number of versions of those frameworks.

The real issue my post was referring to, though, is ‘recent history’ gaming - games from roughly 2000-2020 - which are sometimes still very popular but still require GPU acceleration to play.

The surprising thing is that NVIDIA and AMD have had to patch around game bugs themselves in their drivers (e.g. recognising which game is running and adapting how the driver or GPU behaves) in order to get good performance without glitches. You would think the game devs would have to do that themselves, but doing it for them has provided a clever moat for NVIDIA over the years.

3 Likes