NVDA: not awesome news, nor unexpected


I haven’t looked into this in more detail, but it looks like these cards will put pressure on Nvidia’s Tesla cards. The scalability and power-consumption factors will be increasingly important.

I don’t own Nvidia anymore, but this would be a significant concern if these cards (and presumably others) hit scale.



I’ve been concerned about this trend for a while now, and I’m glad I sold out of NVDA late last year. The article is correct that the GPU is, by accident, good at AI and ML. But there is a lot in the GPU architecture that is required for rendering pixels yet has zero value for training an AI. I even told an Oracle hardware VP a few years back that the GPU is not ideal, it’s just much better than the CPU. It is worth noting that Apple’s A12 Bionic chip has a neural network engine rather than yet another GPU instance. This is why we have Tensor cores and TPUs, why some go for FPGAs, etc. Heck, Mythic is even going with analog circuitry for inference use cases (don’t laugh, it makes sense). We are witnessing the decline of the GPU, people.


Wow, this is great, you guys.

Did either of you catch the news of some new chip design in Britain? There was a story on Bloomberg Tech the other weekend about a new chip design that somehow dispenses with calculating to the hundredth of an inch in cases where a coarser scale is sufficient, as in automated driving.

They thought it looked really promising and I can see where it would be, but if you have any thoughts…


We are witnessing the decline of the GPU, people.


Over the last decade there has been a deep-learning revolution as researchers discovered that the performance of neural networks keeps improving with a combination of deeper networks, more data, and a lot more computing power. Early deep learning experiments were conducted using the parallel processing power of consumer-grade GPUs. More recently, companies like Google and Nvidia have begun designing custom chips specifically for deep learning workloads.

Since 2016, Autopilot has been powered by Nvidia’s Drive PX hardware. But last year we learned that Tesla was dumping Nvidia in favor of a custom-designed chip. Monday’s event served as a coming-out party for that chip—officially known as the Full Self-Driving Computer.

Each chip has two compute engines capable of performing 9,216 multiply-add operations—the heart of neural network computations—every clock cycle. Each Full Self-Driving system will have two of these chips, resulting in a total computing capacity of 144 trillion operations per second.

Tesla says that’s a 21-fold improvement over the Nvidia chips the company was using before. Of course, Nvidia has produced newer chips since 2016, but Tesla says that its chips are more powerful than even Nvidia’s current Drive Xavier chip—144 TOPS compared to 21 TOPS.
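The quoted figures can be sanity-checked with back-of-envelope arithmetic. A minimal sketch below, where the clock rate is my assumption (roughly 2 GHz, not stated in the excerpt) and each multiply-add is counted as two operations, the usual TOPS convention:

```python
# Rough check of the 144 TOPS claim from the quoted figures.
# ASSUMPTION: ~2 GHz clock; the excerpt does not give a clock rate.
MACS_PER_CYCLE = 9_216      # per compute engine, from the article
ENGINES_PER_CHIP = 2
CHIPS_PER_SYSTEM = 2
OPS_PER_MAC = 2             # one multiply + one add
CLOCK_HZ = 2.0e9            # assumed

tops = (MACS_PER_CYCLE * ENGINES_PER_CHIP * CHIPS_PER_SYSTEM
        * OPS_PER_MAC * CLOCK_HZ) / 1e12
print(f"{tops:.0f} TOPS")   # ~147 TOPS, in line with the quoted 144
```

So the headline number is internally consistent with the per-cycle figure, give or take the exact clock speed.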


One very powerful reason to sell: you sell even the Gorilla if the Gorilla becomes disrupted.

Let’s take AOL. AOL was the king of dial-up. They went down the tubes when the next platform, broadband, became mainstream. We talked about this in real time on the Fool, so this is not retrospective. You could see and predict it in real time. The Fool did not have such a sell rule. Nevertheless, it works almost every time.

When the PC went to the internet, Microsoft had to take out Netscape because the internet was an existential threat to Microsoft’s OS dominance. This bought Microsoft time, but when the world went mobile the value of the OS dissipated and Microsoft went into decline. Unlike many others, Microsoft was able to reinvent itself and is better than ever, but no one could have predicted that.

Same thing happened to Intel as they missed out on wireless.

Here with NVDA the GPU is being disrupted. Nvidia dominates the GPU. Nvidia will not have the same competitive advantage if this disruption becomes mainstream.

Thus, one rule of selling: disruption. Don’t hang on to the past, as with each new technology platform there is almost always a new leader, with the former leader struggling to stay relevant.

This appears to be a real risk for Nvidia now. I know many are holding for a turnaround, but the risk increases when a truly disruptive product is actually coming to market and being adopted in the real world. It remains to be seen how effective the new tech will be and how Nvidia will respond, but it certainly increases the risk to Nvidia and, at least to me, makes holding it less desirable than it otherwise would be.



You could see it in real time as well. I sold out of NVDA early last year. It was time to move on. I stupidly played earnings in October, buying back Nvidia for a day or two and losing 20% on the earnings debacle (I thought things just could not be so bad; breaking my own rules to do so was a sharp reminder of why you stick to them), but yes, in real time it was identifiable from the peak of its share price.

Did the same thing with Arista, again at its peak, at near the same time, for reasons discussed ad nauseam.

Point being, nothing is perfect (except maybe Alexander the Great’s battle record, and Caesar and Constantine didn’t do so badly either), but with good efficacy you can spot such disruptions early enough.

Arista’s was very easy to see when results did not match the narrative, but we got a nice bounce into the $300s when it made the S&P 500.

The Fool does not like selling; I also sold SHOP (too early), but so what. I got rid of the other soon-to-be dogs (not disrespectfully, but just to get to the point) and bought into other investments. Selling SHOP did not meet my sell rules, but SHOP always made me nervous. So you sell one too early but get rid of the others at their peak. I did add to Mongo, as an example.

It does create the hypothesis that rules to sell are as important as rules to buy, which speaks to the greatest criticism of Fool methodology. Of course, everything I buy I buy to hold forever, but forever comes to us in different time frames.

I did not really screw up on Nutanix, other than buying into too many indirect statistics. The one that stood out to me, however, was market-share gains. But that lasted one whole quarter and then got demolished. So, lesson learned: avoid the gobbledygook. If you cannot even define what the apple is, much less compare apples to apples (as you simply cannot do now), then why deploy valuable resources to an entity that cannot, even in a straightforward manner, define what the apple is to begin with?

Duma and I and Phoolio had an excellent discussion on this. Some difference of opinion from Phoolio, but right or wrong, Duma and I appear to be in agreement. It is not so much Nutanix as what you could otherwise use the resources to invest in.

If anyone can truly define the true apple that one can compare YoY - so we have a true apples-to-apples comparison that allows us to understand, in a straightforward manner, that Nutanix is not really selling less, that it is just a mirage of accounting - please let me know. Bert tried…in my mind, despite creating enthusiasm and doing his usual great job, I think he failed to make his point in a convincing manner when you dig into the details.

Opinions differ of course.




Great info. I too sold my NVDA position.

Yet the stock price is up over 5% today. So who is missing what?

The important thing with a lot of “Nvidia Killers” is the context of what they put out. I’ve seen a lot of cherry-picked data with key details hidden away in footnotes that completely changed the claims being made.

I don’t think Habana has released any data about the chip outside of a graph with claims and no context. I find it odd to claim that Volta doesn’t scale much past 16 GPUs when Summit (among others) runs 26,000+ of them in parallel to achieve greater-than-exascale deep learning.

No position in Nvidia for a long time, but it wasn’t because somebody had created an “Nvidia Killer”. Still haven’t seen one. Maybe Habana will be it. But they have not released any proof that it is.


If there is a lesson to learn here, I think Habana and Tesla are pointing the way. The takeaway: anyone with a need for large quantities of AI chips will be able to develop custom solutions that achieve higher performance at lower cost. Tesla strongly suggests there is no market for Nvidia in autobots. The hyperscalers/cloud titans are undisputed experts at driving extreme hardware commoditization, so any market for Nvidia silicon there has a limited shelf life. It was a happy accident that GPUs happened to work pretty well for AI training, but GPUs are not the optimal solution. Or so it seems…


Tesla strongly suggests there is no market for Nvidia in autobots.

And then again there’s this from today…


Nvidia is proving why it’s one of the leaders in the autonomous driving movement…Nvidia has been steadily increasing its partnerships over the past few years. Most notably, the company continues to increase its business with Daimler (DDAIF) , the parent company of Mercedes-Benz. And when Nvidia announced that its DRIVE Constellation simulation platform was available to customers back in March, it also announced a partnership with Toyota.

Don’t know that I’d trust Tesla’s view of Nvidia’s market potential.

I’m not saying NVDA should be a top 10 holding, but many here seem to feel Nvidia is dead and should be buried…a little early for that.


This type of stuff is all old news to me. I mentioned the risk that the GPU was not “good enough” back in 2017, when Chamath Palihapitiya mentioned on CNBC that there was increasing competition in AI hardware as all the large tech “hyper-scalers” move into AI.

I believe Chamath Palihapitiya said this exact quote: “Artificial Intelligence will never be a single vendor environment in three to four years.”


I was told by most people that this was untrue and that Nvidia had no competition…but as with everything in investing, “It is not where the puck is but where the puck is going”, and it only took very light reading to understand that GPUs were flawed for long-term use in AI. GPUs are better than CPUs, but GPUs still have flaws…for one, GPUs use too much power and are inefficient.
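The power argument is really about performance per watt, not raw throughput. A minimal sketch below; the figures are illustrative assumptions (not from any post in this thread), chosen only to show why TOPS alone is a misleading metric:

```python
# Illustrative perf-per-watt comparison. The numbers are HYPOTHETICAL,
# stand-ins for "a high-end training GPU" vs "a purpose-built accelerator".
def tops_per_watt(tops: float, watts: float) -> float:
    """Throughput per watt -- the figure of merit for AI silicon."""
    return tops / watts

gpu = tops_per_watt(tops=125, watts=300)   # assumed GPU figures
asic = tops_per_watt(tops=144, watts=72)   # assumed ASIC figures
print(f"GPU:  {gpu:.2f} TOPS/W")
print(f"ASIC: {asic:.2f} TOPS/W")          # ~5x advantage in this illustration
```

The point of the sketch: even if two chips advertise similar peak TOPS, the one that burns a fraction of the power wins at datacenter scale, where power and cooling dominate operating cost.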

There is a whole host of things coming down the pike that will likely supersede even what Habana Labs and many other ASIC players are currently doing. There are neuromorphic chips that might arrive in 4 to 7 years: https://amazingherald.com/neuromorphic-chip-market-increasin…

Then there is Alphabet, which is investing in and experimenting with several different things, like

  1. Optical AI chips - https://www.cnbc.com/2019/02/25/alphabet-gv-invests-in-light…

  2. Creating the next generation of artificial intelligence applications on a new hardware and software platform built entirely from scratch: https://www.forbes.com/sites/jilliandonfro/2019/04/01/samban…

Last but not least…where things might ultimately be headed over the course of 20 to 40 years is biological AI chips: https://face2faceafrica.com/article/nigerian-neuroscientist-…

I was mentioning this stuff in 2017/18, but most of the groupthink on the Fool paid service’s board still mostly believed “GPU Über alles”, so I stopped arguing it. I simply sold my Nvidia and waited for the inevitable to become obvious. When it did, the realization that Nvidia was not going to become the “Intel of AI chips”, along with other parts of Nvidia’s business weakening at the same time, led to the big price drop in Nvidia, but I was well out by the time it started dropping.

Why did I believe it was “inevitable” that GPUs were not the way to go? Well, it only took a little bit of Googling and reading articles by experts on the subject to discover that GPUs had serious flaws as any type of long-term solution for AI applications.

I will give people a clue when looking at AI hardware for the future: it will be a race for AI chips to become increasingly more powerful, faster, and less energy-intensive. I would also not “fall in love” with any one company or one technology in the space, because things should move very fast, with one technology at the top of the heap today and that same technology obsolete in 3 to 5 years. Anyone who invests in the chip space has to keep a close eye on what is going on.

If our scientists truly want to build some really sci-fi stuff, like Data on Star Trek, then they will eventually have to build an extremely powerful computer that consumes maybe as much power as the human brain (the human brain consumes about 20 W). Data on Star Trek did not have umbilical wires connecting him to a power source. In order to build a robot without umbilical power cords that can last for long periods of time, one needs either an immensely powerful and compact power source and/or an extremely low-power AI central nervous system. (A low-power way to move the robot’s “limbs” is also necessary…which is why I think we will eventually build biologically based robots or androids…robots that use hydraulics are far too power-intensive and will likely always require umbilical cords.)

Humans are far away from building a computer on the level of Star Trek’s Data, but that is probably a long-term aim, and a problem that was never really going to be solved by GPUs. Long term, our computers might not even be silicon-based.