NVDA: GauchoChris update after Q318

NVDA reported Q3 2018 results on November 9th. We got a quantitative update on the financials and a qualitative update during the conference call. I found the call very informative and extremely interesting. I'm very pleased with the results, but I'm even more optimistic about the growth that's coming for NVDA in the next few years. NVDA is participating in some of the world's fastest-growing markets, and it has a huge head start in addressing them.

If I could summarize why NVDA is so special, it's because the company has developed a brain that solves problems human brains cannot solve, some of which were previously thought to be unsolvable. The company increases the power of these brains with every new product introduction: the processors get more powerful, and the software (CUDA) that sits on top of the processors gets more powerful. This is the double exponential that CEO Huang talks about. The company has a 5-year product roadmap and thousands of engineers working on next-generation products; they are working on more than one (Huang said "several," which could mean 2 or 3), and if it's a 5-year roadmap, that means we can expect a new product every 1-2 years. On top of that, NVDA has 5,000 software engineers working on improving CUDA. Holy moly!!! Think about that for a minute.

NVDA already has a 7-year head start on the competition for machine learning and inferencing. How in the world could another company ever catch them??? Seems like a snowball's chance in hell. Even if some start-up came up with a superior chip architecture, it would still need the software to run it, the manufacturing to build it, the channels to get it to market, and a legion of developers to build on top of it. Any one of these would take years and is not simple, and all of them together make it virtually impossible for a competitor to catch up.
We would see a new competitor coming from a mile away before it affected NVDA's business. Meanwhile, NVDA will keep advancing its products, its technology, and its software, while growing its footprint in fast-growing markets and building its war chest of financial resources. NVDA could also simply buy such a company while it is still small; the cash outlay would not even dent NVDA's balance sheet. So that leaves the existing competitors: AMD and Intel. NVDA has a 7-year head start. AI, autonomous driving, and robotics are markets that will be taken between now and the next 7 years. A 7-year head start means the game is already won.

This is why I have a large allocation in NVDA. It is why, despite the rise in share price, I have not sold any shares and I don't intend to sell any. NVDA is at the center of the AI revolution, which is probably the single largest impact event/technology in the history of humanity. NVDA is a $130B company now, and I usually only invest in smaller companies because I want to be in companies that have a chance to grow in value by 10x, so NVDA is an exception. AAPL is still the largest company in the world at over $900B. Can NVDA grow 10x from here? I really don't know, but I think it is possible.

NVDA is currently distributing the brains that it makes into datacenters and into servers. It has now fully established the distribution to capitalize on this very fast-growing market. The datacenter business now has a $2B run rate, and it got there faster than any of NVDA's previous products. Now that Volta is launched and the distribution channels are fully in place (cloud and servers to seed the market with AI computing power), I believe this business may even re-accelerate from here. Huang said on the conference call, "We have primed the pump." The pump is now primed in ALL the major datacenter cloud providers and ALL the major OEM server providers.
We will see how they do, and I think it's going to be really, really good.

So now we have NVDA's brain products growing rapidly in centralized form (datacenters and OEMs). The AI-enabling chips are in the data centers and the servers; this is centralized. The next big opportunity will be to take all the machine learning being done centrally (in servers and in the cloud) and distribute it to individual devices for inferencing. It will be like putting brains into devices (decentralized). Every camera, every device that moves, and every device that needs to make a fast decision can get an NVDA brain. It may take a few years for this steamroller to get going, but once it does, it's going to be A LOT bigger a market opportunity than the brains currently going into the centralized cloud and servers. Every vehicle will get a brain. Every camera that's monitoring to keep the world safe will get a brain. Every drone. And who knows what other sensors will get them.

In summary, there's never been a computer brain more powerful than the human brain. NVDA is not just making it possible; it is happening right now. There's nothing else like NVDA.


15.6% allocation, plus 0.29 shares controlled through in-the-money call options for every share that I own, making my position with options more like 20.1%. But since the call options were paid for with short puts that have since expired worthless, my downside risk is less than actually holding a 20.1% position.
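The arithmetic behind that effective allocation can be spelled out in a few lines (the percentages are the ones stated in the post; this is just the multiplication made explicit):

```python
# Effective allocation = direct share allocation scaled up by the extra
# share-equivalents controlled through the call options (0.29 per share owned).
stock_allocation = 0.156   # 15.6% of the portfolio held directly in shares
options_ratio = 0.29       # additional shares controlled via calls, per share owned

effective_allocation = stock_allocation * (1 + options_ratio)
print(f"{effective_allocation:.1%}")  # prints 20.1%
```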


Matching the computational complexity of the brain won't happen in our lifetimes, if ever. Moore's law is dead. DNNs have a hard time simulating basic multicellular life. NVDA is great and all, but let's keep things realistic.



Matching the computational complexity of the brain won't happen in our lifetimes, if ever.

This has nothing to do with NVDA's future success. Even so, you are making an assertion that will probably turn out to be false in a few years. But, again, that completely misses the point. NVDA will put brains into devices, and these brains will have sufficient computing power and inferencing skill to do the jobs they need to do. Such jobs were not possible a few years ago, and many jobs that are not possible now will be in a very short timeframe. These jobs are incredibly useful and will generate tons of revenue for NVDA. Huang mentioned on the conference call that NVDA will probably make about $2,000 per self-driving car. If 20 million cars are sold per year, that equals a $40B-per-year revenue opportunity. NVDA's revenue run rate for its entire business is currently around $10B. Self-driving cars are coming in 2021 or 2022. I believe NVDA will make much, much more money from brains in devices, cars, robots, drones, etc. than it will from processing units sold into data centers and to OEMs for servers.
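The $40B figure follows directly from the two numbers cited (Huang's rough $2,000 per car, and an assumed 20 million cars sold per year):

```python
# Back-of-the-envelope size of the self-driving-car opportunity cited above.
revenue_per_car = 2_000        # rough NVDA content per car, dollars (Huang's figure)
cars_per_year = 20_000_000     # assumed annual vehicle unit volume

opportunity = revenue_per_car * cars_per_year
print(f"${opportunity / 1e9:.0f}B per year")  # prints $40B per year
```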

Moore’s law is dead.

NVDA’s CEO says the same thing, and it’s one reason why NVDA’s architecture is winning. If you listen to the conference call you can hear Huang say why this is true.

DNNs have a hard time simulating basic multicellular life.

Not sure how this is at all relevant to NVDA’s current and future revenue streams.



“Not sure how this is at all relevant to NVDA’s current and future revenue streams.”

I don’t know. You mentioned it in your first post, and then you told me, “Even so, you are making an assertion that will probably turn out to be false in a few years.”

I guess you are the expert. Best regards.



I may be right or I may be wrong on this point. But it doesn’t matter for my investment. What matters is how useful the applications that use NVDA’s GPUs will be. The chips will have specific tasks, such as taking in data and making navigation decisions (for the self-driving car example). For this example, what will matter is how fast the chip/brain can process incoming data, and then how good the brain is at making decisions based on this data together with past experience (machine learning training). NVDA’s Drive PX Pegasus can do 320 TRILLION operations per second!!! In addition, a given car’s brain will have been trained on billions of scenarios. By comparison, a human brain can process nowhere near that speed, and it is much, much, much more limited in how many scenarios it can be exposed to. This is why the brain in a self-driving car will make better decisions, make them faster, and execute them faster than even the best human brain.



GauchoChris, bound your enthusiasm! Progress is being made, but not at the rate you seem to have bought into. I’m an old engineer who spent almost 30 years developing array processors similar to, but orders of magnitude less powerful than, the current NVDA systems. I remember working on a proposal to DARPA (Defense Advanced Research Projects Agency) with a team of companies for the development of a specialized array processor for AI as far back as 1989. Believe it or not, the fundamental concepts haven’t changed dramatically in the last 30 years. The primary change has been the steady progress of more powerful hardware, which in turn led to better and more sophisticated software tools. NVDA is without a doubt the current leader; scanning the AI blogs, it is obvious that when anyone selects an environment to start a project, they invariably choose NVDA. It does indeed look like they have an advantage from their leadership in both gaming and ML. But I have seen many supposed advantages like this disappear in just a few years.
I in no way have a crystal ball, but it appears to me the big future profits will not come from the big mainframe AI systems that Google, Apple, Intel, IBM, and others are rushing to install in the big data centers, but rather from the smaller specialized units . . . self-driving trucks, cars, drones, robots, language translators, in addition to many items we haven’t even dreamed of. The problem for NVDA is that each of these products has enough potential profit that specialized hardware, with its own unique chips optimized for the specific problem, can be developed independent of NVDA.
I’m not suggesting that NVDA isn’t a buy; just don’t depend on it being another Apple.


Hi Chris, great write up.

I have only one small caveat. You wrote:

But since the call options were paid for with short puts that have since expired worthless, my downside risk is less than actually holding a 20.1% position.

I believe you know better than that. Your short puts have already expired and you have that money. Your risk runs from now, not from back when you originally took the position. If you have the equivalent of a 20.1% position now, your risk is that of actually holding a 20.1% position, because, that is what you are doing.

That’s the way I see it, anyway.

Thanks so much for the write-up!



You asked: How in the world could another company ever catch them???

I believe that if they were to be caught, or passed, it wouldn’t be by anyone following in their footsteps, but by someone taking a different approach altogether. Nvidia has seven years, productive years, of following one path. It may not be the only path, and it may not be the best. Someone else may innovate their way into the lead.

I am not taking anything away from what they have accomplished, and I think their momentum is great, but I thought your question deserved a response. Me, I’m long NVDA and very optimistic.


I believe you know better than that. Your short puts have already expired and you have that money. Your risk runs from now, not from back when you originally took the position. If you have the equivalent of a 20.1% position now, your risk is that of actually holding a 20.1% position, because, that is what you are doing.


You are correct. Almost. I now have 2 separate risks: the risk of losing money if the shares drop, and the risk of losing value in my options. The call options have $150, $160, and $170 strike prices, so the downside risk of these options is a bit different than simply holding an extra 4.4% allocation of stock. On these options, there is the risk of decaying time value and the risk of losing the in-the-money portion of the value. There is also the risk of a temporary stock price drop within the timeframe until option expiration in January 2019. On the other hand, my maximum downside on the options is less than if I owned shares instead. The upside profile looks different too. Yes, I could sell the 4.4% of share-controlling power in the options to buy shares, but my share allocation would then be well below 20.1% because the options have less value than the shares.
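A hypothetical illustration of the maximum-downside point (only the $160 strike comes from the post; the share price and time value are made-up numbers for the example):

```python
# Why a long call caps downside relative to holding the shares outright.
# Hypothetical numbers: $210 share price, and a $160-strike call bought for
# its $50 intrinsic value plus an assumed $5 of time value.
share_price = 210.0
strike = 160.0
call_cost = (share_price - strike) + 5.0   # intrinsic value + assumed time value

max_loss_per_share_held = share_price   # shares could, in theory, go to zero
max_loss_per_call_share = call_cost     # a long call can lose at most what it cost

print(max_loss_per_share_held, max_loss_per_call_share)  # prints 210.0 55.0
```

With these assumed numbers, the worst case on the call is a $55 loss per controlled share, versus $210 if the shares themselves went to zero, which is the sense in which the maximum downside is smaller.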



I’ll start out by saying I’m long NVDA.

But, I’m not nearly so sanguine about the future of the company. As usual, the future remains damnably hard to predict.

Let’s start with the roadmap. I worked at a company that built products with some of the longest life cycles of any industrial product sold (I exclude jewelry and musical instruments, some of which remain serviceable for centuries). This company maintained a 20-year roadmap. I worked there for 30 years. At one point I had to gain in-depth knowledge of roadmaps and their functions, as I was in the position of evaluating roadmapping software (yes, there are s/w products for this purpose).

A roadmap is at best a guess; it is not a commitment to bring specific products to market. One of the main functions of a roadmap is to document and track outside technologies and events that impinge on the potential development activities of the company. For example, if you want to build zero-carbon-impact airplanes, you are highly dependent on the development of alternative fuels. They might come from a farmed plant, from algae, or from some other source, but wherever they come from, the development of the fuel is not part of your development path, even though you are 100% dependent on it to achieve your goal.

Then there’s the market for your products. You might anticipate a market at some point in the future, but it might not materialize, or at least not in the manner you expect. The company I worked at was at one point spending $1M a day developing a product that was never brought to market. This happened more than once while I worked there.

What might upset the apple cart for NVDA? Well, if I actually knew, then I could at least stay alert to potential warnings, but one example that comes to mind is purpose-built processors. NVDA does design and build for applications, but the same GPU is at the heart of all of it. Their gaming, crypto, autonomous vehicle, and cloud markets all revolve around the same GPU. I can see the possibility of each of these spaces getting upset by a purpose-built processor that addresses the singular function better and more economically than NVDA.

And the notion of an NVDA brain in just about everything that moves (or in some cases remains stationary). Why put an expensive brain in something when a much cheaper one is good enough?

A moat so great it can’t be breached? Think about Cisco, Intel, AOL, Yahoo, DEC, Novell, MySpace, Burroughs, Vector Graphic, Sun, EMC, Compaq, CDC, Four-Phase, Wang, SGI, Tandem, etc., etc. In my lifetime the world is already littered with dead or nearly dead technology companies that once had an “insurmountable” moat. NVDA’s vulnerabilities may not be obvious, but don’t for a minute think they are some kind of Goliath that can’t be slain.

You might be confident that the management team is sharp, they will see threats before we do and react appropriately. Don’t bet on it. First, they are human. So far, NVDA is not using their own h/w & s/w to make key company decisions. Humans make mistakes. Humans get sick. Humans get into accidents. Humans sometimes change their life goals. Humans get hurt. Humans are not entirely reliable.

Just keep your eyes open - a fall from grace is inevitable. Creative destruction is the way of the world and it operates at an accelerated pace in the technology space.



I disagree somewhat with your analysis. First, one of Nvidia’s strengths is that no one else can keep up with its product cycles. As an example, AMD’s new chip is both slower and more power-hungry than the chip that Nvidia rolled out last year, much less the new chip Nvidia will roll out early next year.

Second, their product cycle is far shorter than what you are describing.

Third, you are ignoring the entire ecosystem of hardware and software, intertwined in a complex fashion, that huge organizations around the world are standardizing on. A slightly better or merely equal technology will gain no traction against that.

The only way to materially upset Nvidia’s market share is to create a disruptive technology that is so much better that you are compelled to bear the cost of disrupting a standard in which you have invested millions and millions of dollars in software, hardware, maintenance, and people training.

This applies to AI training. The market is still open for data center AI inference, for autonomous driving, and for IoT, as those markets are still nascent and no one has established dominance or a standard there, although Nvidia is clearly the company with the lead in all these fields (with Google a strong rival in autonomous vehicles).

So what to watch for is how each of these markets develops. Does Nvidia do for inference what it did for training in the data center? Likely so, but not a given.

Does Nvidia dominate in autonomous driving? In drones? In IoT, in VR, in AR? We shall see. But each new market is another vertical that has yet to take off and therefore is still ripe for a new dominant player. These are things we can watch.

Finally, Nvidia is quite frank about the fact that it does not want 100% market share. It simply wants the vast majority of the market share that is most profitable, similar to how it dominates the gaming graphics industry. AMD is probably losing money on GPUs in the low to middle end of the market, whereas Nvidia is making a killing in the middle to high end. So why produce for the low end? Leave it to Intel and AMD.

So don’t panic if Nvidia does not have 100% market share of everything. But watch as each vertical rolls out and starts to take off, and see where Nvidia stands in each, one tornado at a time.



Hi brittlerock

I’m a fairly dedicated follower of technology, as well as being… somewhat knowledgeable about the market (Greg’s Rule #1: someone who states they’re knowledgeable about a given market should be completely ignored if they ever opine about said market).

I’m familiar with almost all of the companies you mentioned, and a decade or so ago I would have agreed with you completely. Now… not so much.

What I see in the tech space is some giant tech companies having an insurmountable lead. If their business is tech, I struggle to see any way they could be caught.

Google is an example. Any company that becomes a real threat to their business, they will just buy. Facebook is similar. Amazon… well, take AWS. Sure, other suppliers (giant tech companies) can enter the market, but no newcomers can. The giants will split the market as they see fit.

I’m seeing a tech oligopoly forming, without any real potential for a “fall from grace”. So yeah, I think there are “moats so great they can’t be breached” in tech. Any potential competitor is simply purchased before it gets big enough to challenge.

Whether NVDA is one of those…? I’m not sure. The investment to catch up would be more or less insurmountable for almost every other competitor. The only real danger I can see is if Google et al decide to make their own chips, or potentially quantum computers becoming more useful and less error-prone.

The one thing I’m sure of is that the number and complexity of chips in the world tomorrow will greatly surpass the number today. That’s a rising tide that is almost certain to lift the Nvidia boat.

P.S. Don’t forget Rule #1!


The only real danger I can see is if Google et al decide to make their own chips…

Google is already doing so. Their custom TPU chip powers AlphaGo Zero, the strongest Go player in the world. Apparently Google is using the chip in production for several projects, including search ranking (RankBrain).

Nvidia’s real moat is their software. As said earlier in the thread, CUDA is the default language for AI development.


Yes, Google has been using its own tensor chips. But Nvidia’s new flagship chip for inference combines tensor cores with the GPU, so no real advantage there.

Google is manufacturing the chip (outsourced, of course) only for itself. But with Nvidia now producing tensor hardware on a forward-looking roadmap, will Google continue to manufacture its own chip?

It comes down to: how many different technology roadmaps does Google want to maintain for itself? Google is in multiple high-growth, very high-tech markets that require a lot of outside technology. For some of these technologies, Google has been ahead of the curve and has devised and manufactured its own tech.

However, each piece of this tech needs to have a roadmap, R&D, and everything else that Nvidia or another company needs to do to stay relevant moving forward. Does Google want to invest this much effort perpetually for its own internal needs?

I have often heard that many of these insourcing projects are done to give outsource vendors guidance on what Google or AWS or Azure, would like to see available to them. Then they will mostly rely on the outside vendor thereafter.

But who knows, some of the technology is more business critical than others.

But no, I do not see Google’s tensor chip as a discontinuous innovation that will materially impact Nvidia. Google does not sell its chips. Google’s software systems are proprietary and internal, and I am sure Google does not want to build out and give away an infrastructure of software and processes that would let competitors emulate Google.

Regardless, Google is still an Nvidia customer, just like they are an Arista customer, although they do produce or buy many of their own generic “white” boxes for particular functions.

Quantum computing? Quantum computers are nowhere near anything but lab experiments at this point in time. And quantum computers have limitations on what they can do and what they are good for. When the technology matures it will probably find an under-served niche. And then, after that niche is filled, if the technology is capable of further maturity, it may stretch its tentacles outward to other functionality. But we are not even to the warm-up pitches before the first inning yet with regard to quantum computing.



Google doesn’t sell their tensor chips, but they do let people “rent” access to them: https://ai.google/tools/ They are building the ability to do machine learning into many of their service APIs (IoT, machine learning, etc.). They are approaching the market in a fundamentally different way than Nvidia, but their markets definitely overlap.



Google does not want to build and give away its internal systems so that it can build an infrastructure of software and processes so that competitors can emulate Google.

That may be all that needs to be said about supposed direct Google competition.

It’s too early in the game to be sure about how long Nvidia dominance will last, but the odds are it will last for at least several years. And investing is mostly about probabilities.



There are some conflicting numbers on market share. What’s not disputed is that Amazon is utterly dominant, with around 40% market share; the next 3 or 4 companies combined have less market share than Amazon.

In some estimates IBM is #3; in others Google is #3, with the other being #4.

Google uses its TPUs for inferencing. Google states that its next-generation TPU will also be useful for training (TBD). But either way, Google is a small portion of the entire market. And Google is an Nvidia customer.

What I found interesting is that Alibaba, a big Nvidia customer, is building out 4 new data centers as we speak, trying to become the other Amazon. In fact (this is from memory), I believe somewhere between 230 and 400 new data centers are being built out over, I forget the time frame, but think 5 years; it is probably the lower number. Of these, a very small percentage will be Google’s, and again, Google will continue to use Nvidia GPUs. It may offer services on its TPUs for specific purposes, or simply as a marketing option.

However, most AI professionals have been trained primarily on CUDA and the Nvidia infrastructure, and thus will demand that environment. Think about it: if you were a Microsoft person on the desktop, with no experience programming or using Apple, but your cloud service substituted Apple for Microsoft, would you take the time to learn a niche system, or simply transfer to a competitor based on Microsoft? (Myself, I switched to Apple from Microsoft, and I will never go back unless Apple totally screws up, but I only did so after Apple became much more than a niche product.)

Unless there was a real reason to switch, you’d simply go to a Microsoft cloud. Google will have the same issue: Google is a niche, and its TPU is a niche. Meanwhile, Nvidia’s tensor-core-plus-GPU product is in every data center of any importance in the world that is used for AI training, with, as of last count, 1,200 large customers now using it for inference.

As such, I do not see it as much competition. However, if Google starts catching up to Amazon due to overwhelming demand to avoid Nvidia solutions in favor of the Google TPU (which as of now is used for inference, not training, though that may shortly change), then it is something worth a closer look.

If that happens, we will first see it in Nvidia’s declining share price. It won’t be a big crash, but a slow and frustrating descent as Google’s dominance grows, diluting Nvidia’s CAP.

Anyway, those are my thoughts on the issue, in some more detail, as they relate to Google’s TPU.



While I certainly get the ecosystem argument, I can’t help but think that a comparison to Microsoft is off the mark. Windows was a compelling ecosystem because it achieved a dramatically large consumer market share; i.e., the success of Windows created a market for products directed at users of that software which was much larger than anything else. But with AI applications … which are much broader than autonomous driving … the “market” is really the companies providing devices that include these functions. If someone came along with a significantly superior solution for any particular AI application, all that would have to change for it to be adopted is a relatively small number of developers … not the consumers, since they deal with the system as developed, not the development.

I am not questioning that, right now, Nvidia has an incredibly strong position … but that ascribing to them the gorilla-like quality of Windows may be ill-advised since a very different technology could well come along that is substantially superior … particularly for some specific application … and that all that would have to shift for that to become the new solution for that specific application is a relatively small developer community.


I wonder if, sometime, somewhere, Amazon’s take-no-prisoners mentality in various retail verticals may come back to bite them in AWS. If I were Walmart, I would take every opportunity to encourage my vendors to use a cloud entity other than Amazon’s. And if I were a drug store, auto parts retailer, grocer, or cosmetics shop, I’d be damned if I would use AWS. Just a thought.