Question for NVDA bulls

Suppose Intel ends up being a serious competitor in the autonomous car market and takes meaningful market share. How does that affect NVDA's valuation? I assume the bull case for NVDA is that it pretty much owns all the market share in the autonomous car market.

Kingran,

Nvidia being a dominant player in the autonomous car market is clearly an expectation. If Nvidia were to become an also-ran in that market, it would certainly hurt longer-term price appreciation. That is an enormous market, and autonomous vehicles are the largest aspect of edge AI, one that also translates back into the data center.

However, those expectations are still years out. NVDA is now selling at a forward enterprise-value-to-earnings multiple in the low 30s, if not lower depending on how much higher earnings expectations rise (analysts had NVDA at 5% growth, if you can believe that, and they have had to ratchet up their earnings forecasts again).

At this valuation, autonomous vehicles appear to be a twinkle in the eye of the stock, but hardly anything is otherwise built into the share price. When autonomous vehicles start rolling out en masse, Nvidia will be the initial leader in the field (even if someone else later overtakes them), and that will start a new stock price surge, one that may wither a bit if Nvidia thereafter disappoints.

But for now, with the market so nascent and still in front of us, it will not be an issue for Nvidia until things in that market become more mature.

Nvidia may be the standard in the data center, and a near monopoly in high-end graphics, but the leadership race for autonomous vehicles has hardly even started, and there is no guarantee Nvidia will win there as well (in Vegas they would be the odds-on favorite, no doubt). I just do not see how this future market is baked into the Nvidia share price at this point in time. And again, when autonomous vehicles start to roll out en masse, it won't matter at the start, as adoption will be quite tornadic for anyone with a material position in the market.

Tinker

3 Likes

Suppose Intel ends up being a serious competitor in the autonomous car market and takes meaningful market share. How does that affect NVDA's valuation? I assume the bull case for NVDA is that it pretty much owns all the market share in the autonomous car market.

I don’t think that NVDA’s future is dependent on the autonomous automotive market (automotive is broader than just cars, since it includes trucks, forklifts, robots, and anything else that moves without a human or animal supplying the energy). Automotive gets a lot of the press and attention currently, but NVDA is making a lot more money in gaming and the datacenter right now. Automotive probably won’t kick in in a big way for several more years.

I would point you to the number of partnerships that NVDA has in the automotive sector: more than 320 partners, up from 225 a year ago. How many partnerships does Intel have, and who do you think these partners are going to buy brains from when they start producing autonomous machines?

I am invested in NVDA, not for the gaming or even the datacenter business, but for all the brains in a metal box that this company is going to sell. Brains in cars, brains in drones, brains in cameras, brains in anything that will need to take in data through sensors and then make its own decisions. It’s all about the brains: the best brain that can make the fastest and best decisions.

Chris

24 Likes

I just do not see how this future market is baked into the Nvidia share price at this point in time.

I am looking at the multiple difference between INTC and NVDA on their 2019 earnings estimates. Even factoring in higher sales growth, NVDA is valued at almost twice INTC's multiple. If you think the autonomous car market is not in that figure, then what is the driver behind the multiple difference?

I am looking at the multiple difference between INTC and NVDA on their 2019 earnings estimates. Even factoring in higher sales growth, NVDA is valued at almost twice INTC's multiple. If you think the autonomous car market is not in that figure, then what is the driver behind the multiple difference?

I would compare operating margins and the growth rates of revenue and earnings. Also, look at how many employees Intel has relative to revenue (i.e., revenue per employee). I think I posted such a comparison a few months ago.
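A quick sketch of that revenue-per-employee comparison. The figures below are rough, era-appropriate approximations, not exact filings; substitute current 10-K numbers before drawing conclusions:

```python
# Illustrative revenue-per-employee comparison. Headcount and revenue
# figures are rough approximations for the period, not exact filings.
companies = {
    "INTC": {"revenue_usd_b": 62.8, "employees": 102_700},
    "NVDA": {"revenue_usd_b": 9.7, "employees": 11_500},
}

for ticker, c in companies.items():
    rev_per_employee = c["revenue_usd_b"] * 1e9 / c["employees"]
    print(f"{ticker}: ${rev_per_employee:,.0f} revenue per employee")
```

Even with an order of magnitude less revenue, NVDA generates meaningfully more revenue per head, which is one reason the market pays up for it.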

Chris

1 Like

Nvidia is the dominant market leader in the future of silicon. Intel is the incumbent, mostly in businesses built on legacy products in slow-growth or no-growth markets.

Intel had to spend something like 100x revenues on an acquisition just to get into the automobile game.

Market leaders always carry a premium. Growth carries a premium. The CPU, which of course is not going anywhere, is being displaced by GPUs, in that GPUs enable the use of fewer CPUs to achieve more results at lower cost.

There are a host of such reasons, and 30x current earnings is nothing as the AI markets continue to explode over the next 5 years.

Tinker

1 Like

Despite the hype, my understanding is that Nvidia’s revenue base has hardly any autonomous auto in it. Yes, they are in other aspects of auto (electronic management systems, navigation, infotainment), but while there are massive numbers of partnerships, there is very little silicon as yet.
A

1 Like

I have doubts about rapid take up of advanced autonomy in cars.

Firstly, the Waymo system, or any other using lidar, is too expensive for anything except limited commercial use.

Secondly, legacy car companies are likely to use it as a model differentiator, something to divert the public from realizing that their $50,000 car is really not much better than their $30,000 car. So it will be trickle-down.

Thirdly, car companies rely on surveys and focus groups to tell them what customers want. And consumers are not good at looking ahead; by definition, most are mainstream.

Look how long it took things like electric car windows to reach the mass market of mid- and lower-priced cars. And that was something people knew they wanted; they were just not willing to pay for it.
When I was young, car branding was important; Chevy drivers usually would have nothing to do with Fords. Today most cars are generic, and price is the main selling factor.

But past a certain point, autopilot will take off in the near-vertical part of the TALC (technology adoption life cycle). I just think that is several years away, so I don’t look on it as a lot of value added to NVDA in this bull market. OTOH, it falls into the “almost inevitable” category, better than the “probably” of most investments. And Nvidia’s continuing lead in autopilot falls into the “highly likely” category.

4 Likes

I don’t have any first hand experience with autonomous driving.

I have seen NVDA products work for deep learning.

In less than a week, one of the new hires at my company downloaded all the comments from a popular website, and created a fully automated thesaurus.

Type in car, get out automobile, sedan, etc. (not actual results)

Good, but commercial tools exist for this.

The real value comes in adding multi word phrases.

Type in fast car, get out stock car, NASCAR, Formula One, etc. (not actual results)

He provided examples for three and four word combinations too.

Because it uses actual comments, it includes misspellings and finds new phrases/words as soon as they gain traction.
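For the curious, the lookup behind such a thesaurus is typically a nearest-neighbor search over learned word and phrase embeddings. A toy sketch with hand-made vectors; a real system would train the embeddings on a GPU from the scraped comments, and every word and number here is illustrative only:

```python
import numpy as np

# Toy, hand-made embeddings standing in for the vectors a deep learning
# model would learn from scraped comments. Single words and multi-word
# phrases (e.g. "fast_car") share one vector space, which is what makes
# the multi-word lookups described above work.
embeddings = {
    "car":        np.array([1.0, 0.1, 0.0]),
    "automobile": np.array([0.95, 0.15, 0.05]),
    "sedan":      np.array([0.9, 0.2, 0.1]),
    "fast_car":   np.array([0.6, 0.9, 0.1]),
    "stock_car":  np.array([0.55, 0.95, 0.15]),
    "banana":     np.array([0.0, 0.1, 1.0]),
}

def synonyms(query, k=2):
    """Return the k nearest neighbors of `query` by cosine similarity."""
    q = embeddings[query]
    def cos(v):
        return float(v @ q / (np.linalg.norm(v) * np.linalg.norm(q)))
    ranked = sorted((w for w in embeddings if w != query),
                    key=lambda w: cos(embeddings[w]), reverse=True)
    return ranked[:k]

print(synonyms("car"))       # single-word lookup
print(synonyms("fast_car"))  # multi-word phrase lookup
```

The training step that produces real embeddings is the GPU-hungry part; the lookup itself is cheap.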

NVDA is the only tool that he would consider using for this application.

Very Bullish. Full position (represents more than 5% of my investment portfolio).

4 Likes

NVDA is the only tool that he would consider using for this application.

The deep learning SDKs are not provided by NVDA but by other partners. Of course NVDA provides the chips and a platform or framework, but not the tools themselves. So I am a bit lost when you say NVDA is the tool preferred by your developer.

I could be completely wrong here; if someone understands this better, I would like to understand the NVDA software/app/tool ecosystem.

https://developer.nvidia.com/deep-learning-software

NVIDIA Deep Learning SDK
The NVIDIA Deep Learning SDK provides powerful tools and libraries for designing and deploying GPU-accelerated deep learning applications. It includes libraries for deep learning primitives, inference, video analytics, linear algebra, sparse matrices, and multi-GPU communications.

===

Deep Learning Primitives (cuDNN): High-performance building blocks for deep neural network applications including convolutions, activation functions, and tensor transformations
Deep Learning Inference Engine (TensorRT): High-performance deep learning inference runtime for production deployment
Deep Learning for Video Analytics (DeepStream SDK): High-level C++ API and runtime for GPU-accelerated transcoding and deep learning inference
Linear Algebra (cuBLAS): GPU-accelerated BLAS functionality that delivers 6x to 17x faster performance than CPU-only BLAS libraries
Sparse Matrix Operations (cuSPARSE): GPU-accelerated linear algebra subroutines for sparse matrices that deliver up to 8x faster performance than CPU BLAS (MKL), ideal for applications such as natural language processing
Multi-GPU Communication (NCCL): Collective communication routines, such as all-gather, reduce, and broadcast that accelerate multi-GPU deep learning training on up to eight GPUs
The Deep Learning SDK requires CUDA Toolkit, which offers a comprehensive development environment for building new GPU-accelerated deep learning algorithms, and dramatically increasing the performance of existing applications
NVIDIA DIGITS™
NVIDIA DIGITS helps computer vision data scientists and engineers solve complex image classification problems. DIGITS lets you quickly design the best deep neural network (DNN) for your data—interactively, without writing any code—to reach state-of-the-art results with deep learning.
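To make the cuDNN bullet above concrete: the core primitive it accelerates is convolution. Here is a plain-NumPy CPU sketch of that operation, purely to illustrate the math; cuDNN's value is performing it at massive scale on the GPU:

```python
import numpy as np

# Plain-NumPy 2D "valid" cross-correlation, the core operation that
# cuDNN's convolution primitives accelerate on the GPU. This CPU loop
# illustrates the math only, not the performance.
def conv2d_valid(image, kernel):
    ih, iw = image.shape
    kh, kw = kernel.shape
    out = np.empty((ih - kh + 1, iw - kw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            # Dot product of the kernel with one image patch.
            out[y, x] = np.sum(image[y:y+kh, x:x+kw] * kernel)
    return out

image = np.arange(16.0).reshape(4, 4)
edge_kernel = np.array([[1.0, -1.0],
                        [1.0, -1.0]])  # simple vertical-edge detector
print(conv2d_valid(image, edge_kernel))
```

A deep network applies thousands of such filters across millions of images, which is why the GPU version matters.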

4 Likes

Now how is this different from Intel AI Academy?

What is the difference?

Nobody uses it.

I’m jesting a little here, but I try to constantly stay up to date on the AI/DL world because of my oversized investment in NVDA. I’ve never heard of Intel AI Academy, which means not a lot of people are talking about it. So that’s probably because it’s not in widespread use. IBM just used an NVDA machine to take a deep learning task that took 70 minutes a year ago down to 91.5 seconds. Audi took the road-sign dataset that took their system two years to learn and did the same thing on NVDA hardware in 6 hours.

Does Intel have anything like that kind of performance to tout? What does Intel bring to the table for running deep learning neural networks that Nvidia’s technology doesn’t utterly crush?

6 Likes

I’ve never heard of Intel AI Academy, which means not a lot of people are talking about it.

I am not working on AI, but I have known about Intel AI Academy. So there it is.

Does Intel have anything like that kind of performance to tout?

I think you should familiarize yourself with Intel’s offerings.

1 Like

Kingran,

Intel offers FPGAs and CPUs as its products for AI. Intel does not offer GPUs, and it would be foolhardy for them to do so unless they bought out AMD, if such an acquisition would ever pass muster with the Feds.

Azure is using FPGAs extensively in its data centers. FPGAs can do many things, but they are also much more technically difficult to work with. To date, GPUs have utterly clobbered anything that Intel has offered in AI.

Do you foresee FPGAs taking any material market share from GPUs in regard to AI? In regard to tensors?

If so, let us know, and why.

Tinker

2 Likes

A lot of Intel’s AI offering is CPU based. Through acquisitions, they do offer FPGAs. Azure runs CaaS instances on them. Amazon runs FPGA instances from Xilinx. Both of these companies are also building out CaaS based on NVDA GPUs. Azure recently added Pascal and now Volta. The buildout still has a long runway ahead.

https://blogs.msdn.microsoft.com/uk_faculty_connection/2017/…

This is from Nov 2017. Azure is just starting this out on GPUs. Is NVDA disrupting the FPGAs here, or the other way around? The total spend on cloud computing for AI is rising rapidly. Nvidia’s data center growth rate (105%) far exceeds the growth of the general AI data center market. Intel’s “Programmable Solutions” segment, where the FPGAs reside, grew at 10.4%.
I think the cloud titans will follow the money, and the money wants to run Nvidia.

And it makes sense. Last year it took you 70 minutes to do a task. Now it takes 90 seconds. Bam, on to the next thing. How much you can do in the hour you rented matters. And for the titans it’s important because you charge a premium for what’s in demand.
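Back-of-the-envelope arithmetic on those numbers; the hourly rate below is a hypothetical placeholder, not a quoted cloud price:

```python
# Cost per task at a hypothetical hourly rental rate, using the roughly
# 70-minute vs 90-second task times cited above.
rate_per_hour = 3.00              # hypothetical GPU-instance price, USD/hr
old_secs, new_secs = 70 * 60, 90

tasks_per_hour_old = 3600 / old_secs
tasks_per_hour_new = 3600 / new_secs
cost_old = rate_per_hour / tasks_per_hour_old
cost_new = rate_per_hour / tasks_per_hour_new

print(f"speedup: {old_secs / new_secs:.1f}x")
print(f"cost per task: ${cost_old:.2f} -> ${cost_new:.3f}")
```

Roughly a 47x speedup, so even at a premium hourly rate, the cost per completed task collapses. That is why renters, and the titans renting to them, follow the faster silicon.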

1 Like

<<<And it makes sense. Last year it took you 70 minutes to do a task. Now it takes 90 seconds. Bam, on to the next thing. How much you can do in the hour you rented matters. And for the titans it’s important because you charge a premium for what’s in demand.>>>

And as we saw from an article I linked to, from a third-party reviewer who did a detailed breakdown, even Google’s tensor offering (which Google intentionally mis-compared against a supposedly equivalent Nvidia GPU) is more expensive for the same result than using an Nvidia Volta.

It seems there is always room for different technologies to optimize different processes, but in general, for most things, nothing yet on the market outperforms Nvidia GPUs and their ecosystem.

The edge, which is still in its earliest stage, may be a different matter. That is TBD. But in the data center, the clear leader, and for good reason, is Nvidia.

Tinker

3 Likes

A lot of these GPU alternatives for AI/DL compute acceleration were developed in response to Nvidia’s Maxwell-generation Tesla cards. Microsoft Brainwave (using Intel FPGAs), many of the other FPGAs, Google’s TPU, etc. went into development a few years ago, when Maxwell was the best technology. Many titans offered only Maxwell for the data center until late last year. The Tesla P100 (Pascal) dramatically improved data center performance, and Nvidia became a major player in this space. That was 2 years ago. Only a year later they announced Volta. The goalposts were not just moved but shipped to a whole other town.

Take Google’s TPU. Not to poo-poo it, but it made lots of headlines when it first came out for being so much better than Nvidia. The problem was that they were benchmarking it against Maxwell. At the time they released the report, they were comparing it to technology two generations old. Two generations, but light-years different in capabilities. All the other tech has to recalibrate to what it is comparing against.

9 Likes

CORRECTION

The 10.4% growth for Intel’s Programmable Solutions (FPGAs) was in Q3 not Q4.

In Q4 that segment grew at 35% yoy. Quite a difference, but management gave a caveat on that performance: “Intel admitted that PSG benefited from purchases of ‘legacy’ products that are set to be discontinued, but added the segment also saw ‘strong double-digit’ growth for end-markets such as data centers and cars.”

https://www.google.com/amp/s/www.thestreet.com/amp/story/144…

So segment growth ballooned to 35% in Q4, but that appears to be customers scooping up older-generation products that Intel probably put in the bargain bin to clear out inventory.

We will need to see at least Q1 to get a better idea of how Intel’s FPGA acquisition is panning out, and of the implications for the AI chip market.

3 Likes