NVDA: AI most important technology in history

https://www.bloomberg.com/news/articles/2018-01-25/artificia…

AI is changing the world in ways we cannot imagine. Technology advancements are happening at an increasing and accelerating pace. The brains of computers will surpass those of humans. Software can now write software. AI will hasten the pace of discovery and innovation. The outcome may not be predictable. NVDA controls the building blocks of this technology. I cannot say whether there is a limit to NVDA’s value, even at a $150B market cap. Normally I would not invest in a company with a market cap this large, yet I cannot bring myself to sell any NVDA shares. I suspect that NVDA uses its own technology to advance its business. If it’s not already the “smartest” company in the world, I’m convinced it will be. Intelligence makes a competitor stronger and better able to dominate.

Chris

13 Likes

"AI is changing the world in ways we cannot imagine. Technology advancements are happening at an increasing and accelerating pace. The brains of computers will surpass those of humans. Software can now write software. AI will hasten the pace of discovery and innovation. The outcome may not be predictable. NVDA controls the building blocks of this technology. I cannot say whether there is a limit to NVDA’s value, even at a $150B market cap. Normally I would not invest in a company with a market cap this large, yet I cannot bring myself to sell any NVDA shares. I suspect that NVDA uses its own technology to advance its business. If it’s not already the “smartest” company in the world, I’m convinced it will be. Intelligence makes a competitor stronger and better able to dominate."

Consider this my devil’s advocate case.

“Technology advancements are happening at an increasing and accelerating pace”

I dispute this. Moore’s law is failing. We just lost roughly 10-20% of compute capacity around the world to the microcode and OS mitigations for speculative-execution flaws in most x86/x64 implementations (Spectre and Meltdown). AI research started in the 1950s (see https://en.wikipedia.org/wiki/AI_winter) and we are just rehashing a lot of old ideas on bigger and faster hardware.

“Software can now write software”

That is a direct consequence of the Von Neumann architecture. Registers hold code and data with no distinction; storage stores code and data with no distinction. Self-modifying code was an optimization dating back to the earliest stored-program machines. Machine learning systems (to the extent they are different from weighted gradient-descent optimization exercises) have been self-modifying for decades.
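A minimal sketch of that code-is-data point (in Python, purely for illustration; nothing in the thread depends on it): a program builds the source text of a new function as an ordinary string, then compiles and executes that string as code.

```python
# "Software writing software" in miniature: the source of a new
# function is just data (a string) until we compile and exec it.

def make_power_function(n):
    # Generate source text for a brand-new function...
    src = f"def power(x):\n    return x ** {n}\n"
    namespace = {}
    # ...then turn that data into executable code.
    exec(compile(src, "<generated>", "exec"), namespace)
    return namespace["power"]

cube = make_power_function(3)
print(cube(4))  # 64
```

The same blurring of code and data is what makes everything from JIT compilers to macro systems possible; machine learning just moves the "generated part" from source text into numeric weights.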

“NVDA controls the building blocks of this technology”

They don’t have the best AI software, and they might not even have the best hardware. Maybe they have the best general purpose ANN software and hardware you can buy commercially?

Cloud providers will work directly with fabs like TSMC to develop proprietary processors that outperform Nvidia’s in their particular, specialized cases; INTC will catch up; AMD will deliver an 80/20 solution and undercut Nvidia’s pricing power; the self-driving-car market will not develop as fast as expected; and the Asian partners will pull an AMSC on them.

18 Likes

Thanks for your opinions, ajm. Time will tell which version of the future plays out…

3 Likes

Funny, but the Asian companies have never AMSCed Intel or AMD CPUs. Given that Nvidia GPUs cost billions to develop, remain a moving target as Nvidia keeps improving its technology, and are supported by CUDA software that, like x86, continues to evolve while staying backward compatible, the AMSC analogy seems unlikely. AMSC’s power-transfer technology was far simpler: the Chinese utility basically copied the circuitry and cloned it.

Try to clone the latest Nvidia GPU. If it were so easy, AMD’s latest GPU would not be both slower and more power-hungry than Nvidia’s technology from last year.

Further, Nvidia has been the world leader in GPU technology for gaming for two decades now. During that period their GPUs were much simpler than they are now, and there was lots of money to be made by cloning them and undercutting Nvidia in Asia, where the fastest-growing market for gamers is. In fact, gaming has become a big-time sport in certain areas.

If China could not clone Nvidia GPUs to undercut them in legacy gaming technology that was far simpler, good luck in doing so for data centers.

This said, there is no doubt that ASICs can be built for specific applications in AI and ML. But for all the money Nvidia is making in this area, no one has done so yet. I will start to consider this risk when someone finally does.

As for the Tensor Core, it is not a stand-alone technology (at least not yet); Nvidia has incorporated it into its latest SoC.

Tinker

11 Likes

The lead-in sentence in the Bloomberg article sums it up extremely well: “Artificial intelligence approached the summit of hype at this year’s World Economic Forum” [Davos]. It’s Arthur C. Clarke’s third law at work: “Any sufficiently advanced technology is indistinguishable from magic.”

Technological growth at Moore’s Law speed has been at work since long before the invention of the transistor, according to Ray Kurzweil’s The Age of Spiritual Machines; it was already happening with mechanical computing devices. There is no reason not to expect it to continue through other mechanisms we have yet to invent or even dream of, “future magic.”

“Software can now write software”

That is a direct consequence of the Von Neumann architecture. Registers hold code and data with no distinction; storage stores code and data with no distinction. Self-modifying code was an optimization dating back to the earliest stored-program machines. Machine learning systems (to the extent they are different from weighted gradient-descent optimization exercises) have been self-modifying for decades.

One of my biggest “aha!” moments in programming was the day I understood how the machine could tell the difference between data and code, since they look exactly alike. Early on we used “self-modifying code,” but it is so difficult to debug that, with the advent of faster and cheaper hardware, the practice was abandoned: “wasting” hardware cycles and memory was cheaper. The practice had nothing to do with AI; it was the product of the programmer’s intelligence.

NVDA is not magic, it just happens to be, currently, a leading edge technology and a darn good investment.

Denny Schlesinger
No position in NVDA, I use high tech ETFs.

The Age of Spiritual Machines: When Computers Exceed Human Intelligence by Ray Kurzweil

https://www.amazon.com/Age-Spiritual-Machines-Computers-Inte…

2 Likes

ASICs like Google’s TPU will not replace something like a Tesla V100 Volta. The Volta, monster that it is, is a deep-learning neural-network trainer, and with CUDA anybody can use it to train their network. Its value to the big-data players is that they can rent it out for exactly that, or use it to train their own networks. An ASIC by definition couldn’t do this, because it is designed to run one specific application, like Google Translate for instance. It can be very good and efficient at that, but it would be literally impossible to plug a different system into it and have it run that application.
FPGAs are similar but also different. Again, they are efficient at what they are programmed to do but not much else. The advantage of an FPGA is that it can be reprogrammed relatively quickly to do another task. Microsoft uses FPGAs to run Bing, for example, and Amazon runs some things on FPGAs as well. All of them use Nvidia GPUs to do the training, though. Nothing available today, and nothing likely to come out soon, comes close to a Volta chip for the beastly computing power it can dish out. It is the chip that is bringing AI to an inflection point: the breakthrough that will propel (and is propelling) breakthroughs across the whole field, expanding the AI market by making possible things that weren’t possible or practical before.

8 Likes

“Software can now write software”

ajm, I’m glad you “pounced” on some of these points since I found them all too glib. I would add that various sorts of program generators and model-based, “codeless” development tools have been around for quite a while. I myself wrote my first one in 1991 and I wrote a 4GL in 1980 … and I am far from a pioneer. Someone still has to create specifications in some form and ever since the days of putting in programs with switches and wires there has been a degree of abstraction between the form in which the human does the specification and the code which actually executes.

1 Like

In the 90s I was on a team reviewing some software that would emulate a human inputting data to an online application. The value of this s/w was in the realm of testing which is (or was) pretty labor intensive.

In speaking with their head coder, I learned he had gained his chops working on embedded code for military hardware. Specifically, he wrote the code for the guidance system of the first-generation cruise missiles developed in the 80s. In those days everything was discrete chips; there were no ASICs, SoCs, etc. Due more to weight considerations than anything else, the amount of memory available was extremely constrained. I don’t remember exactly, but 8K sticks in my brain.

A missile guidance system must react in real time to sensor data. Wind, altitude, and barometric pressure can all influence the flight path and targeting, and there’s always the threat of unknown obstacles in the flight path as these missiles hug the terrain. The code he wrote had to keep track of the contents of every slot in the internal stack, as it was necessary to rewrite the stack as the flight of the missile progressed.

In other words, he wrote code that rewrote itself during execution. I don’t know if this was the earliest example of code that modified itself based on real time inputs, but it was certainly the first example I had heard of. So code that writes code goes back at least to the 80s.

It’s kind of interesting to note that the requirement for self modifying code was the product of constrained computing power as opposed to the availability of enormous computing power in a lightweight package.
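A loose sketch of the idea (in Python, which only rebinds a name rather than patching machine instructions in place, and obviously nothing like real guidance firmware): a handler that, once it observes a condition at runtime, replaces its own implementation with a specialized fast path.

```python
# Sketch of self-modifying code: a handler that overwrites itself
# with a cheaper specialized version once a runtime condition is met.
# (Historical self-modifying code rewrote instructions in memory;
# rebinding a module-level name is the closest safe analogy here.)

def handler(reading):
    global handler
    if reading > 100:
        # Condition observed: swap in a fast path so that later
        # calls skip the check entirely.
        def fast_handler(reading):
            return reading - 100
        handler = fast_handler
        return reading - 100
    return reading

print(handler(50))   # original path
print(handler(150))  # triggers the self-replacement
print(handler(150))  # now running the rewritten fast path
```

As the post notes, the motivation for the real thing was scarce memory, not abundance: rewriting code in place let one 8K image serve several phases of flight.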

8 Likes

I don’t know if this was the earliest example of code that modified itself based on real time inputs,

Why would code modify itself except in response to changing conditions? Whether the input is “real time” or human makes little or no difference.

Due more to weight considerations than anything else, the amount of memory available was extremely constrained, I don’t remember exactly, but 8K sticks in my brain.

Right, you only write code that modifies itself (complex) if you have no simpler alternative. My second “mainframe” had all of 4K memory. You really had to optimize and compress code. :frowning:

Denny Schlesinger

If AI is the most important technology in history (I believe it is), then we should all own the leaders in the AI space. In a very rough order (although I’m not super familiar with the Chinese companies), those leaders (IMHO) would be:

  1. Google
  2. Baidu
  3. Microsoft
  4. Amazon
  5. Facebook
  6. Alibaba, IBM…

Maybe some others. I don’t really see NVDA as an AI play, although the AI side of their business is obviously a large kicker for them. The power of AI is in the software rather than the hardware, I believe. So NVDA would benefit, perhaps hugely, from the AI push as it stands, but the leaders in the software space will benefit the most.

You really only need NVDA power to train the models. Once trained, actually using them requires very little power. I believe that AlphaGo, for example, would run on a mobile phone now, although it was trained on a large number of CPUs.
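That training-versus-inference asymmetry shows up even in a toy model (a NumPy sketch with made-up numbers, nothing to do with AlphaGo itself): fitting the weights takes thousands of gradient-descent steps over the whole dataset, while using the fitted model is a single dot product.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))            # training inputs
true_w = rng.normal(size=20)               # weights we hope to recover
y = X @ true_w + 0.01 * rng.normal(size=1000)

# Training: thousands of full-batch gradient-descent passes.
w = np.zeros(20)
for _ in range(2000):
    grad = X.T @ (X @ w - y) / len(y)      # gradient of squared error
    w -= 0.1 * grad

# Inference: one dot product per prediction.
x_new = rng.normal(size=20)
prediction = x_new @ w
```

The ratio only grows with model size, which is why heavyweight GPUs are bought for the training farm while the trained network can be shipped to far weaker devices.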

For example, what if Google started patenting discoveries their AI makes, in vastly different realms? New materials technology? Better aircraft designs? more efficient steel smelters, more effective drugs… anything that human intelligence has given us, AI has the potential to improve, perhaps by orders of magnitude.

So the big winners I believe will be those big companies as they apply increased intelligence to … well… everything. Any promising startups in the space will be bought by the companies above.

After that rant, I probably need to allocate some cash to all those companies!

cheers
Greg

15 Likes

If AI is the most important technology in history (I believe it is), then we should all own the leaders in the AI space. In a very rough order (although I’m not super familiar with the Chinese companies), those leaders (IMHO) would be:
1. Google
2. Baidu
3. Microsoft
4. Amazon
5. Facebook
6. Alibaba, IBM

I thought Wayne Gretzky said to skate to where the puck is going, not to where it is now. Time after time Wall Street tries to sell us the latest, greatest thing just before it crumbles into dust with our money, and something new takes its place promising even more.

b&w

4 Likes

1. Google
2. Baidu
3. Microsoft
4. Amazon
5. Facebook
6. Alibaba, IBM…

NVDA supplies them ALL!!

NVDA is the arms supplier for AI.

NVDA invested 8,000 engineering-years into its Xavier chip for autonomous vehicles. How’s that for advanced hardware! They now have 320 partners in the automotive sector, up from 225 just 8 months ago!!! That’s just one industry that will use AI. Every industry will use it.

Chris

14 Likes

Chris, you did not mention that Nvidia has also invested more time and money into software than any other company out there, not to mention all the aspects of creating the whole product.

Tinker

1 Like

Chris, you did not mention that Nvidia has also invested more time and money into software than any other company out there, not to mention all the aspects of creating the whole product.

Tinker, I actually considered adding that but figured it wasn’t necessary to make the point. As I’ve said many times before, I’m in this $150B company because I cannot imagine the upper bound of its market cap, and I would not be terribly surprised if NVDA eventually becomes the world’s most valuable company. What company can compete without electricity? AI will be equally important to a company’s ability to compete. NVDA has primed the pump with Volta; companies can build it, buy it, or rent it in the cloud, so any company can now use it, and I think most will. In the cloud-rental model, NVDA is the source while AMZN, MSFT, and all the other cloud providers are middlemen.

Chris

4 Likes

I would not be terribly surprised if NVDA eventually becomes the world’s most valuable company. What company can compete without electricity? AI will be equally important to a company’s ability to compete.

I literally just got done writing two posts about how I woefully underestimate the potential of stocks to swing, but the most valuable company in the world??? I’m sorry, I don’t see it. They will have all of 8 or 9 billion dollars of revenue in 2017. I’m not saying they won’t continue to do well, and heck maybe you’re right, but I mean, they aren’t the only horse to bet on!

Bear

1 Like

I agree with Bear. It is too early to make such a prediction. They do seem to have become the de facto standard in the data center, and certainly in gaming graphics. In those two settings, yes, things look good. But the pecking order of AI on the edge has yet to be written: things like autonomous driving, drones, and IoT.

Further complicating matters is that, unlike in past times, today we have 4 or 5 companies (mostly 4, as IBM really doesn’t seem to count for much in standard setting) that are actually large enough, focused enough, and talented enough to create their own fundamental disruptions in AI technology.

This said, 10 years from now, if all goes well… maybe not Apple in value, but certainly Intel at its peak is not outside the bounds of possibility, adjusted upward for inflation of course.

The same claim was made about QCOM in 2000 by an analyst who said QCOM would be the first trillion-dollar company. What the analyst got wrong was that there were limits to the 3G market and that QCOM would not dominate 4G in any special manner. 5G is coming up, and as posted on NPI, there appears to be a policy slide show advocating that Trump federalize the roll-out of 5G so as to deploy it nationwide within 3 years.

Is the document real? Dunno. But even the rumor of it may accelerate the roll out by the incumbents to avoid any such thing.

The faster 5G rolls out, the better it is for Nvidia, as 5G enables the IoT economy on the edge, as well as autonomous driving.

The faster these things mature, the harder it will be for some alternative solutions to compete effectively with Nvidia, once Nvidia becomes a market standard, because the market needs a standard.

Interesting times to come.

Tinker

5 Likes

… 5G enables the IoT economy on the edge, as well as autonomous driving.

Let’s not get carried away with the hype.
I don’t think 5G “enables” autonomous driving.

Fully autonomous cars have to work in all cases, even if the cellular network has a poor connection, is overloaded, or has failed for whatever reason. Otherwise all traffic comes to a standstill in the worst case of gridlock ever seen. Cars need to be safe and still functional without a connection; that’s my opinion as well as what I’ve heard from numerous industry speakers at conferences.

Imagine a bad actor taking out a cell tower (physically or electronic jamming) and the subsequent chain of events due to the gridlock. I think the idea of “autonomous” cars is that they need to be operable in an autonomous manner, even if they might check in for software or data updates every so often.

Mike

9 Likes