Long Live Nvidia!

Cryptocurrency demand may be fading (so it is said, and likely so), but the next jump in graphics technology is on its way: ray tracing.


Ray tracing produces photorealistic graphics by actually tracing the paths of light as it illuminates an object, and heretofore it has been impossible in real time due to lack of horsepower. Not anymore. Here come Nvidia Volta and Microsoft DirectX 12.
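For the curious, here is a toy sketch of the core idea the new hardware accelerates: fire a ray from the camera through each pixel and test it against the scene geometry. This is a deliberately minimal illustration (one sphere, no lighting or bounces; all scene values are made up for the example), not how a production renderer is written.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along a normalized ray to a sphere, or None on a miss."""
    # Solve |origin + t*direction - center|^2 = radius^2 for t (a quadratic in t).
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - 4.0 * c  # direction is unit length, so the quadratic's a = 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

# Fire one ray per pixel of a tiny 8x4 "screen" at a sphere 5 units down the z-axis.
rows = []
for j in range(4):
    row = ""
    for i in range(8):
        # Map the pixel to a point on an image plane at z = -1, then normalize.
        x, y = (i - 3.5) / 4.0, (1.5 - j) / 4.0
        length = math.sqrt(x * x + y * y + 1.0)
        d = (x / length, y / length, -1.0 / length)
        row += "#" if ray_sphere_hit((0, 0, 0), d, (0, 0, -5), 2.0) else "."
    rows.append(row)
print("\n".join(rows))
```

Real-time ray tracing does this for millions of rays per frame, plus shadow, reflection, and bounce rays per hit, which is why it needed a generational jump in hardware.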

So for those thinking gaming graphics is a dud moving forward (I don’t know who is saying that, but this should disprove the notion if anyone is), ray tracing will create an entirely new upgrade cycle for all high-end gamers over the next few years, IMO.

The graphics rendered are so great that games will be designed to use it, and computers will be designed to be able to play them. Nvidia is at present the only company with a chip to enable it.

Well, so much for the death of Nvidia with cryptocurrency concerns.

Still perplexed that AMD has not totally crashed today, but Nvidia is down less than 4% anyway.

Long live Nvidia! No need to chant, as Nvidia seems quite able to take care of this themselves.



I got out of NVDA too early, but my reasons for investing were never tied to Crypto.

-DC (data center): NVIDIA GPUs via cloud instances in Azure/AWS, and on-prem hardware (attached to industry-standard servers and/or DGX-1 solutions)
-Autonomous cars: still ahead of the money materializing, but they still appear to be a major player in terms of positioning
-AI: tied to the DC comment a bit, but basically means the use cases are everywhere - health, biotech, ML/DL
-Gaming/esports: growing like crazy. The next trend seems to be streaming video gaming, which NVIDIA was talking about and planning to launch a couple of years ago, leveraging NVIDIA Shield/TV.

Plus, AMD can’t get out of their own way, and Intel is playing catch-up. I think the biggest competitive threat is the cloud titans coming up with a competitive play via their own cloud services, or some start-up we haven’t heard of that figures out some new tech, or quantum computing, or something that leapfrogs what the GPU can do.

Great company!


I think the biggest competitive threat is the cloud titans coming up with a competitive play via their own cloud services

Perusing an article about 5G last night, and thinking about how much Tinker has harped on latency, prompted me to think back to NVDA announcing their GeForce Now offering, which lets people essentially play computer games with the power of a GTX 1080 (or whatever the top-end graphics card is) from whatever computer they have available. The reduction in latency that 5G will bring may change the computing paradigm considerably, such that much computing could be done in the cloud.

It seems to me that NVDA is pretty well-positioned to play a pretty huge role both with cloud-computing (see: last 4 to 6 quarters’ revenue jumps for their data center segment) and edge computing (see: autonomous vehicles and drones as 2 edge applications). Their advantages in speed and lower power consumption are pretty decent differentiators. Specialized ASICs or FPGAs would seem to be the biggest threats for specific applications, but if NVDA can maintain their present lead with GPUs, they should still have some decent runway ahead.



The article above is along your line of thinking, and the cloud titan comment I made.
The question, not detailed enough in this short article to discern, is: will NVIDIA GPUs in the Azure cloud be the foundation of this Microsoft gaming service (so both MSFT and NVDA win if it takes off), or will it be supported by non-NVIDIA GPUs or something from Intel, etc.?

The reduction of latency that will be present with 5G may change the computing paradigm considerably, such that much computing could be done in the cloud. Yes, this could be very big. What counts is the user’s perception of lag, and sensitivity to lag is higher in games than in other uses. Games can be designed around that, and it is only critical in action games.
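A back-of-envelope budget shows why network round trip dominates the lag discussion. All of the figures below are illustrative assumptions, not measurements; real numbers vary by network, codec, and hardware.

```python
# Back-of-envelope cloud-gaming "click-to-photon" latency budget.
# All figures are illustrative assumptions, not measured values. Total lag is
# roughly: network round trip + server render + encode/decode + display scan-out.
def click_to_photon_ms(rtt_ms, render_ms=16.7, encode_decode_ms=10.0, display_ms=8.0):
    return rtt_ms + render_ms + encode_decode_ms + display_ms

# Assumed round-trip times: ~50 ms for 4G, ~10 ms for a 5G target.
for label, rtt in [("4G (~50 ms RTT)", 50.0), ("5G target (~10 ms RTT)", 10.0)]:
    print(f"{label}: ~{click_to_photon_ms(rtt):.0f} ms total")
```

Under these assumptions, cutting the round trip from ~50 ms to ~10 ms takes the total from roughly 85 ms down to roughly 45 ms, which is the difference between noticeable lag and something most action gamers could live with.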

I used to play with various flight simulators, and finally gave up because of the constant downloaded “fixes” and demands for ever more expensive GPUs, most of which are not available on Apple machines.

I took advantage of recent price drops to buy a little more NVDA today.
I see nothing in today’s news other than the price drops themselves that relate to any stock I own.


Relating to this ray-tracing that NVDA is just releasing, here is a link to NVDA’s own blog about the announcement:

NVDA’s RTX webpage:


There are three legitimate concerns about NVDA: (1) valuation (but any great, dominant, rapidly growing company is said to have valuation issues); (2) substitution for the GPU in AI and machine learning (Google’s tensor processor, for example, although that is a solution that still cannot equal NVDA’s latest GPUs in price/performance, as was demonstrated by the pricing scheme Google uses and by Google’s trickery to make the tensor chip look faster than the Nvidia GPU; no, I am not going to rehash that, as I have already had that discussion in detail, first on the NPI board and then a few weeks later on this board); and (3) gaming. Just how long and how fast can gaming grow?

Well, assuming this stuff NVDA is speaking about can be made into the next-generation graphics platform for gaming, it will eventually call for the entire installed base of mainstream GPUs to be upgraded. Not all at once; at first only the top end of gamers, but like every other graphics technology it will roll down to the mainstream and become a minimum requirement everywhere but the lowest-end computers such as Chromebooks, or perhaps MacBooks (hey, Apple is not the best at maintaining the best graphics standards, although they maximize what they’ve got).

But this ray-tracing stuff is such a large jump up as to be (yes, I think it literally is) a discontinuous innovation.

Discontinuous innovations that go mainstream are something to behold. They replace the infrastructure that preceded them because the price/performance value they create simply cannot be met by the existing infrastructure. Telegraph to telephone, dial-up to DSL, etc.

So, so much for worrying about #3; most of us had little problem with #1 as long as #3 did not crash; and with #2 we were on more solid ground, I think.

Between Mongo and NVDA… well, I might do more stuff. Again, though, with Mongo it is too early to tell for sure, but I am too old (and still too young) to wait to get excited about stuff anymore, yet young enough to make up for any stupid mistakes (which of course will never happen again… more famous last words).

Yeah, this is undeniably exciting stuff for NVDA, and as profound as NVDA in AI, in my opinion. This is big stuff mid- to long-term for NVDA. I don’t know when this stuff will start to roll out, but certainly within the next year or two, one would think, at least at the very high end. Perhaps it rolls out on servers in the streaming service, for those with the bandwidth. Is 100 Mb/sec enough? I actually have that now with Comcast, much of the time. It probably requires higher bandwidth on a consistent basis, like 5G.
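A rough arithmetic check suggests 100 Mb/sec is plenty for today’s game streaming. The compression ratios below are illustrative assumptions (real encoder bitrates vary by codec, content, and quality target), but the order of magnitude holds.

```python
# Rough check on whether ~100 Mb/s is enough to stream a game. The 100:1 and
# 300:1 compression ratios are illustrative assumptions for modern video codecs.
width, height, bits_per_pixel, fps = 1920, 1080, 24, 60
raw_mbps = width * height * bits_per_pixel * fps / 1e6
print(f"Uncompressed 1080p60 video: ~{raw_mbps:,.0f} Mb/s")

for compression in (100, 300):
    print(f"At {compression}:1 compression: ~{raw_mbps / compression:.0f} Mb/s")
```

Uncompressed 1080p60 is nearly 3 Gb/s, but after typical video compression the stream lands in the tens of megabits per second, well under a 100 Mb/sec connection. Consistency (and latency), not raw bandwidth, is the harder problem.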



Discontinuous innovations that go mainstream are something to behold.

Although, here one must remember that not everyone has a need for advanced graphics even close to this. Gaming, CAD, and a few other applications, sure, but not Word and Excel.


We’ve seen this same assertion about advanced technologies over and over. It’s being discussed here with respect to NVDA. It’s also applicable to ANET, PSTG, MDB, AYX, and others. But at some point you have to stop and ask yourself: when is a technology good enough?

What I mean by that is that there is a limit to the improvements that the majority of people in the available market are willing to pay for. Isn’t there a point where the average gamer says, “It’s good enough. I’m not going to plunk down a few hundred more dollars for an imperceptible improvement in performance, graphics quality, etc.”?

CDs are sampled at 44.1 kHz, with 16 bits per sample. Those numbers were not arbitrary. Human hearing spans approximately 20 Hz - 20 kHz. Sampling at a bit more than double the highest audible frequency allows for flawless reproduction of the full audio spectrum. 16-bit samples allow for about 96 dB of dynamic range, stretching toward a perceived 120 dB with dithering. That covers loudness reproduction from the threshold of hearing to the threshold of pain (unless you’ve toasted some of your hearing by standing in front of speaker cabinets driven by outsized amplifiers while playing in a R&R band - that would be yours truly).
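Those CD numbers are easy to sanity-check: the Nyquist criterion says the sample rate must exceed twice the highest frequency you want to capture, and each bit of sample depth adds about 6.02 dB (20·log10(2)) of dynamic range.

```python
import math

# Sanity-check the CD-audio figures. Nyquist: sample at more than twice the
# highest frequency of interest.
highest_audible_hz = 20_000
nyquist_rate = 2 * highest_audible_hz
print(f"Minimum sample rate for 20 kHz audio: {nyquist_rate} Hz (CD uses 44,100 Hz)")

# Dynamic range of an n-bit sample: 20 * log10(2^n), about 6.02 dB per bit.
bits = 16
dynamic_range_db = 20 * math.log10(2 ** bits)
print(f"16-bit dynamic range: ~{dynamic_range_db:.1f} dB")
```

This gives ~96 dB for 16 bits; the ~120 dB figure people quote refers to the perceived range achievable with noise-shaped dithering, which pushes the quantization noise into less audible frequencies.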

The audio of a DVD contains 24-bit samples at up to 192 kHz. The argument is that even though this allows for the reproduction of audio far in excess of the range of human hearing, the “headroom” (audio talk for the wasted range) provides for higher-quality audio production - but few, if any, can actually hear the quality improvements. The fact of the matter is that most of us are happy with an MP3 (considerably lower audio quality than a CD) or even a still lower-quality audio stream from Spotify, Pandora, or similar.

I think Nvidia is aware that their cash cow of gaming graphics is near its high end limit. Fortunately, they have several alternative avenues for future growth. But at some point it becomes pointless to improve the rendering of consumer level computer graphics. I think we are approaching the upper end of the market.


<<<Isn’t there a point where the average gamer says, “It’s good enough. I’m not going to plunk a few hundred more dollars for an imperceptible improvement in performance, graphic quality, etc.”>>>

There sure is, but we certainly have not reached it for graphics. Gaming, virtual reality, large screens, 5G coming: this new graphics standard is so good that it blows the existing high-end graphics away.

Perhaps after this level of graphics matures, then yes, yes indeed, it will be good enough. That will be multiple generations away.

Further, such graphics will need to become more and more efficient so it can move down to smaller devices over time.

So yes, eventually it gets good enough; but no, that issue is not plaguing Nvidia at this point in time.

Its tech is nowhere near good enough in graphics yet, and nowhere near good enough in AI yet. Both have many upgrade generations to go, minimum. Unless something disruptive comes to market and starts to replace GPUs, and Nvidia has neither invented nor bought it.

