Morgan Stanley disappointed with NVIDIA RTX

“We are surprised that the 2080 is only slightly better than the 1080ti, which has been available for over a year and is slightly less expensive,” he said. “With higher clock speeds, higher core count, and 40% higher memory bandwidth, we had expected a bigger boost.”

As a result he expects the adoption of Nvidia’s new products to be “slower” and doesn’t expect “much upside” from the company’s gaming business in its next two financial quarters.

https://www.cnbc.com/2018/09/20/nvidia-falls-after-morgan-st…

I suppose the performance gap here is somewhat smaller than it was between Maxwell and Pascal.
The RTX 2080 outperforms the GTX 1080 by a margin of approximately 30%.
The GTX 1080 outperformed the GTX 980 by a margin of approximately 40%.
This is all very approximate, and it doesn’t take into account the “DLSS” functionality, which adds enormous uncertainty to any current comparison.
Once we see DLSS in widespread practical application, the calculation may shift substantially in favor of the new Turing chips.

3 Likes

I think this reaction to the Morgan Stanley disappointment could actually present a compelling entry point into NVDA (even though Saul sold his try-out position yesterday).

I had a friend mention the test results yesterday, which prompted me to post this thread over on the NPI board, where I concluded that NVDA management knows what they’re doing and knows their customer.

I don’t expect the market for the highest-end graphics chips to demand that price scale linearly with performance, particularly when the measurement is on a scale that doesn’t even account for the fact that the old chips cannot do ray tracing.

Apologies for this tiny bit of options talk, but March 2019 expirations should provide time for a full quarter of actual sales of the new RTX 20-series chips to show up in an earnings report…the quarter that includes Christmas, by the way.

-volfan84
long NVDA

2 Likes

Hasn’t that been Apple’s M.O. for years now? Making incremental improvements in their phones while still charging a premium?

How’s that strategy working out for them these past few years? Pretty good, I’d say.

Dominic
Long NVDA

Hasn’t that been Apple’s M.O. for years now? Making incremental improvements in their phones while still charging a premium?

Nvidia doesn’t have the brand position that Apple has. No one does.
If gamers think that AMD is offering higher-performing chips and/or better value, they will switch in a heartbeat.
Nvidia has actually made itself somewhat unpopular among gamers through its insistence on the proprietary G-Sync standard (which requires paying hundreds of dollars more for a compatible monitor) and its refusal to support AMD’s open FreeSync standard.

However, there doesn’t appear to be any hope that AMD will offer anything truly competitive with the 2080, especially given the much-hyped ray tracing, which AMD lacks. Between that, DLSS, and the end of the crypto boom, I suspect AMD is in for a hard time.

5 Likes

There is a big problem with rating the RTX cards without giving due credit to their ray-tracing capability.

Right now you can hook up an RTX card in a gaming rig and not get ray tracing! Why? Because Microsoft has not finished rolling out the DirectX API for it, called DXR (DirectX Raytracing)! That is due in October. So right now the RTX is simply waiting on DXR before it’s go time! DXR has been available to developers since March so that games like BF5 could be ready to incorporate it this fall. Blame Microsoft for not having DXR ready when RTX was made available.

I mean sheesh! Who ‘analyzed’ this GPU? Some adult still using a flip-phone?

Let me be clear: no adequate ray-tracing benchmarks can be done until DXR is available. And when that happens, refer to Ars Technica and Gamers Nexus for the correct review.

So, IMO, right now is a perfect time to get into NVDA, since some analyst is disappointed with the performance and didn’t bother to find out why.

When the RTX benchmarks come out with ray tracing equipped, be ready for some crow eating…just sayin’.

¬Scott
gamer handle: AzzKikr

26 Likes

Eurogamer has what I believe is the only benchmark review of the DLSS technology, via some demos. Still no ray-tracing benchmarks yet.

DLSS is the reason for the Tensor cores; it uses neural networks to do some graphical voodoo, you know, to be technical. Nvidia has this sweet Saturn V supercomputer built from hundreds of DGX-1 AI systems with thousands of Volta V100s. I did a post a few months back called “how Nvidia does it” talking about it. Now they are putting that into action.

A game software company can partner with Nvidia, and Nvidia does basically all the work. Nvidia takes the game, generates millions of images from it on Saturn V, and trains a deep-learning network on those images so it learns to take frames rendered at a lower resolution and reconstruct them at near (or better than) native quality at a higher resolution, say 4K. This new neural network, unique to that game, can then be containerized and provided as a game update or patch.

A game running at lower settings can generate frames much faster (more frames per second) than one running at higher settings. The result is that DLSS produces equivalent quality at a higher frame rate than could be produced rendering straight to 4K: the time to generate a lower-quality frame and produce an AI-enhanced equivalent at the higher resolution is much less than the time to render that frame straight to the higher resolution.
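To make the idea concrete, here’s a toy PyTorch sketch of the inference side: render the frame small, then let a trained network blow it up. To be clear, this is my own illustration of the general super-resolution idea, not Nvidia’s actual DLSS model; the network name and architecture are invented for the example.

```python
# Conceptual sketch of DLSS-style upscaling (NOT Nvidia's actual model).
# Idea: render at a lower resolution, then have a per-game trained
# network reconstruct a higher-resolution frame.
import torch
import torch.nn as nn

class ToyUpscaler(nn.Module):
    """Stand-in for a per-game super-resolution network."""
    def __init__(self, scale: int = 2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
            # Emit scale^2 sub-pixels per output channel, then rearrange
            # them into a frame `scale` times larger in each dimension.
            nn.Conv2d(64, 3 * scale * scale, 3, padding=1),
            nn.PixelShuffle(scale),
        )

    def forward(self, low_res_frame: torch.Tensor) -> torch.Tensor:
        return self.net(low_res_frame)

model = ToyUpscaler(scale=2).eval()
low_res = torch.rand(1, 3, 720, 1280)  # stand-in for a rendered 720p frame
with torch.no_grad():
    high_res = model(low_res)          # shape: (1, 3, 1440, 2560)
print(high_res.shape)
```

The whole bet is that (low-res render time + network inference time) comes in well under the native high-res render time, which is exactly the trade the Tensor cores are built to win.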

This is why, with a Turing RTX card and a DLSS-patched game, the performance of the RTX GPUs really shines. On the DLSS-capable demos, here’s what they found.

https://www.eurogamer.net/articles/digitalfoundry-2018-9-19-…

From the Final Fantasy 15 Demo:

RTX 2080 performance with standard TAA (native higher resolution) reveals that the card enjoys a straight 30 per cent lead over GTX 1080, and it’s basically on par with the GTX 1080 Ti - a state of affairs that’s fairly common in the standard benchmarks to come. DLSS grants the RTX 2080 a further 39.5 per cent of raw performance, which clearly takes it well beyond the capabilities of even the most powerful Pascal cards. With DLSS active, the RTX 2080 offers an 81 per cent boost over GTX 1080. And once again we see that the RTX 2080 with DLSS enabled outperforms the RTX 2080 Ti running on standard TAA. With RTX 2080 results this good, the impact on RTX 2080 Ti is even more profound. Again, with DLSS active, it’s capable of delivering 80 per cent more performance than the GTX 1080 Ti, which does not have access to this technology.
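As a quick sanity check (my arithmetic, not the article’s), those percentages compound multiplicatively, and they do line up:

```python
# The quoted gains compound: a 30% TAA lead over the GTX 1080,
# times a further 39.5% from enabling DLSS.
taa_lead = 1.30      # RTX 2080 vs GTX 1080, standard TAA
dlss_gain = 1.395    # additional uplift from DLSS
total = taa_lead * dlss_gain
print(f"Combined uplift over GTX 1080: {total - 1:.0%}")  # -> 81%
```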

It’s important to note that these are only demos, not full gameplay. However, 28 games are already slated to receive patches enabling DLSS. These will be coming sooner rather than later, and there will be more DLSS benchmarking. The first glance here looks really great. And since it’s all AI and deep learning, it will probably improve fairly quickly; one of the key things about AI is that it learns. Ray tracing will probably also be pretty stunning, but DLSS will be the first new technology to show whether the Turing architecture meets high expectations.

In this review they look more closely at DLSS image quality, and basically conclude that it’s as good as, if not better than, rendering straight to the higher resolution, at much better performance.

https://www.eurogamer.net/articles/digitalfoundry-2018-dlss-…

It will be interesting to see if the Tegra line gets a Turing-type treatment, ramping up the resolution capabilities of consoles like the Switch. Could other consoles want in on that?

Darth

5 Likes

Darth,

Does this mean, given the software component, that an AMD GPU is unable to even play these games?

Tinker

No. An AMD card could play the games as long as it meets all of a game’s minimum requirements, and the newer AMD cards can play all games as far as I know. A DLSS update or a patch to the game would not affect the ability to play the game without DLSS; that would probably be an on/off setting in the game.

The AMD cards on the market can’t handle 4K and don’t perform as well at other high-resolution settings as the higher-end 10-series, and of course the 20-series. I doubt there will be many comparisons with AMD when the DLSS benchmarks arrive; you’ll only see the 1080 and 1080 Ti, maybe the 1070 Ti. For now, at this level, Nvidia is only competing with Nvidia.

5 Likes

The AMD cards on the market can’t handle 4K and don’t perform as well at other high-resolution settings as the higher-end 10-series, and of course the 20-series

The newest top-line AMD Vega 64 is about on par with the GTX 1080 (and a bit cheaper), but it is far behind the RTX 2080, and of course it has no ray tracing or DLSS.

https://www.eurogamer.net/articles/digitalfoundry-2018-09-20…

I have no idea how important ray tracing is actually going to be.
I know that several years ago I selected Nvidia over AMD because of the PhysX engine, which added nice effects to StarCraft, essentially the only game I was playing at the time.

Apologies for posting this in 2 different threads, but it is directly relevant to the Morgan Stanley “disappointment”.

Per this Mark Hibben article, MS was “wrong, wrong, wrong”.

https://seekingalpha.com/article/4208942-c-j-muse-gets-nvidi…

-volfan84
long NVDA

Per this Mark Hibben article, MS was “wrong, wrong, wrong”.

I looked some more at DLSS, and while nobody has anything definite, it certainly seems as if it could offer a considerable frame-rate boost over the previous generation, at (for most games) a slight decline in image quality. Or to put it differently, for games that challenge whatever GPU you have, it could yield a substantial improvement in image quality at the same FPS.
In practical terms, it might matter most in the mid-range and value segments, where AMD is.
It seems unlikely that ray tracing will be very important for a future RTX 2050, because such a card likely won’t be able to get good frame rates at even 2560x1440 with ray tracing enabled. Unless, perhaps, with DLSS.

Quote in the OP, emphasis added:

“We are surprised that the 2080 is only slightly better than the 1080ti, which has been available for over a year and is slightly less expensive,” he said. “With higher clock speeds, higher core count, and 40% higher memory bandwidth, we had expected a bigger boost.”

As a result he expects the adoption of Nvidia’s new products to be “slower” and doesn’t expect “much upside” from the company’s gaming business in its next two financial quarters.

I’m in no position to assess the technical qualifications of the 2080, nor do I think it matters much, as it only addresses the near term (the next two financial quarters) – the little picture. The big picture, and my investment thesis, is that gaming is Nvidia’s past and AI everywhere is Nvidia’s future. In biotech and drugs they put a lot of emphasis on “pipelines,” in Gorilla Gaming on “the bowling alley,” both indications of future markets. The trader might be interested in the “next two financial quarters.” The buy-and-hold investor had better look to a farther future.

A consensus is building that the serial von Neumann architecture and CPUs are giving way to massively parallel computing and GPUs. At first I thought this would happen near term only at the core, but AI inference is moving it to the edge, to IoT devices: potentially a huge market, the kind of market that ARM grabbed and Intel missed. It’s hard to think of any area where more intelligence is not an advantage.

Unless I’m fatally wrong, GPUs and Nvidia have a runway of a decade or more; quarterly earnings results are more noise than substance, and dips like this are potential buying opportunities. NVDA is most certainly a high-conviction investment for me.

Denny Schlesinger

8 Likes

I’m in no position to assess the technical qualifications of the 2080, nor do I think it matters much, as it only addresses the near term (the next two financial quarters) – the little picture.

The GPU cycle is 2 years, and I think you’re underestimating the importance of the GPU business, which still appears to be the biggest source of revenue and profits at NVDA.

1 Like

The GPU cycle is 2 years, and I think you’re underestimating the importance of the GPU business, which still appears to be the biggest source of revenue and profits at NVDA.

Unstated and underestimated are two different things; I didn’t dwell on details. Clearly, at this point in time, gaming is still Nvidia’s biggest business. But LTBH (long-term buy and hold) depends not on the past but on the future, which was the central point of my post.

Denny Schlesinger

6 Likes