NVDA: AMD's new gaming GPUs vs NVDA GPUs

https://www.thestreet.com/story/14271307/1/reviews-for-amd-s…

In a 15-game test, PC Gamer found that AMD’s RX Vega 64 flagship gaming GPU slightly underperformed Nvidia’s GeForce GTX 1080 GPU, which is based on the Pascal architecture launched in 2016 and like the Vega 64 carries a $499 graphics card MSRP. For 4K-resolution (3840x2160) games, the GTX 1080 turned in an average score of 47.3 frames per second (fps) to the Vega 64’s 43.3. For 1440p (2560x1440) gameplay, the GTX 1080 averaged 77.1 fps to the Vega 64’s 70.7. In addition, though AMD has touted the high minimum frame rates Vega GPUs deliver, the GTX 1080 had a small edge in that department as well.
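
Back-of-the-envelope, that works out to roughly a 9% lead for the GTX 1080 at both resolutions. A quick sketch using only the PC Gamer averages quoted above:

```python
# Rough arithmetic on the PC Gamer averages quoted above (illustrative only;
# the exact game list and settings are in the linked review).
averages_fps = {
    "4K (3840x2160)":    {"GTX 1080": 47.3, "RX Vega 64": 43.3},
    "1440p (2560x1440)": {"GTX 1080": 77.1, "RX Vega 64": 70.7},
}

for resolution, fps in averages_fps.items():
    lead_pct = (fps["GTX 1080"] / fps["RX Vega 64"] - 1) * 100
    print(f"{resolution}: GTX 1080 leads the Vega 64 by {lead_pct:.1f}%")

# 4K (3840x2160): GTX 1080 leads the Vega 64 by 9.2%
# 1440p (2560x1440): GTX 1080 leads the Vega 64 by 9.1%
```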

PC World, meanwhile, got its hands on the liquid-cooled version of the Vega 64, which sells for $699 as part of a “Radeon Pack” bundle that also features $300 worth of CPU and monitor discounts and (in some markets) two free games. With the product typically providing just a 5% to 6% performance gain relative to an air-cooled Vega 64 – its performance is called “essentially GTX 1080-level” – and the test unit suffering from a “buzzing coil whine” when games aren’t being played, PC World isn’t a fan.

Importantly, for both air- and liquid-cooled Vega 64 cards, performance is only part of the story. As previously indicated by a disclosed max power draw (TDP) of 295 watts – far more than the 1080’s 180 watts – the Vega 64 is very power-hungry. AnandTech found a test system running the air-cooled Vega 64 consumed 459 watts of power while playing Battlefield 1. That easily beat the 310 watts consumed by a 1080 test system, as well as the 379 watts consumed by a system running Nvidia’s GTX 1080 Ti ($699 MSRP), which, as expected, outperformed the Vega 64 in benchmarks.

And with higher power consumption comes higher system noise, since a graphics card’s fans have to spin faster to dissipate heat. An air-cooled Vega 64 system tested by AnandTech produced 56.4 decibels of noise while playing Battlefield 1, while a 1080 system produced 49.4 decibels. Naturally, Vega’s power and noise disadvantage relative to Pascal also stands to be a disadvantage in the notebook GPU market when the first notebook Vega parts arrive. Especially with Nvidia having rolled out its Max-Q platform for enabling thin-and-light notebooks with powerful GPUs.
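
Putting AnandTech’s whole-system Battlefield 1 figures side by side makes the efficiency gap concrete. A quick sketch using only the numbers quoted above (these are system-level measurements, not card-only draw):

```python
# Whole-system Battlefield 1 figures quoted above (AnandTech); a noise figure
# for the 1080 Ti system is not quoted in this thread, so it is left out.
system_watts = {"RX Vega 64 (air)": 459, "GTX 1080": 310, "GTX 1080 Ti": 379}
system_noise_dba = {"RX Vega 64 (air)": 56.4, "GTX 1080": 49.4}

baseline = system_watts["GTX 1080"]
for name, watts in system_watts.items():
    extra_pct = (watts / baseline - 1) * 100
    print(f"{name}: {watts} W ({extra_pct:+.0f}% vs the GTX 1080 system)")

noise_gap = system_noise_dba["RX Vega 64 (air)"] - system_noise_dba["GTX 1080"]
print(f"Noise gap: {noise_gap:.1f} dB louder for the Vega 64 system")

# RX Vega 64 (air): 459 W (+48% vs the GTX 1080 system)
# GTX 1080: 310 W (+0% vs the GTX 1080 system)
# GTX 1080 Ti: 379 W (+22% vs the GTX 1080 system)
# Noise gap: 7.0 dB louder for the Vega 64 system
```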

2 Likes

Yep,

AMD simply cannot keep up with the product cycle. The NVDA chips it is competing with are more than one year old. Even with more than a year and a target to shoot for, AMD could not pull it off, and it will pale in comparison to NVDA’s next offering.

So who else competes with NVDA in GPUs? No one.

Look to next year for the next NVDA “killer”, which will be some sort of disruptive technology as there is no competition in GPUs. If someone is going to take NVDA’s business it is going to be with a substitute that is not a GPU.

Tinker

1 Like

AMD simply cannot keep up with the product cycle. The NVDA chips it is competing with are more than one year old. Even with more than a year and a target to shoot for, AMD could not pull it off, and it will pale in comparison to NVDA’s next offering.

Yes, that’s what will happen.

Look to next year for the next NVDA “killer”, which will be some sort of disruptive technology as there is no competition in GPUs. If someone is going to take NVDA’s business it is going to be with a substitute that is not a GPU.

Yes, this could be, but it will take many years for any new technology to be perfected. On top of that, it would need to be a “complete” product solution and not just some new type of processor. A complete solution would include software, a legion of developers, manufacturing at scale, support, and integration with existing technology (e.g. CPUs or GPUs). After that, application demonstrations would be needed before any sales traction can be expected. Any substitute that might compete would also need at least a 10x improvement over GPUs (less than that and it won’t be adopted by many), and GPU technology is a moving target, as GPUs are improving 10x every 2 years. Bottom line: substitute technology will be spotted a mile away (years prior to it being any threat to NVDA).
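
To put numbers on that “moving target” point, here is a minimal sketch assuming, as above, roughly 10x GPU improvement every 2 years and an at-least-10x bar over GPUs for any substitute to be adopted (both figures are the post’s assumptions, not measured data):

```python
# Hypothetical illustration of the "moving target" argument above.
# Assumptions (from the post, not measured data):
#   - GPUs improve ~10x every 2 years
#   - a substitute needs >= 10x over GPUs at launch to be widely adopted
def required_advantage_over_todays_gpus(years_until_substitute_ships: float) -> float:
    gpu_gain_by_then = 10 ** (years_until_substitute_ships / 2)  # 10x per 2 years
    adoption_bar = 10                                            # must be 10x better than GPUs then
    return adoption_bar * gpu_gain_by_then

for years in (2, 4, 6):
    advantage = required_advantage_over_todays_gpus(years)
    print(f"Ships in {years} years -> must beat today's GPUs by ~{advantage:,.0f}x")

# Ships in 2 years -> must beat today's GPUs by ~100x
# Ships in 4 years -> must beat today's GPUs by ~1,000x
# Ships in 6 years -> must beat today's GPUs by ~10,000x
```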

In the meantime, NVDA will continue to advance its technology. NVDA is now dominating the data center market, and it is beginning to target the edge. The product is Jetson, and it enables AI to be put in smaller, external, stand-alone devices (i.e. not in big data centers in the cloud). The GPUs in Jetson are about a year behind the versions in the data centers, so Jetson is now using the Pascal architecture (one generation behind Volta). When data center GPUs moved from Pascal to Volta, Jetson GPUs moved from Maxwell to Pascal. I will be looking for Volta to go into Jetson next, and I think we can expect this to continue going forward. The Jetson module is about credit-card sized, which is small enough for many devices and applications, but further miniaturization will open up many more applications for smart devices at the edge. Getting lower power consumption specs for Jetson products may also be an important improvement and open up even more applications.
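
A simple way to picture that cadence (an illustrative mapping based only on the description above, not an Nvidia roadmap):

```python
# Illustrative only: Jetson has tended to ship one GPU architecture behind
# the data center parts, per the description above (not an official roadmap).
ARCHITECTURES = ["Maxwell", "Pascal", "Volta"]  # oldest to newest

def expected_jetson_arch(data_center_arch: str) -> str:
    """Return the architecture one step behind the current data center generation."""
    i = ARCHITECTURES.index(data_center_arch)
    return ARCHITECTURES[max(i - 1, 0)]

print(expected_jetson_arch("Pascal"))  # Maxwell (the prior Jetson generation)
print(expected_jetson_arch("Volta"))   # Pascal  (Jetson today)
```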

The one thing that’s been nagging me a little bit about NVDA is the scenario where GPU technology becomes good enough such that future improvements are no longer demanded or necessary. I know we’re not near that now, but the need for further improvement could be application dependent. If we were to reach the point where further GPU improvement is no longer a differentiating factor for customers in a specific application, then we could begin to see the competition catch up and GPUs become commoditized. Thoughts on that?

Chris

4 Likes

Chris,

My thought in regard to commoditization is: look at Siri. Siri still is not all that great, despite a company with nearly $200 billion in cash just sitting there to be invested in it. AI has a long way to go. And beyond the data center, there is the edge, which has barely begun.

And as we try to make these predictions, no one has any idea where AI is going to take us in the end, if anywhere. Now, if AI turns out to be overhyped like so many other technology trends, then so will go NVDA. But if AI is real, and it seems to be as real as Moore’s Law made the PC, the move into mobile with the ARM disruption, and the internet disruption, then we have many years to go, one tornado to another.

According to ANET, we are just now starting to move to 100Gb connections from the 10Gb that was the standard in the data center, and ANET is now working on 400Gb. Somewhere in there, good enough may become the thing, but that is not the case now. And by that point in time, look to Intel. Intel has the good-enough issue with CPUs, but due to backward compatibility and AMD not being able to keep up with product cycles, it still maintains a 99% market share even though its CPUs are now in the good-enough phase of the market.

Tinker

1 Like

Yes, Tinker, I agree that we are nowhere near commoditization of GPUs, and I agree that commoditization may never occur. But if it does, then we will start to see it show up as both declining gross margins and declining market share for NVDA. That will be the evidence. For now, the thirst for faster and more efficient parallel-processing GPUs is immense.

Regarding the question of whether AI is real, I am a believer that it is. I think it’s as clear as day. Why do I see it that way?

  1. I’ve been following the Nvidia blog for a couple of months now and I am made aware of new applications for AI almost daily. Some of those articles and podcasts are very impressive. I see AI being applied to more and more problems.

  2. The mass adoption of NVDA’s GPUs by the dominant cloud providers and ICPs for AI applications tells me AI is here en masse and it’s here to stay. Amazon, Microsoft, Baidu, Tencent, Facebook, and Google are all standardizing on Volta, and they are using it for AI for themselves and for their cloud customers. These cloud leaders would not be investing hundreds of millions of dollars if they didn’t already have demand for AI from their customers.

  3. AI is a prerequisite for self-driving vehicles, and NVDA is already partnered with 225 companies in the automotive space.

AI is greatly accelerating the pace of innovation and technology advancement, and NVDA seems to be the sole arms dealer making this revolution possible.

Chris

8 Likes

NVDA seems to be the sole arms dealer making this revolution possible.

It is for now, and yes, it has become the standard upon which this is built. Unlike when Intel became the standard before it, there was no Amazon or Google or Apple or the like to get in its way.

That said, yes, the best investments are in companies that are clearly singular, and Nvidia is one; it has been so as an investment in the sphere of AI so far, and as you well put it, it shall be for years to come. A technology that disrupts the AI standard is something we will see, although we may not appreciate it at first, if it should arise.

Tinker

1 Like

AMD has been chasing NVDA basically since they acquired ATI. NVDA did the smart thing and really focused on the GeForce drivers and maintaining compatibility, which they are continuing to do in their server systems.

The one thing that’s been nagging me a little bit about NVDA is the scenario where GPU technology becomes good enough such that future improvements are no longer demanded or necessary.

With the growing number of applications, I agree this will be several years down the road. But tech has a habit of accelerating its own development. How long did it take to go from simple transistors to integrated circuits to CPUs and then to GPUs? My rough estimate is 50 years, 30 years, 20 years. At some point we will meet physical limitations on GPU development, and at the same time the AI enabled by GPUs will quicken the development of other technologies. Will it be only 10 years before quantum or biologic computing becomes mainstream?

Maybe AMD is satisfied chronically playing second fiddle. At least they look better positioned than Intel, whose AI efforts seem destined for the fate of 3dfx, which was the leader in consumer 3D graphics acceleration but got passed by NVDA and ATI.

2 Likes

The one thing that’s been nagging me a little bit about NVDA is the scenario where GPU technology becomes good enough such that future improvements are no longer demanded or necessary. I know we’re not near that now, but the need for further improvement could be application dependent. If we were to reach the point where further GPU improvement is no longer a differentiating factor for customers in a specific application, then we could begin to see the competition catch up and GPUs become commoditized. Thoughts on that?

Interesting thought, Chris. Perhaps what will limit commoditization will be the need to be backward compatible with all the previous Nvidia products (especially in data centers I’d imagine). Also to run CUDA. How does that sound?

Saul

The one thing that’s been nagging me a little bit about NVDA is the scenario where GPU technology becomes good enough such that future improvements are no longer demanded or necessary. I know we’re not near that now, but the need for further improvement could be application dependent. If we were to reach the point where further GPU improvement is no longer a differentiating factor for customers in a specific application, then we could begin to see the competition catch up and GPUs become commoditized. Thoughts on that?

Depends on what customers you are talking about. I could see it with gamers, but with industrial applications there’s no end in sight. Autonomous systems with real-time image recognition will need ever-better GPUs for decades to come. Think autonomous cars, even the semi-autonomous ones - every microsecond you can squeeze out of the hardware counts.

I agree that most Nvidia competition is on paper or in the PR departments of other companies, because in fact there is no effective competition. For now. Lack of real competition tends to make the leader fat and lazy. Hopefully that won’t happen. And it does not have to: look at Intel, which was dominant in CPUs 20 years ago and still is today.

further miniaturization will open up many more applications for smart devices at the edge.

Very important, because Nvidia’s CEO says that is where the big future market will be: millions of smaller, less expensive chips compared to the fewer, more expensive ones today. Nvidia may furnish both.

This is all new enough that it will be years before commoditization occurs, if it ever does. And if you look at top main-street markets, we might see little growth in sales but a dividend yield of 20% or more based on today’s prices.
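
For what that last yield remark would imply, a quick sketch with purely hypothetical numbers (the share price below is a placeholder, not a quote or a forecast):

```python
# Purely hypothetical illustration of a "20% yield on today's price" outcome.
hypothetical_cost_basis = 170.0   # placeholder purchase price per share, not a real quote
target_yield_on_cost = 0.20       # 20% yield on that cost basis

required_annual_dividend = hypothetical_cost_basis * target_yield_on_cost
print(f"A {target_yield_on_cost:.0%} yield on a ${hypothetical_cost_basis:.0f} cost basis "
      f"would require a ${required_annual_dividend:.0f}/share annual dividend.")
# A 20% yield on a $170 cost basis would require a $34/share annual dividend.
```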

2 Likes

According to a report on Seeking Alpha (and I went over some commentary on it, so take it for what it is worth), AMD’s new chip is built on a much larger die and uses more expensive memory, and it is likely to be a low-margin product due to lower yields and higher memory cost.

Currently, AMD’s gross margins are half of NVDA’s. Nvidia, assuming the above is true about the new chip, could simply cut the price of its existing 1080 chips by 10% or 20% and knock AMD out of the market. Nvidia won’t do this for the same reason Intel won’t, or at least so the narrative goes: to maintain the veil of competition in the market.
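
A rough sketch of that scenario, with the margin figures below assumed purely for illustration (a hypothetical ~58% Nvidia gross margin applied to the $499 card, and AMD at half that per the claim above; these are not reported card-level numbers):

```python
# Hypothetical illustration of the price-cut scenario above; margins are assumptions.
msrp = 499.0
assumed_nvda_gross_margin = 0.58                          # hypothetical, applied at the card level
assumed_card_cost = msrp * (1 - assumed_nvda_gross_margin)
assumed_amd_gross_margin = assumed_nvda_gross_margin / 2  # "half of NVDA's", per the post

for cut in (0.10, 0.20):
    new_price = msrp * (1 - cut)
    new_margin = (new_price - assumed_card_cost) / new_price
    print(f"{cut:.0%} price cut -> ${new_price:.0f}, gross margin ~{new_margin:.1%} "
          f"(vs AMD's assumed ~{assumed_amd_gross_margin:.1%})")

# 10% price cut -> $449, gross margin ~53.3% (vs AMD's assumed ~29.0%)
# 20% price cut -> $399, gross margin ~47.5% (vs AMD's assumed ~29.0%)
```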

This is an example of how dropping prices could verge on an anticompetitive practice. No, it wouldn’t hold up in court, but Nvidia, like Intel, has the power to cut prices, remain very profitable, and push AMD out of the market if it wanted to.

Tinker

2 Likes