Eurogamer has what I believe is the only benchmark review of the DLSS technology so far, run through some demos. Still no ray tracing benchmarks yet.
DLSS is the reason for the Tensor cores; it uses neural networks to do some graphical voodoo, you know, to be technical. Nvidia has this sweet Saturn V supercomputer with hundreds of DGX-1 AI systems and thousands of Volta V100 GPUs. I did a post a few months back called "How Nvidia Does It" talking about it. Now they are putting that into action. A game developer can partner with Nvidia, and Nvidia does basically all the work. On Saturn V, Nvidia takes the game, generates millions of images from it, and trains a deep-learning network on pairs of lower-resolution and higher-resolution frames, so the network learns how to reconstruct a lower-resolution frame at near or even better than native higher-resolution (say 4K) quality. That network, unique to that game, can be containerized and shipped as a game update or patch. A game running at lower settings can produce frames much faster than one running at higher settings. The result is that DLSS produces equivalent quality at a higher frame rate than could be produced natively at 4K: the time it takes to render a lower-resolution frame, reconstruct it, and produce an AI-enhanced equivalent frame at the higher resolution is much less than the time to render that frame natively at the higher resolution.
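To make that pipeline concrete, here's a toy sketch of the idea in Python with NumPy. None of this is Nvidia's actual method: the real DLSS trains a deep convolutional network on Saturn V and runs inference on the Tensor cores, while this sketch just fits a linear map from each low-resolution pixel (plus its neighbors) to the 2x2 high-resolution block it should reconstruct, using synthetic stand-in "frames". The shape of the workflow is the same, though: train offline on (low-res, high-res) pairs, ship the per-game weights, and run cheap inference at frame time.

```python
import numpy as np

rng = np.random.default_rng(0)

def render_frame(h=64, w=64):
    # Stand-in for a rendered game frame: a smooth synthetic image plus noise.
    y, x = np.mgrid[0:h, 0:w]
    return np.sin(x / 7.0) + np.cos(y / 5.0) + 0.05 * rng.standard_normal((h, w))

def downsample(hi):
    # 2x2 block average: the fast, low-resolution render.
    h, w = hi.shape
    return hi.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def features(lo):
    # Each interior low-res pixel plus its 4 neighbours (and a bias term).
    h, w = lo.shape
    return np.array([[lo[i, j], lo[i - 1, j], lo[i + 1, j],
                      lo[i, j - 1], lo[i, j + 1], 1.0]
                     for i in range(1, h - 1) for j in range(1, w - 1)])

def targets(hi):
    # The 2x2 high-res block each interior low-res pixel should reconstruct.
    h2, w2 = hi.shape[0] // 2, hi.shape[1] // 2
    return np.array([hi[2 * i:2 * i + 2, 2 * j:2 * j + 2].ravel()
                     for i in range(1, h2 - 1) for j in range(1, w2 - 1)])

# "Training on the supercomputer": fit per-game weights from frame pairs.
frames = [render_frame() for _ in range(8)]
X = np.vstack([features(downsample(f)) for f in frames])
Y = np.vstack([targets(f) for f in frames])
W, *_ = np.linalg.lstsq(X, Y, rcond=None)  # the weights shipped in the "patch"

def upscale(lo, W):
    # "Inference at frame time": rebuild high-res from the cheap low-res render.
    h, w = lo.shape
    p = np.pad(lo, 1, mode="edge")
    out = np.zeros((2 * h, 2 * w))
    for i in range(h):
        for j in range(w):
            f = np.array([p[i + 1, j + 1], p[i, j + 1], p[i + 2, j + 1],
                          p[i + 1, j], p[i + 1, j + 2], 1.0])
            out[2 * i:2 * i + 2, 2 * j:2 * j + 2] = (f @ W).reshape(2, 2)
    return out

hi = render_frame()                     # the frame we'd like at full resolution
lo = downsample(hi)                     # what the GPU actually renders
up = upscale(lo, W)                     # learned reconstruction
naive = np.repeat(np.repeat(lo, 2, axis=0), 2, axis=1)  # dumb pixel doubling
mse = lambda a, b: float(((a - b) ** 2).mean())
```

On these synthetic frames the learned reconstruction lands closer to the true high-res frame than naive pixel doubling, which is the whole pitch: spend a little inference time instead of a lot of rendering time.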
This is why, on a Turing RTX card running a DLSS-patched game, the performance of the RTX GPUs really shines. On the DLSS-capable demos, here's what they found.
https://www.eurogamer.net/articles/digitalfoundry-2018-9-19-…
From the Final Fantasy 15 Demo:
RTX 2080 performance with standard TAA (native higher resolution) reveals that the card enjoys a straight 30 per cent lead over GTX 1080, and it's basically on par with the GTX 1080 Ti - a state of affairs that's fairly common in the standard benchmarks to come. DLSS grants the RTX 2080 a further 39.5 per cent of raw performance, which clearly takes it well beyond the capabilities of even the most powerful Pascal cards. With DLSS active, the RTX 2080 offers an 81 per cent boost over GTX 1080. And once again we see that the RTX 2080 with DLSS enabled outperforms the RTX 2080 Ti running on standard TAA. With RTX 2080 results this good, the impact on RTX 2080 Ti is even more profound. Again, with DLSS active, it's capable of delivering 80 per cent more performance than the GTX 1080 Ti, which does not have access to this technology.
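Those quoted percentages hang together, by the way: the 30 per cent TAA lead and the further 39.5 per cent from DLSS compound multiplicatively, which is exactly where the 81 per cent figure comes from. A quick sanity check:

```python
# Figures from the Eurogamer/Digital Foundry FF15 demo numbers quoted above.
taa_lead_over_1080 = 1.30   # RTX 2080 with standard TAA vs GTX 1080
dlss_gain = 1.395           # further gain from enabling DLSS

combined = taa_lead_over_1080 * dlss_gain
print(f"RTX 2080 + DLSS vs GTX 1080: {combined:.2f}x "
      f"(~{(combined - 1) * 100:.0f}% boost)")
# prints "RTX 2080 + DLSS vs GTX 1080: 1.81x (~81% boost)"
```

So the "81 per cent boost" is just 1.30 x 1.395 = 1.8135, not a separately measured number.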
It’s important to note that these are only demos, not full gameplay. However, 28 games are already slated to release patches enabling DLSS. This will be coming sooner rather than later, and there will be more benchmarking for DLSS. The first glance here looks really great. And since it’s all AI and deep learning, it will probably improve fairly quickly, because one of the key things about AI is that it learns. Ray tracing will probably also be pretty stunning, but DLSS will be the first new technology to show whether or not the Turing architecture will meet high expectations.
In this review they talk more about the image quality of DLSS, and basically conclude that it’s as good as, if not better than, rendering straight to the higher resolution, at much better performance.
https://www.eurogamer.net/articles/digitalfoundry-2018-dlss-…
It will be interesting to see if the Tegra line gets a Turing-type treatment, ramping up the resolution capabilities of consoles like the Switch. Could other consoles want in on that?
Darth