
Nvidia DLDSR tested: better visuals and better performance than DSR

How Deep Learning Dynamic Super Resolution looks and runs on RTX graphics cards

An Nvidia GeForce RTX 3090 Founders Edition, installed in a PC.

As if ray tracing and DLSS weren't big enough bonuses to owning a GeForce RTX graphics card, Nvidia has just dropped another toy in the chest: Deep Learning Dynamic Super Resolution, or DLDSR. It's essentially an AI-fuelled upgrade to Nvidia's DSR downsampling tool, aiming to more intelligently render the frames of your games so that they appear more detailed – without the same performance loss that comes with standard DSR. It's an intriguing new feature that could make some of the best graphics cards even better, and I've been trying it out to see if it performs as effectively as Nvidia claims.

In games, downsampling is the practice of rendering an image at a higher resolution than the display could normally show, then rescaling that image so it will actually fit the screen's native resolution. This will tax your GPU harder, in the same way that rendering a native 4K image would take more horsepower than 1080p, but in exchange it can make games look noticeably sharper.
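To make the "render big, shrink to native" idea concrete, here's a minimal sketch of downsampling in Python with NumPy. It uses a plain box average per pixel block; actual driver-level downsampling (DSR) uses a configurable Gaussian filter, and DLDSR a learned one, so this is only an illustration of the principle:

```python
import numpy as np

def downsample(image: np.ndarray, factor: int) -> np.ndarray:
    """Average each factor x factor block of pixels into one output pixel.

    A box filter is the simplest possible downsampling kernel; DSR's
    real filter is smarter, but the data flow is the same.
    """
    h, w = image.shape[:2]
    h, w = h - h % factor, w - w % factor  # trim so dimensions divide evenly
    blocks = image[:h, :w].reshape(h // factor, factor, w // factor, factor, -1)
    return blocks.mean(axis=(1, 3))

# A stand-in 4K frame shrunk to 1080p: 3840x2160 -> 1920x1080
frame_4k = np.random.rand(2160, 3840, 3)
frame_1080p = downsample(frame_4k, 2)
print(frame_1080p.shape)  # (1080, 1920, 3)
```

Every output pixel blends four rendered pixels here, which is why the result looks cleaner than rendering at 1080p directly.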

For years, DSR has been making it easy to enable downsampling at the driver level, and now DLDSR looks to take the next step: using the Tensor cores on RTX cards to provide the same fidelity enhancements, without losing as many frames per second.

For reference, I tested DLDSR on a GeForce RTX 3070, alongside an Intel Core i5-11600K and 16GB of RAM. Just to reiterate, you will need an RTX-branded GPU to use DLDSR (unlike DSR, which works on all recent and semi-recent Nvidia GPUs), though since it operates at the driver level it should work with most games out of the box. Out of the .exe. Whatever.

Since DLDSR launched in an Nvidia Game Ready Driver with God of War optimisations, I decided to focus my testing on the adventures of ol' Grumpy Beard himself. And straight away, it confirmed one similarity between DSR and DLDSR: neither are particularly fond of borderless windowed mode. There's an easy workaround for this, though: once DLDSR is enabled in Nvidia Control Panel, you can set your monitor to the higher render resolution in Windows Display settings. Just as with DSR, this should let you select that target resolution in any in-game settings, even if it's in borderless mode.

In announcing DLDSR, Nvidia cited a Prey comparison that put the new tech on equal performance footing with native rendering, with standard DSR far behind. In God of War, this turned out half true: my 1080p, Ultra-quality benchmark averaged 62fps on the DLDSR 2.25x setting, which is a considerable drop from my native resolution result of 87fps. The supposedly equivalent DSR 4x setting, however, only produced 48fps, so DLDSR does indeed go easier on the GPU.

Visually, too, DLDSR 2.25x not only looks sharper than native, but to my eyes beats out DSR 4x as well. Take a peek at the comparison below: you can see how DSR tidies up details like the shape of the glowing chain links, or the grass just to the right of Kratos's unkempt jaw. But DLDSR improves this further by adding definition to the Nordic wall pattern on the left and sharpening up the ground textures in general.

A God of War graphics comparison image showing DSR 4x on the left versus native rendering on the right.
Left: Ultra quality, DSR 4x. Right: Ultra quality, native 1920x1080
A God of War graphics comparison image showing DLDSR on the left versus native rendering on the right.
Left: Ultra quality, DLDSR 2.25x. Right: Ultra quality, native 1920x1080

What's impressive here is that DLDSR 2.25x, as you might have guessed from the numbers, isn't even rendering at as high a resolution as DSR 4x. With a native 1080p monitor, DLDSR 2.25x will render at 2880x1620, while DSR 4x will go the full 4K at 3840x2160. The difference lies in the algorithm: DLDSR has enough machine learning smarts that it doesn't need as many input pixels to produce a similar, or in this case slightly better, final image. And since fewer pixels need to be pushed, overall performance is higher. Is it a night-and-day difference in visual quality between this and DSR? Not really – I'm sure you noticed those comparison shots needed to be zoomed in to highlight the changes – but better looks plus better performance equals a clear upgrade for RTX users.
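The resolution maths is worth spelling out, because DSR/DLDSR factors multiply the total pixel count, not each axis. A quick sketch of how those two render resolutions fall out of the factors:

```python
# DSR/DLDSR factors are multipliers on total pixel count, so each
# axis scales by the square root of the factor.
def render_resolution(native, factor):
    scale = factor ** 0.5
    return (round(native[0] * scale), round(native[1] * scale))

native = (1920, 1080)
print(render_resolution(native, 2.25))  # (2880, 1620) -- DLDSR 2.25x
print(render_resolution(native, 4))     # (3840, 2160) -- DSR 4x

def pixels(res):
    return res[0] * res[1]

# DLDSR 2.25x pushes just over half the pixels of DSR 4x:
print(pixels((2880, 1620)) / pixels((3840, 2160)))  # 0.5625
```

That 56% pixel load is where the frame rate gap between 62fps and 48fps comes from.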

There's also one other addition that can soften the frame rate blow even further: DLSS. Now, this won't be available in every game, but you can use both it and DLDSR simultaneously. And while the mathematics of upscaling and downsampling each frame are a bit much for my humanities graduate brain, I do know that DLSS on its Balanced setting helped get 79fps out of DLDSR 2.25x. That's almost 20fps more than when using the default anti-aliasing, and less than 10fps below the all-native performance.
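For the curious, the upscale-then-downsample maths works out roughly like this. The 0.58 per-axis render scale for DLSS Balanced is the commonly cited figure from Nvidia's DLSS documentation, not something measured in this article, so treat the exact numbers as an approximation:

```python
# With DLSS Balanced feeding DLDSR 2.25x on a 1080p panel, the GPU's
# internal render resolution ends up below native 1080p, which is
# where the recovered frames come from.
NATIVE = (1920, 1080)
DLDSR_FACTOR = 2.25          # pixel-count multiplier
DLSS_BALANCED_SCALE = 0.58   # per-axis internal render scale (assumed)

axis = DLDSR_FACTOR ** 0.5
dldsr_target = (round(NATIVE[0] * axis), round(NATIVE[1] * axis))
internal = (round(dldsr_target[0] * DLSS_BALANCED_SCALE),
            round(dldsr_target[1] * DLSS_BALANCED_SCALE))

print(dldsr_target)  # (2880, 1620): what DLDSR downsamples from
print(internal)      # (1670, 940): what the GPU actually renders
```

So DLSS upscales from roughly 1670x940 to 2880x1620, and DLDSR then downsamples that back to 1920x1080 – two AI passes, fewer raw pixels than native.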

A God of War graphics comparison image showing DLDSR on the left versus a combination of DLDSR and DLSS on the right.
Left: Ultra quality, DLDSR 2.25x. Right: Ultra quality, DLDSR 2.25x, DLSS Balanced

Sticking with a PlayStation port theme, I also tried DLDSR out on Horizon Zero Dawn, which simply listed the higher resolution in its display menu without any Windows setting tweakery. Compared to stock 1080p, at which the built-in benchmark averaged 109fps on Ultimate quality, DSR 4x produced a similar frame rate-tanking to God of War with 52fps. Switching to DLDSR was enough to bump this back up to 83fps, and adding DLSS to the mix scored 104fps – only 5fps below native rendering.
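To put those Horizon Zero Dawn numbers in proportion, a quick calculation of each mode's frame rate relative to native:

```python
# Average fps from the Horizon Zero Dawn built-in benchmark runs above.
results = {
    "native 1080p": 109,
    "DSR 4x": 52,
    "DLDSR 2.25x": 83,
    "DLDSR 2.25x + DLSS Balanced": 104,
}
baseline = results["native 1080p"]
for name, fps in results.items():
    print(f"{name}: {fps}fps ({fps / baseline:.0%} of native)")
```

DSR 4x keeps barely half of native performance, while the DLDSR+DLSS combo holds on to about 95% of it.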

These are good results from a speed standpoint, though I found the fidelity differences even harder to notice than in God of War. In the comparison below you can make out a touch of extra definition on the dusty ground texture, and maybe a teensy bit on parts of Aloy's armour, but it's proper squinty stuff.

A Horizon Zero Dawn graphics comparison image showing DLSS/DLDSR on the left versus native rendering on the right.
Left: Ultimate quality, DLDSR 2.25x, DLSS Balanced. Right: Ultimate quality, native 1080p

This does raise the question of whether, as a matter of course, enabling DLDSR is really the right play. I'd admit that while it works very well in God of War, I personally could live with leaving it off – especially when it would save me having to tinker with Windows Display settings every time I wanted to play it.

But then, I'm almost always more inclined towards higher frame rates than having the crispest, sharpest, most high-tech visuals possible. If you are a fidelity nut, and you own an RTX graphics card, I really would recommend giving it a try. Particularly in games that support DLSS, as these will come much closer to fulfilling Nvidia's promise of higher detail at a lower performance cost.

Source: https://www.rockpapershotgun.com/nvidia-dldsr-tested-better-visuals-and-better-performance-than-dsr
