
How to Use Nvidia DLDSR to Improve Image Quality in Older Games

Nvidia first introduced Tensor Cores to its gaming graphics cards with the Turing architecture and the GeForce RTX family. One of the features these cores made possible was DLSS, designed mainly to reduce the performance hit of enabling ray tracing without a significant loss in image quality.

Not to be confused with DLSS, one of Nvidia’s more recent additions to its arsenal of AI-powered features is called DLDSR, which stands for Deep Learning Dynamic Super Resolution.

What is DLDSR?

DLDSR is a supersampling technique that aims to improve on the older DSR (Dynamic Super Resolution). It renders games at a higher resolution than your monitor’s native resolution and then scales the image back down to the monitor’s native output. The result is superior image quality with less shimmering and better anti-aliasing.

DLSS vs. DLDSR: Upscale vs. Downscale

Both DLSS and DLDSR use the Tensor cores in RTX GPUs, but each is trying to achieve something different. On the one hand, DLSS renders at a lower resolution than native to improve performance, then upscales the image to the monitor’s native resolution using the Tensor cores to retain some, if not all, of the original image quality.

DLDSR does the opposite and downscales a higher-resolution image, so it’s best used when you have excess GPU power and want to improve the visuals.

This may be the case with older games, for example StarCraft, which are not very demanding and often ship with mediocre anti-aliasing solutions. Another use case involves games that aren’t the latest AAA releases but aren’t old either, combining solid performance on a wide variety of GPUs with beautiful visuals, such as Arkane’s Deathloop or Capcom’s Resident Evil remakes. Games like these are ideal because their slower pace means you pay attention to every detail, from texture quality to lighting and shadows, so DLDSR can help those graphical features stand out even more. You’ll likely want to use it on a 1080p monitor, as anything higher will push the render resolution extremely high and demand a lot of GPU power as a result.

It’s also best used in single-player games, where framerate doesn’t matter as much as it does in multiplayer. For example, you can aim for 60 frames per second and use DLDSR to improve image quality as much as your system resources allow. In general, depending on your GPU and monitor combination, whenever there is significant headroom above 60 fps, generally considered the baseline for smooth gameplay, you can translate that excess GPU power into DLDSR. Also note that DLDSR works in every game, while DLSS requires a game-specific implementation from developers.
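If you want a rough idea of how much DLDSR your headroom can absorb, a back-of-the-envelope sketch like the Python snippet below can help. It assumes frame rate scales inversely with rendered pixel count, which only holds when the game is fully GPU-bound, so treat the output as a starting point rather than a guarantee.

```python
# Rough estimate of which DLDSR factor your fps headroom can absorb.
# Assumes fps scales inversely with rendered pixel count, which is only
# approximately true in fully GPU-bound games; real results will vary.

def max_affordable_factor(native_fps: float, target_fps: float = 60.0) -> float:
    """Largest total-pixel multiplier that should still hit target_fps."""
    return native_fps / target_fps

for fps in (70, 100, 140, 250):
    factor = max_affordable_factor(fps)
    print(f"{fps} fps native -> ~{factor:.2f}x pixel budget "
          f"(1.78x {'OK' if factor >= 1.78 else 'too heavy'}, "
          f"2.25x {'OK' if factor >= 2.25 else 'too heavy'})")
```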

Where is deep learning applied?

DSR and DLDSR are essentially the same up to the point where the game is rendered at a higher resolution, but that’s where the similarities end. While DSR applies a “high quality (Gaussian) filter”, according to Nvidia, to make the higher-resolution image fit on the monitor, DLDSR uses a machine-learning filter for this task. This is how DLDSR manages to solve the fundamental problem of DSR: non-uniform scaling. To explain this better, we first need to look at how to enable DSR and DLDSR.

How to turn it on?

DSR can be used with Nvidia GTX 700 series cards or later, while DLDSR requires a GeForce RTX GPU. Open the Nvidia Control Panel, go to Manage 3D Settings, and find the “DSR - Factors” entry under Global Settings, as shown below.

Here you will find various DSR factors, which are multiples of your monitor’s native resolution in terms of total pixel count. Each of them unlocks a new resolution, which then needs to be selected in the game’s settings.

The first two factors use deep learning and are only available on RTX graphics cards, while the rest use the old DSR with scaling factors of up to 4x. Keep in mind that you can choose either a DL scaling factor or the corresponding legacy scaling factor, but not both.
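One thing that trips people up is that these factors multiply the total pixel count, not each axis. A quick Python sketch (just illustrative math, not any Nvidia tool) shows the render resolution each factor produces on a 1080p monitor:

```python
# DSR/DLDSR factors multiply the total pixel count, so each axis grows by
# the square root of the factor. Computed here for a 1920x1080 monitor.
from math import sqrt

NATIVE_W, NATIVE_H = 1920, 1080
factors = [1.78, 2.25, 4.00]  # DLDSR offers 1.78x/2.25x; legacy DSR goes up to 4.00x

for f in factors:
    axis = sqrt(f)
    w, h = round(NATIVE_W * axis), round(NATIVE_H * axis)
    print(f"{f:.2f}x -> {w}x{h}")
# 1.78x -> 2562x1441 (exposed in-game as 2560x1440)
# 2.25x -> 2880x1620
# 4.00x -> 3840x2160
```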

Non-uniform scaling

Here’s the catch: DSR works best when rendering at an integer multiple of the monitor’s native resolution. On a 1080p monitor, the 4x DSR factor is ideal because each pixel of the 1920×1080 grid is generated from a set of 4 pixels in the 4K DSR image. But this, as you can imagine, first of all requires a GPU powerful enough to render 4K at a playable frame rate. Using a non-integer DSR factor causes image artifacts and poor edge smoothing.
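To see why integer factors are the clean case, here is a minimal NumPy sketch of the idea: with 4x DSR on a 1080p monitor, every native pixel can be computed as the average of an exact 2×2 block of rendered pixels, with no block straddling pixel boundaries. This is a simplified box filter for illustration, not Nvidia’s actual downsampling filter.

```python
# Why integer factors downscale cleanly: each native pixel is the average
# of an exact k x k block of rendered pixels, so no information straddles
# a block boundary. Minimal box-filter sketch with NumPy.
import numpy as np

def box_downscale(img: np.ndarray, k: int) -> np.ndarray:
    """Downscale an (H*k, W*k) image to (H, W) by averaging k x k blocks."""
    h, w = img.shape[0] // k, img.shape[1] // k
    return img[: h * k, : w * k].reshape(h, k, w, k).mean(axis=(1, 3))

rendered = np.random.rand(2160, 3840)  # stand-in for a 4K DSR frame
native = box_downscale(rendered, 2)    # 2x per axis = 4x total pixels
print(native.shape)                    # (1080, 1920)
```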

To counter this, Nvidia provides a DSR - Smoothness slider (you can find it under DSR - Factors in the image above) to control the intensity of the Gaussian filter being used. Increasing the amount of smoothing makes these image artifacts less noticeable at the cost of a blurrier overall image.
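Nvidia hasn’t published the exact filter it uses, but conceptually the slider behaves like the strength of a Gaussian blur applied before resampling. Here is a rough, purely illustrative sketch using SciPy:

```python
# A conceptual illustration of what the DSR smoothness slider controls:
# the strength (sigma) of a Gaussian filter applied before resampling a
# non-integer-factor image. Higher sigma hides artifacts but adds blur.
# This is NOT Nvidia's actual implementation, just the general idea.
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

rendered = np.random.rand(1620, 2880)      # stand-in for a 2.25x DSR frame
for sigma in (0.0, 0.5, 1.5):              # low -> high smoothness
    blurred = gaussian_filter(rendered, sigma) if sigma > 0 else rendered
    native = zoom(blurred, 1080 / 1620)    # resample down to 1920x1080
    print(sigma, native.shape)
```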

On the other hand, the machine-learning filter used by DLDSR does not suffer from the non-uniform scaling issue when using non-integer factors such as 1.78x and 2.25x, and it also seems to handle anti-aliasing better.

Unfortunately, it’s not possible to demonstrate the final result with screenshots, as every method we tried (Nvidia Ansel, in-game photo modes, or plain old Windows Print Screen) captures the image before the filtering process is applied. As a result, every screenshot looks the same regardless of DSR or DLDSR. Feel free to try this for yourself if you’d like.

However, we can tell you that DLDSR does seem like the better option, though there is a small caveat.

The ML filter produces an overly sharp image. This won’t bother everyone, but if it bothers you, you can play with the smoothness slider to reduce the intensity of the sharpening.

Performance comparison and visuals

To compare performance between the various DSR and DLDSR factors, we used a Windows 10 machine with an RTX 2060 paired with a Ryzen 5 5600X and 16GB of DDR4 RAM running at 3200MT/s with CL14 XMP timings. Performance figures were obtained using the built-in Shadow of the Tomb Raider benchmark at the maximum preset, and the results are as follows.

As you can see, there is a significant performance drop when using DSR and DLDSR compared to native 1080p, which is to be expected since the game is rendered at a much higher resolution.

But note the slight difference between DSR and DLDSR at the same render resolution, which can be explained by the processing overhead that Tensor cores introduce.

Now for the fun part: Nvidia claims that “the image quality of DLDSR 2.25X is comparable to that of DSR 4X, but with higher performance.” The performance part is definitely accurate; let’s see if the quality claim holds up.

Above is a screenshot of Bethesda’s Prey taken from Nvidia’s marketing materials. Pay particular attention to the thin vertical and horizontal lines, such as the support wires of the ceiling fixtures in the background.

1620p DLDSR manages to reproduce them much better than native 1080p, and it also comes very close to the 4K DSR rendition with a significant performance boost.

Note that performance at native 1080p and 1620p DLDSR appears identical here, but this is most likely because we are looking at a CPU-bottlenecked scenario; we have seen that in purely GPU-bound situations this is not the case.

Combining DLDSR and DLSS

When a game supports DLSS, it’s a good idea to use it along with DLDSR to offset some of the performance loss while still maintaining better image quality than native.

Because DLSS renders at a fraction of the game’s output resolution, DLDSR essentially forces DLSS to work from a higher base resolution, resulting in a cleaner image. You can also take advantage of the better anti-aliasing DLSS sometimes provides, especially in cases where the game’s TAA implementation isn’t very good.
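The math is easy to check. DLSS Quality mode commonly renders at about two-thirds of the output resolution per axis (the exact fraction varies by mode and game, so treat this as an approximation):

```python
# How DLDSR raises DLSS's internal render resolution: DLSS renders at a
# fraction of the *output* resolution per axis (Quality mode is commonly
# ~66.7%), so a higher DLDSR output means a higher DLSS input.
QUALITY_SCALE = 2 / 3  # approximate per-axis fraction for DLSS Quality

def dlss_internal(out_w: int, out_h: int, scale: float = QUALITY_SCALE):
    """Internal DLSS render resolution for a given output resolution."""
    return round(out_w * scale), round(out_h * scale)

print(dlss_internal(1920, 1080))  # native 1080p target -> (1280, 720)
print(dlss_internal(2880, 1620))  # DLDSR 2.25x target  -> (1920, 1080)
```

In other words, with DLDSR 2.25x on a 1080p monitor, DLSS Quality works from a full 1080p internal image instead of 720p.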

DLDSR vs resolution scaling

Some games offer a resolution scaling or image scaling slider in the graphics settings menu as an additional anti-aliasing method. Basically, it lets you define a percentage of your native resolution (e.g. 125% or 150%) at which the game will render; the image is then scaled down in the same way we discussed above.
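As a rough illustration, assuming the slider is a per-axis percentage (implementations vary from engine to engine), the internal render resolutions on a 1080p monitor look like this:

```python
# In-game resolution scaling sliders are often a per-axis percentage of
# native resolution (this varies by engine, so treat it as an assumption).
# Internal render resolutions for a 1920x1080 monitor:
NATIVE_W, NATIVE_H = 1920, 1080

for percent in (100, 125, 150, 200):
    w, h = NATIVE_W * percent // 100, NATIVE_H * percent // 100
    print(f"{percent}% -> {w}x{h} ({w * h / (NATIVE_W * NATIVE_H):.2f}x the pixels)")
# 125% -> 2400x1350 (1.56x), 150% -> 2880x1620 (2.25x), 200% -> 3840x2160 (4.00x)
```

Note that under this assumption a 150% in-game scale lands on the same render resolution as DLDSR 2.25x on a 1080p monitor.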

Resolution scaling differs from DLDSR in that it is an in-game solution rather than something that works at the driver level, and it can potentially suffer from the same issues as DSR because it does not use a neural network for the scaling filter.

In general, DLDSR is preferable, but depending on the game engine and how the slider is implemented, in-game scaling may look better in a specific title, so you can play around with both and decide for yourself.

Possible problems

Depending on the game, you may encounter some issues when using DLDSR. In some cases, the image is scaled incorrectly and as a result the game screen becomes too large to fit on your monitor.

A workaround is to change “Perform scaling on” from the default Display setting to GPU (under “Adjust desktop size and position” in the Nvidia Control Panel) and use the checkbox at the bottom to make sure all programs use that scaling mode.

The problem with this workaround is that it sometimes breaks other games that behave correctly when scaling is performed on the monitor. There’s nothing you can do about it other than switching the scaling mode back and forth depending on the game you’re playing.

Another issue you may encounter when using DLDSR on a variable refresh rate monitor is that G-Sync stops working. You can fix this by going to the Nvidia Control Panel and enabling G-Sync for both windowed and full-screen mode.

A little tip to make sure G-Sync is working properly: turn on your monitor’s built-in refresh rate counter (via its on-screen display) and check that it changes along with the in-game frame rate.

This counter shows the monitor’s real refresh rate, whereas tools like RTSS only report the GPU’s frame rate. Finally, DLDSR will make any kind of overlay (RTSS, Nvidia FrameView, the Steam overlay, etc.) appear smaller than at native resolution, since overlays are rendered at the higher resolution and scaled down along with the image. Keep in mind that your mileage may vary depending on the game you’re playing and the monitor you’re using, and you may or may not run into all of the issues mentioned above.

Nvidia DLDSR is another feature modern GPUs offer to make your gaming experience more enjoyable. Considering the results of typical ultra vs. high graphics preset comparisons, it may even be preferable to set the settings lower than ultra and spend the excess GPU headroom on a higher render resolution, as this will likely result in more noticeable improvements in image quality, clarity, and anti-aliasing. If you have a supported graphics card, give DLDSR a try and you might be impressed with the results.

