Nvidia Image Scaling vs In-game resolution scale

Saw that Nvidia released this feature in a driver update recently, and it's apparently similar to AMD's FSR, but I don't understand exactly how either of these differs from the feature in many newer games that lets you set an "internal resolution" the game renders at before displaying at a higher (likely native) resolution. Googling found plenty of articles comparing NIS to DLSS to FSR, but not to in-game engine options.
 
I'll preface this by saying I had to look it up a bit; I had a rough idea in the back of my head from reading around, but nothing exact. I'll get the ball rolling anyway. Anyone who sees anything wrong, please correct me or expand, as always.


From the article:

…Conventional resolution scaling isn’t perfect and does have an image quality impact: you’re rendering fewer pixels. This is particularly noticeable when it comes to distant and high-frequency detail. Sharpening can help, but it’s not a magic bullet.

I believe FSR (1.0) and NIS both upscale from a lower resolution using an edge-adaptive spatial filter plus a sharpening pass, which smooths jaggies and keeps textures from looking obviously lower resolution. Lowering the render resolution in the game menu, by contrast, typically just gives you a plain bilinear stretch, so textures end up looking softer, even if you apply a separate sharpening filter to clean up aliased edges. A rough sketch of that difference is below.
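For what it's worth, here's a minimal sketch of that difference in Python (using Pillow). It's not any vendor's actual algorithm, just the general idea; the file name and filter settings are made up.

```python
# Minimal sketch, assuming Pillow is installed. This is NOT any vendor's
# actual filter (the real NIS/FSR passes are edge-adaptive and run on the
# GPU); it just contrasts a plain bilinear stretch with a stretch followed
# by a sharpening pass. The input file name is hypothetical.
from PIL import Image, ImageFilter

def plain_upscale(img: Image.Image, target: tuple[int, int]) -> Image.Image:
    # Roughly what a basic in-game resolution scale gives you: a cheap
    # bilinear stretch that blurs distant and high-frequency detail.
    return img.resize(target, Image.BILINEAR)

def upscale_and_sharpen(img: Image.Image, target: tuple[int, int]) -> Image.Image:
    # Upscale, then run an unsharp mask to restore apparent edge contrast,
    # in the spirit of the sharpening pass NIS/FSR add on top.
    upscaled = img.resize(target, Image.BILINEAR)
    return upscaled.filter(ImageFilter.UnsharpMask(radius=2, percent=120))

low_res = Image.open("frame_1280x720.png")  # hypothetical rendered frame
print(upscale_and_sharpen(low_res, (2560, 1440)).size)  # (2560, 1440)
```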

DLSS is generally the best solution, but it only works on Nvidia RTX cards: it runs a trained neural network on their Tensor cores and reconstructs the image using data from previous frames, not just the current one. The other two are different methods of achieving a similar result that work on cards without Tensor cores (the AI bit).
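If you're curious why the AI part matters, here's a very rough conceptual sketch based on Nvidia's public descriptions of DLSS 2.0. All the names and types here are illustrative, not Nvidia's actual API:

```python
# Very rough conceptual sketch of why DLSS is different: per Nvidia's public
# descriptions it is temporal, not purely spatial. Every name below is
# illustrative; this is not Nvidia's actual API.
from dataclasses import dataclass

@dataclass
class DlssInputs:
    current_frame: "Texture"            # this frame, rendered at e.g. 1280x720
    motion_vectors: "Texture"           # per-pixel motion since the last frame
    previous_output: "Texture"          # last upscaled frame, reused temporally
    jitter_offset: tuple[float, float]  # sub-pixel camera jitter applied per frame

def dlss_style_upscale(inputs: DlssInputs) -> "Texture":
    # A trained network running on Tensor cores combines all of the above to
    # reconstruct a higher-resolution frame. NIS and FSR 1.0, by contrast,
    # only ever see the single current frame.
    ...
```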

So, as I understand it, for image quality:

DLSS > NIS/FSR > in-game upscaling + sharpening

Not sure which is better out of NIS or FSR yet, and of course there are quality/performance levels for each solution that affect the image and frame rate as well.
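On the quality levels: each one is basically a per-axis scale factor. A quick back-of-envelope using FSR 1.0's published factors (NIS and DLSS have their own, broadly similar presets):

```python
# Quick arithmetic on the quality modes: each mode is a per-axis scale factor.
# Factors below are FSR 1.0's published ones (Ultra Quality 1.3x through
# Performance 2.0x); NIS and DLSS use their own, broadly similar presets.
FSR_MODES = {"Ultra Quality": 1.3, "Quality": 1.5, "Balanced": 1.7, "Performance": 2.0}

def render_resolution(out_w: int, out_h: int, factor: float) -> tuple[int, int]:
    # The game renders at the output resolution divided by the per-axis factor.
    return round(out_w / factor), round(out_h / factor)

for mode, factor in FSR_MODES.items():
    w, h = render_resolution(2560, 1440, factor)
    print(f"{mode}: renders {w}x{h} ({100 / factor**2:.0f}% of native pixels)")
```

So even "Quality" mode at 1440p is rendering well under half the native pixel count, which is where the frame-rate gains come from.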
 