
I have a question regarding the RTX 4070 Super being a 1440p card...

Basically, I was having an argument with some of my buddies. They say that since I have the 4070 Super, the card should (or must) be run on a 1440p monitor and not a 1080p one, because if I mostly use it at native 1080p I won't be able to utilize it fully, and my CPU will be the one struggling a lot (my CPU is a 7600X, by the way; yeah, I know, not the best CPU).

Anyway, back to the main topic: they said that in the long run I'm basically not using my GPU to its fullest, and that even running a new AAA game at max settings with ray tracing on, 1080p still won't utilize the card, and I'll get a way lower frame rate unless I increase the resolution.

So here's the part I don't get and want to clear up: they say "increase your resolution???" and apply DLSS and frame generation.

Shouldn't that make my game run way worse than, say, 1080p? (Will I get a decent frame rate if I just set it to native 1080p?)
 
The end resolution should match your monitor's resolution. Whether you play natively at that resolution, render at a lower resolution with DLSS upscaling it, or even render at a higher resolution with DRS scaling it down, just make sure the output ends up matching the monitor.
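To put rough numbers on the DLSS part of this, here's a small sketch of the internal resolution the GPU actually renders at before upscaling to the output. The per-mode scale factors are commonly cited community figures, not taken from Nvidia's documentation, so treat them as illustrative:

```python
# Hypothetical sketch: internal render resolution for a given DLSS output
# resolution. Scale factors are the commonly cited per-axis ratios for each
# quality mode (an assumption, not an official API).

DLSS_SCALE = {
    "quality": 1 / 1.5,            # renders at ~67% of output per axis
    "balanced": 1 / 1.72,          # ~58%
    "performance": 1 / 2.0,        # 50%
    "ultra_performance": 1 / 3.0,  # ~33%
}

def render_resolution(out_w, out_h, mode):
    """Internal resolution DLSS renders at before upscaling to (out_w, out_h)."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

# A 1440p output with DLSS Quality renders internally near 1707x960 --
# fewer pixels than native 1080p, which is why it can still run fast.
print(render_resolution(2560, 1440, "quality"))
```

The point the reply is making falls out of the arithmetic: 1440p with DLSS Quality pushes fewer internal pixels than native 1080p, so the "higher resolution" output isn't necessarily heavier.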

I'm not sure what fully utilizing your card has to do with anything. If you can get the game to run great while only using 15% of its power, that's a GOOD thing. Your electricity bill will be lower.

My recommendation would be to turn off the overlay that's showing you your framerate, utilization, and whatnot, play around with the options for a while, find something that seems pretty good, then get to playing the game.
 
What they said.

Why care what your friends say if you're content? If anything, playing at 1080p will extend the life of the system, assuming games keep coming out and pushing graphical horsepower. Your system is going to feel slower in five years at 1440p than it would at 1080p; that's one of the many reasons I still stick with 1080p in 2026.

If I'm trying to push 1440p and already having to tweak graphics settings, I'm only going to want to upgrade my machine sooner once performance starts feeling lacking.
 
Just for the record, you can still use DLSS and frame generation at 1080p. Maybe I'm misunderstanding what your friends were saying, but all of that still works at 1080p.
 
Downscaling a high-resolution image to a lower one is a way to avoid using AA.

Nvidia DSR (Dynamic Super Resolution) is a driver-level supersampling technology that renders games at higher resolutions (up to 4x the native pixel count) and downscales them to your monitor's native resolution, resulting in noticeably sharper, smoother graphics. In effect, it acts as a high-quality anti-aliasing method.
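A quick sketch of how DSR factors map to render resolutions: the factors multiply total pixel count, so each axis scales by the square root of the factor. The factor list below is as commonly reported from the driver UI; treat it as illustrative, and note the UI rounds the factors, so computed sizes can be off by a pixel or two:

```python
import math

# Sketch: Nvidia DSR factors are multipliers on total pixel count,
# so each axis scales by sqrt(factor). Factor list as commonly reported
# from the driver UI; purely illustrative.

DSR_FACTORS = [1.20, 1.50, 1.78, 2.00, 2.25, 3.00, 4.00]

def dsr_resolution(native_w, native_h, factor):
    s = math.sqrt(factor)
    return round(native_w * s), round(native_h * s)

for f in DSR_FACTORS:
    w, h = dsr_resolution(1920, 1080, f)
    print(f"{f:.2f}x -> {w}x{h}")
# e.g. 4.00x on a 1080p panel renders at 3840x2160 (4K),
# then downscales the result back to 1920x1080 for display.
```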

Shouldn't that make my game run way worse than, say, 1080p? (Will I get a decent frame rate if I just set it to native 1080p?)
Depends on how much headroom the GPU has sitting around unused. If you can already run at max settings, you may not see any difference.

The best way is to experiment and see if you can notice any difference.

I don't necessarily agree with the sentiment that a certain GPU should match a certain screen resolution... to a point. A low-end GPU maybe shouldn't be attached to 4K, but there's no reason the others can't be attached to whatever. It's up to the user and what they're willing to put up with. If I followed that rule, I wouldn't be waiting on a 4K monitor, as my current GPU "should" only be on 2K...

or even playing it at higher resolution with DRS scaling it lower, just be sure it ends up matching the monitor.
Did you mean Dynamic Resolution Scaling or Dynamic Super Resolution? Nvidia has both, and it even confuses the AI.
DRS scales the render resolution to hold a target frame rate.
DSR scales the resolution up and then downscales it back to native.

I blame Nvidia
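To make the DRS side of that distinction concrete, here's a toy sketch of the feedback loop a Dynamic Resolution Scaling implementation runs each frame. All names and constants are made up for illustration; real engines use smarter controllers and per-game bounds:

```python
# Toy sketch of a Dynamic Resolution Scaling (DRS) loop: nudge the render
# scale each frame so frame time stays near a target. Purely illustrative.

TARGET_MS = 16.7              # ~60 fps frame-time budget
MIN_SCALE, MAX_SCALE = 0.5, 1.0

def update_scale(scale, last_frame_ms):
    # Proportional step: render smaller when too slow, larger when fast.
    error = (last_frame_ms - TARGET_MS) / TARGET_MS
    scale -= 0.1 * error
    return max(MIN_SCALE, min(MAX_SCALE, scale))

scale = 1.0
for frame_ms in [22.0, 20.0, 18.0, 16.0, 15.0]:  # pretend measurements
    scale = update_scale(scale, frame_ms)
    print(f"frame {frame_ms:.1f} ms -> render scale {scale:.2f}")
```

DSR is the opposite direction: a fixed upscale factor chosen by the user, rendered high and downscaled, with no frame-time feedback involved.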
 
