RTX 50 series performance depends too much on DLSS

Hello,

I heard that the new Nvidia RTX 50 series is way faster than the 40 series.

BUT it appears they didn't tell us they used a new version of DLSS to generate AI frames in between the real ones.

DLSS is nice, but it also adds latency, and if Nvidia is going to keep making its cards faster by improving DLSS instead of creating extra real frames per second, that would be bad.

Do you guys agree? Look at this video.

https://www.youtube.com/watch?v=1MB6_Jz_Fkw
 
Features that make weak cards look good are a double-edged sword. On one hand they let people who can't afford the best get close enough, but they also let Nvidia claim its top cards can do 240 fps in games where, with DLSS turned off, you only really get 60.
This and relying on upscaling are leading us down the wrong path.
This is what I was arguing in the PCG articles thread. DLSS is great and all, but the over-reliance on it is making gamers tired. The problem is that now that Nvidia says the days of “brute force rasterization” are over, the entire industry must follow their lead. AI upscaling is the future of graphics, for better or for worse. DLSS is great for people on lower-end cards to boost frame rates at the cost of visual clarity and sometimes input lag, but when you are selling a $2000 card and are still pushing those buyers to use DLSS, things get infuriating. I wish these companies would keep focusing on real hardware innovation and not just rely on AI all the time. Same goes for games that post system requirements but say in the footnotes that DLSS must be turned on.
 
My semi-educated take on this is that Nvidia have been struggling to make significant improvements in their architecture gen-on-gen for a while. We saw this with the 3000 series and the massive leap in power consumption between the 2080 Ti and the RTX 3090 Ti: it was (scribbles on napkin) roughly 40% faster while using about 70% more power. They had to increase the power envelope, and with it the quality of the coolers, because if they had stuck at the same levels the improvement would have been marginal and the reviews would have been bad. The 4000 series was a lot more efficient, but they also leaned a bit on DLSS, and in pure raster it was only about a 25% jump from flagship to flagship in native frames.
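
Just to put that napkin math into performance-per-watt terms, here's a throwaway sketch using the rough 40%/70% figures above (nothing official, just the ratio):

```python
# Rough perf-per-watt math using the napkin figures above (~40% faster at ~70% more power).
# The absolute values are normalized; only the ratios matter.
old_perf, old_power = 1.00, 1.00   # 2080 Ti normalized to 1.0 for both
new_perf, new_power = 1.40, 1.70   # 3090 Ti per the napkin estimate

old_eff = old_perf / old_power
new_eff = new_perf / new_power     # ~0.82

print(f"Perf per watt change: {new_eff / old_eff - 1:+.0%}")  # roughly -18%
```

So by those numbers the flagship actually got less efficient, which fits the "brute force via a bigger power envelope" reading.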

For a long time they were able to rely on newer, smaller process nodes in chip manufacturing to make up the gap in design improvements. Smaller nodes mean better efficiency (lower power use) and let them cram more stream processors/CUDA cores onto the chips every generation and increase clock speeds. Now that the chip makers are struggling to shrink transistors significantly (thanks to the laws of physics), those gains are smaller and smaller. They realized this a while ago and have tried to make up more of the improvement gap with software.

Then we got DLSS and Frame Gen.

I honestly believe that if they stuck to pure raster and didn't try any other software shenanigans, we would either be waiting 5 years between generations or have to put up with relatively minor gains (compared to what we've been used to) every 2-3 years.

As far as the 5000 series goes, we haven't seen proper reviews yet. From my perspective, for single-player action games the latency at a 60 FPS base isn't bad at all. Maybe in something fast-paced like Doom it would be noticeable to very skilled players, but in your Assassin's Creeds or Elden Rings I bet you wouldn't notice. Esports games don't need high fidelity, and I don't think it will matter there anyway; people shouldn't need to frame-gen CS:GO or Overwatch or whatever.
 
My semi-educated take on this is that Nvidia have been struggling to make significant improvements in their architecture gen-on-gen for a while.
This makes a lot of sense. It must be getting nearly impossible to go much smaller than 4 nm on those chips. I can only hope for some miracle innovation, but realistically it won't happen, at least for a while. Perhaps quantum computing, when it becomes viable at a consumer level in, I dunno, decades maybe, will be the true change in hardware we want to see.
 
This makes a lot of sense. It must be getting nearly impossible to go much smaller than 4 nm on those chips. I can only hope for some miracle innovation, but realistically it won't happen, at least for a while. Perhaps quantum computing, when it becomes viable at a consumer level in, I dunno, decades maybe, will be the true change in hardware we want to see.
There's all kinds of crazy business with different types of new transistors and so on, so I don't think it's all just about size. It's just that for a long time it went so fast and so easily that they made huge leaps every year or two, a lot of which came just from shrinking things, and now that has slowed to a crawl.

I'm not arguing for or against DLSS 4, because I want to see it for myself. But I don't think it's just a case of Nvidia taking the easy route while lazy chip architects phone it in, as some seem to think. If that were the case, I would expect Intel and AMD to be catching up and going past Nvidia.

DLSS worked OK-ish for me in Alan Wake 2, and I've used FSR in a few games and it looked OK, although as I said elsewhere TAA seemed blurry to me in Baldur's Gate 3 and AW2. As it stands I'd rather not have to use it, but then again I thought CRTs were way better than LCD screens, and I got used to those eventually as things improved.
 
I have zero experience with DLSS, but I've used the AMD version just a bit (FSR?) and I find it a bit crap. At least, crap enough that I turn it off and accept lower settings for higher FPS rather than turn on FSR; image quality seems to take a huge dip and everything just looks kind of fuzzy and not right.

I've heard DLSS is better, but again, no experience there. Still, using predictive AI or whatever it is to fake frames just seems weird. I suppose this is all fine, as the last card I bought was a GTX 970 in 2015/16, before my 6700 XT in 2023. I don't need the horsepower, I don't need the new card, so hopefully by the time we get over this hump it won't be an issue anymore.
 
I have zero experience with DLSS, but I've used the AMD version just a bit (FSR?) and I find it a bit crap. At least, crap enough that I turn it off and accept lower settings for higher FPS rather than turn on FSR; image quality seems to take a huge dip and everything just looks kind of fuzzy and not right.

I've heard DLSS is better, but again, no experience there. Still, using predictive AI or whatever it is to fake frames just seems weird. I suppose this is all fine, as the last card I bought was a GTX 970 in 2015/16, before my 6700 XT in 2023. I don't need the horsepower, I don't need the new card, so hopefully by the time we get over this hump it won't be an issue anymore.
I used FSR in Dead Space and God of War and it was actually pretty good. With FSR it seems to depend on how the developer implements it, maybe. As I understand it, FSR is an algorithm and doesn't use AI to infer anything; that's part of why it isn't as good as DLSS, but also why it's hardware-agnostic.
 
I used FSR in Dead Space and God of War and it was actually pretty good. With FSR it seems to depend on how the developer implements it, maybe. As I understand it, FSR is an algorithm and doesn't use AI to infer anything; that's part of why it isn't as good, but also why it's hardware-agnostic.
I'll have to try it on something else. I have (irregularly) been playing Dead Space with my old 1070, but haven't tried FSR on it. I could theoretically play it in 4k if it works well, but I guess it'll take some testing to see if it's actually any better than I've seen it in other games.
 
Someone said something about transistors bumping up against technical size limits, and that current transistors are 4 nm.

But at imec in Belgium they can reportedly make 0.2 nm transistors, although those are still in an experimental phase. So would that be the reason that Nvidia uses AI?

Maybe shrinking the transistors is getting too expensive and taking too long?
 
Someone said something about transistors bumping up against technical size limits, and that current transistors are 4 nm.

But at imec in Belgium they can reportedly make 0.2 nm transistors, although those are still in an experimental phase. So would that be the reason that Nvidia uses AI?
Again, I'm no expert, but AFAIK Nvidia, like AMD and recently even Intel for some chips, doesn't manufacture its own silicon. There's a company called TSMC that develops the process, and the other companies basically lease time at its factories. There used to be one called GlobalFoundries, which was spun off from AMD; no idea what happened to them. Intel also makes chips, but they've struggled to keep up with TSMC, so over the last few years they haven't been able to make all of their own because their factories are behind the latest and best.

Maybe shrinking the transistors is getting too expensive and taking too long?
As I understand it, yes. That's a large part of why they can't just raise clock speeds and cram more stuff into the same size of chip. It's not only the expense; it gets really hard to make things work consistently the smaller you go.

It's been happening with CPUs as well, and for even longer. For almost ten years during the 2010s, Intel struggled to make each next-gen chip more than about 10% faster than the last. Don't ask me to explain exactly why, because I can't. :)
 
Another thing I heard about upscalers like DLSS and FSR is that if you don't get decent frames to begin with, they work a lot worse. Not sure how true that is, as it did help me a bit in Stalker 2, where without it my frames were hovering around 20-30 fps on Low. Using DLSS did improve frames, but visual clarity was turned to mush. I tried frame gen, but on my RTX 2060 it increased input lag to the point of being unplayable. So it doesn't seem to be the godsend for all gamers that Nvidia makes it out to be; you still need decent frames in the game you want to play. I think some people mistake it for a miracle option that boosts their game from 20 fps to 120+ at no cost to visual fidelity.
 
Another thing I heard about upscalers like DLSS and FSR is that if you don't get decent frames to begin with, they work a lot worse. Not sure how true that is, as it did help me a bit in Stalker 2, where without it my frames were hovering around 20-30 fps on Low. Using DLSS did improve frames, but visual clarity was turned to mush. I tried frame gen, but on my RTX 2060 it increased input lag to the point of being unplayable. So it doesn't seem to be the godsend for all gamers that Nvidia makes it out to be; you still need decent frames in the game you want to play. I think some people mistake it for a miracle option that boosts their game from 20 fps to 120+ at no cost to visual fidelity.
That's frame generation. The general recommendation, at least with the 4000 series, was to get to at least 60 FPS without frame gen, or the latency gets bad.

DLSS, Intel's XeSS, and FSR all render the game at a lower resolution than the display in use and then clean up the image. Obviously this means you get a faster frame rate; it's like running the game at a lower resolution. After that happens you can still apply frame gen on top to get even more frames. I think you could use frame gen without the upscaling, but I've not tried it.
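
For what it's worth, here's a very rough back-of-the-envelope model of how those two stages stack up. The numbers are made up purely for illustration; real results vary a lot per game and per quality preset:

```python
# Toy model of upscaling + frame generation (illustrative numbers only).
# Real behaviour depends on the game, the GPU, and the DLSS/FSR/XeSS implementation.

native_fps = 40                  # what the card manages at full render resolution (hypothetical)
upscale_speedup = 1.6            # assumed gain from rendering at a lower internal resolution
framegen_multiplier = 2          # 2x for single frame gen, up to 4x for multi frame gen

base_fps = native_fps * upscale_speedup         # frames actually rendered per second (~64)
displayed_fps = base_fps * framegen_multiplier  # frames shown on screen (~128)

# Input is only sampled on rendered frames, so responsiveness tracks base_fps,
# not displayed_fps (plus some extra delay from the frame-gen step itself).
print(f"Rendered: {base_fps:.0f} fps, displayed: {displayed_fps:.0f} fps")
print(f"Time between rendered frames: ~{1000 / base_fps:.0f} ms")
```

That's also why the "get to around 60 FPS before turning on frame gen" advice makes sense: the generated frames raise the displayed number, but the input-to-render interval only shrinks if the rendered rate goes up.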
 
No, I don't agree, but I'm not trying to fanboy for DLSS. Personally I see nothing wrong with it; if my card can use the tech, I'm gonna use it.

My only gripe is that I hope this doesn't make developers lazier about optimizing their games, knowing that today's GPUs can produce better frame rates with DLSS/FSR etc. Not that poor optimization hasn't happened before, especially in games that didn't use DLSS after it was first introduced, but this could lead to more of it.

With that said,

This is one of those arguments where you can have all the right YouTubers, links, and information about how DLSS (and AMD's/Intel's versions of it) is "fake frames" and terrible to use, and claim "gamers are mad!" and so on, but we're all going to have to work with it because it's absolutely not going anywhere. Each vendor has its own variant of frame gen / upscaling tech, and if you want fewer frames, don't use it; it's that simple. You have the choice to use it or not.

I'll wait for PC gamers to stop buying Nvidia, AMD, and Intel GPUs to "fight back", because that's the ONLY way.

DLSS is nice, but it also adds latency, and if Nvidia is going to keep making its cards faster by improving DLSS instead of creating extra real frames per second, that would be bad.

Let's not forget that AMD and Intel, the only other GPU makers, both use the same type of tech to generate more frames. So this isn't just an Nvidia thing.

Also, doesn't using DLSS mean your card needs less energy, and produces less heat, to hit higher frame rates? The 5090 looks nowhere near as big as previous xx90 cards but will produce the most frames, with an "efficient" energy-use ratio.

Obviously we have to see benchmarks, but if that's the case, it's a good argument for this stuff for people who care about power and heat. It's also an argument for bringing this frame-boosting tech to extremely compact systems like consoles and handhelds, so they too can get higher frame rates without turning into a microwave.
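
If the benchmarks do bear that out, the metric to look at is frames per watt. Here's a trivial sketch of the comparison; the figures below are invented purely to show the calculation, not measurements:

```python
# Frames-per-watt comparison; the numbers are placeholders, not benchmark results.
def frames_per_watt(fps: float, board_power_watts: float) -> float:
    return fps / board_power_watts

native = frames_per_watt(fps=70, board_power_watts=450)     # hypothetical native rendering
assisted = frames_per_watt(fps=160, board_power_watts=400)  # hypothetical upscaling + frame gen

print(f"Native:   {native:.2f} fps/W")
print(f"Assisted: {assisted:.2f} fps/W ({assisted / native:.1f}x)")
```

Whether the real 50 series numbers look anything like that is exactly what independent reviews will have to show.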
 
Not owning an Nvidia card, I don't use DLSS at all. I don't play any games I need upscaling in, even if it's on by default in the drivers. Could be that my GPU is powerful enough to run most games without it.

4x fake frames running with the reaction speed of 1x isn't the answer. It might look amazing on screen... but games are more than just movies. Responsiveness needs to keep up with what's on screen, or there will be a disconnect.

The reliance on upscaling, and everyone jumping onto Unreal Engine 5, is another trap.

Nvidia wants the industry to follow its lead because that's good for profits. If DLSS becomes a standard, they need to open it up and let other companies contribute, or it's just a monopoly.
Them pushing AI is hardly a surprise... where are all their profits coming from? They're just trying to capitalise on their investments.

There are too many "easy way" options out there to expect people to actually keep pushing boundaries, and too many incentives to embrace them to reduce production times.
 
It's not just DLSS; it relies on games supporting the latest version and multi frame generation. So having a card that can do 240 fps in newer games is great, but it won't make older games play as if you actually had a better GPU.


It's a smokescreen.
 
The marketing is deceptive, but all of these companies make bold, cherry-picked claims whenever they release a new card or CPU. They do it every time; wait for proper independent reviewers to put them to the test and make a judgement then.

Honestly though, it's not even that bold a claim for the new xx70 card to beat or equal the previous flagship; that's what happened for the last 10-15 years, up until the 4000 series.
 
I have lots of mean and hateful things to say after reading this thread, but I'm going to show uncharacteristic reserve. :ROFLMAO:

The main fear, as I see it, is that AAA developers will reduce the amount of optimization time and money they spend on their PC games, and then no one will be able to run them without DLSS 4. What should be a huge boon, especially at the low end of the market, will end up a wash, and we'll still be struggling to run AAA games at acceptable frame rates.

@neogunhero you should be the happiest of all. This helps us low-enders (I spend a fortune, but laptops are always low end in performance) more than anyone. You can get a 5060 for cheap and run everything for years to come at 1080p (so long as the developers do their jobs).
 
but for most of that time the performance wasn't reliant on games supporting features, and was more a result of the cards' hardware being better.

As I said earlier, a lot of it was due to significantly better manufacturing processes each time, which wasn't 100% down to design work by Nvidia, AMD, Intel, or whoever.

Without DLSS, XeSS, or FSR we probably won't see those kinds of gains very often unless something seriously changes on the design side. Maybe AI-designed chips doing things human engineers couldn't think of before, maybe Intel or someone else makes a huge breakthrough in GPU hardware; I don't know. There are incredibly smart people working on this, because it's worth a lot of money.
 
@neogunhero you should be the happiest of all. This helps us low-enders (I spend a fortune, but laptops are always low end in performance) more than anyone. You can get a 5060 for cheap and run everything for years to come at 1080p (so long as the developers do their jobs).
That's what I'm hoping for, at least. What you said about devs not prioritizing optimization is a real fear, because I think people should be given the choice of whether to use upscaling or not. Games and their devs should absolutely not have to rely on DLSS etc. to hit acceptable frame rates. However, for lower-budget gamers it is a benefit, since they don't have to buy a $2000 card to get high frame rates. It should be an option, not the default. As far as Nvidia goes, they will keep pushing it and improving it, because that is just where the hardware industry is headed.

Maybe they can make some other sort of AI program that helps make your PC actually run better instead of just games, and not like that crap you can find online (Free* PC Cleaner Optimizer Booster Pro Max (*$50/month after 30 day free trial)).
 
I can't agree with this, especially the way Nvidia is marketing it.

DLSS 4 is a whole suite of different things. It's not one thing that does it all, and the image quality and memory reductions from the upscaling apply to 40 series cards too. The Reflex tech applies to 40 series cards as well.

It's only the multi frame generation that is specific to 50 series cards.

That's the (mainly) software part of it. The main hardware change for most of the cards is the use of GDDR7, which gives much higher bandwidth.

They've still stuck with the same memory configurations, bar the 5090. They got so much bad press for the 4060 having 8GB, and the 5060 is going to stick with it? Even the 5070 only having 12GB is, IMHO, not good.

I think they announced something like 75 games with support for some or all of the new features? But they aren't the games most people are playing. The Steam survey showed most people played an average of just under five games in 2024, a lot of them older games.
 
Apart from hardware limitations, I also wonder how much of the decision to use software to increase frame rates is about maximizing profits. I would expect that shipping every card with improved hardware/architecture isn't as cheap as a software solution, particularly with today's market conditions.

Another thing I heard about upscalers like DLSS and FSR is that if you don't get decent frames to begin with, they work a lot worse.

You have to sync FPS with the monitor's refresh rate, otherwise you risk horrible lag. This is where G-Sync (essentially VRR) comes into play. I admittedly use DLSS, as I love playing at a 120Hz refresh rate; it's smooth as butter. But if you don't match frames with the monitor's refresh, it's almost unplayable, and navigating menus is literally a drag.

Using DLSS did improve frames, but visual clarity was turned to mush

That also depends on the setting; there are several options that impact visuals, which essentially boil down to Performance vs. Balanced vs. Quality, with Performance being noticeably pixelated.
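
To make that concrete, the presets basically trade internal render resolution for speed. The scale factors below are the commonly cited approximate values (they can vary by game and DLSS version), shown at a 4K output just to illustrate why Performance looks rougher:

```python
# Approximate internal render scales for the common upscaler presets (values vary by title/version).
presets = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 1 / 3,
}

output_w, output_h = 3840, 2160  # 4K output as an example

for name, scale in presets.items():
    w, h = round(output_w * scale), round(output_h * scale)
    print(f"{name:>17}: renders at ~{w}x{h}, upscaled to {output_w}x{output_h}")
```

The less information the upscaler starts with, the more it has to invent, which is where the pixelation and shimmer come from.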

I'll wait for PC gamers to stop buying Nvidia, AMD, and Intel GPUs to "fight back", because that's the ONLY way.

I really want to support this. Even though I do use the tech, I'm also wary of Nvidia and the impact they're having on the industry. But dammit, I'm admittedly a sucker when it comes to visuals, so I can't rule out that I'll front some $ on a 5070 Super later down the line.
 